Tuesday, February 5, 2019


Go - how to aggregate your logging with logrus and io.Writer

I'm always looking for better ways to observe microservices.  Recently I've been reading a lot of posts from Charity Majors (@mipsytipsy).  Charity makes a compelling argument for producing structured data for every request your services handle.  Currently, our microservices use Logrus for structured logging and, out of the box, Logrus writes to the log every time a log entry is created.  Basically, you call logrus.Info() and it writes structured data to the log, which is great... until your services start handling thousands of transactions per second (TPS) and writing to the log hundreds of times per request.  At that point, your log infrastructure just starts falling over trying to ingest the thousands of TPS your services are sending it.  The solution is obvious: you need to aggregate your logging in memory and write to the log only once per request.

Luckily, the maintainers of Logrus provide a simple way to aggregate your logging.  The logrus.Logger struct exposes its output destination as an exported field, Out io.Writer.  That means we can do something like:

package main

import (
	"fmt"
	"strings"

	"github.com/sirupsen/logrus"
)

func main() {
	// send all log output to an in-memory strings.Builder instead of stderr
	var stringsBuilder strings.Builder
	log := &logrus.Logger{
		Out:       &stringsBuilder,
		Formatter: new(logrus.JSONFormatter),
		Hooks:     make(logrus.LevelHooks),
		Level:     logrus.DebugLevel,
	}
	logger := log.WithFields(logrus.Fields{
		"path":   "/test",
		"status": 200})
	logger.Info("strings.Builder Info")
	logger.Debug("strings.Builder Debug")
	logger.Error("strings.Builder Error")

	// one write to stdout containing everything that was logged
	fmt.Print(stringsBuilder.String())
}
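
With the default JSONFormatter, the strings.Builder ends up holding one JSON object per log call, roughly like this (timestamps illustrative):

{"level":"info","msg":"strings.Builder Info","path":"/test","status":200,"time":"2019-02-05T12:00:00-05:00"}
{"level":"debug","msg":"strings.Builder Debug","path":"/test","status":200,"time":"2019-02-05T12:00:00-05:00"}
{"level":"error","msg":"strings.Builder Error","path":"/test","status":200,"time":"2019-02-05T12:00:00-05:00"}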

The code above makes only one write to the log, and that write is an aggregate of all the structured data you've added to the logger.  Pretty slick, and my hat's off to the Logrus maintainers for making it so easy.  There is one problem: each log entry is emitted as its own JSON object, so the buffer isn't a single valid JSON document.  That's easily fixed, though, with a new type that implements the io.Writer interface and encapsulates a strings.Builder:

import (
	"bytes"
	"strings"
)

// logBuffer - implements the io.Writer interface so it can be used as a
// Logrus output destination, accumulating entries in a strings.Builder
type logBuffer struct {
	Buff strings.Builder
}

// Write - append the entry to the strings.Builder, swapping the trailing
// newline for a comma so the entries can later be joined into a JSON array
func (b *logBuffer) Write(data []byte) (n int, err error) {
	b.Buff.Write(bytes.TrimSuffix(data, []byte("\n")))
	b.Buff.WriteByte(',')
	return len(data), nil
}

// String - output the accumulated entries as one aggregated JSON object
func (b *logBuffer) String() string {
	return "{\"entries\":[" + strings.TrimSuffix(b.Buff.String(), ",") + "]}\n"
}
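
To tie the two snippets together, here's a minimal sketch of the first example rewritten to use logBuffer; it assumes logBuffer lives in the same package and replaces the earlier main, but this exact combination isn't spelled out above:

package main

import (
	"fmt"

	"github.com/sirupsen/logrus"
)

func main() {
	// aggregate entries in a logBuffer instead of a bare strings.Builder
	buff := &logBuffer{}
	log := &logrus.Logger{
		Out:       buff,
		Formatter: new(logrus.JSONFormatter),
		Hooks:     make(logrus.LevelHooks),
		Level:     logrus.DebugLevel,
	}
	logger := log.WithFields(logrus.Fields{
		"path":   "/test",
		"status": 200})
	logger.Info("logBuffer Info")
	logger.Error("logBuffer Error")

	// one valid JSON document per request instead of one object per entry
	fmt.Print(buff.String())
}
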
I've taken the above proof of concept for aggregating Logrus entries and written some Gin middleware that not only aggregates logging but also adds OpenTracing support.  I hope others find this helpful and learn something about the io.Writer interface as well.  A rough sketch of the middleware idea follows, along with an example of the aggregated log entry produced for a single request.
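
To give a feel for the shape of that middleware, here's a minimal sketch; the requestLogger name, the "aggregate-logger" context key, and the plain stdout write are placeholders of mine, and the real middleware does considerably more (OpenTracing, request IDs, the banner, and so on):

import (
	"fmt"
	"time"

	"github.com/gin-gonic/gin"
	"github.com/sirupsen/logrus"
)

// requestLogger - give each request its own in-memory logger and flush one
// aggregated entry when the request completes
func requestLogger() gin.HandlerFunc {
	return func(c *gin.Context) {
		buff := &logBuffer{}
		log := &logrus.Logger{
			Out:       buff,
			Formatter: new(logrus.JSONFormatter),
			Hooks:     make(logrus.LevelHooks),
			Level:     logrus.DebugLevel,
		}
		// handlers pull this entry out of the context and log through it
		c.Set("aggregate-logger", log.WithFields(logrus.Fields{}))

		start := time.Now()
		c.Next() // run the rest of the handler chain

		// add a request-summary entry, then flush everything in one write
		log.WithFields(logrus.Fields{
			"path":    c.Request.URL.Path,
			"status":  c.Writer.Status(),
			"latency": time.Since(start).String(),
		}).Info("request complete")
		fmt.Print(buff.String())
	}
}

Handlers would then fetch the entry back out with c.Get("aggregate-logger") and log through it, so everything they record lands inside the single aggregated write for that request.
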
# example aggregated log entry for a request with UseBanner == true
{
  "new-header-index-name": "this is how you set new header level data",
  "request-summary-info": {
    "comment": "",
    "ip": "::1",
    "latency": "     98.217µs",
    "method": "GET",
    "path": "/",
    "requestID": "4b4fb22ef51cc540:4b4fb22ef51cc540:0:1",
    "status": 200,
    "time": "2019-02-06T13:24:06Z",
    "user-agent": "curl/7.54.0"
  },
  "entries": [
    {
      "level": "info",
      "msg": "this will be aggregated into one write with the access log and will show up when the request is completed",
      "time": "2019-02-06T08:24:06-05:00"
    },
    {
      "comment": "this is an aggregated log entry with initial comment field",
      "level": "debug",
      "msg": "aggregated entry with new comment field",
      "time": "2019-02-06T08:24:06-05:00"
    },
    {
      "level": "error",
      "msg": "aggregated error entry with new-comment field",
      "new-comment": "this is an aggregated log entry with reset comment field",
      "time": "2019-02-06T08:24:06-05:00"
    }
  ],
  "banner": "[GIN] ----------- GinLogrusWithTracing -----------"
}


