Logging: ELK Stack and Splunk
Introduction
Centralized logging is essential for debugging production systems. This guide covers ELK Stack and Splunk for log aggregation and analysis.
Good: Structured Logging
Using Logrus
```go
package main

import (
    "os"

    "github.com/sirupsen/logrus"
)

// ✅ GOOD: Structured logging
func setupLogging() *logrus.Logger {
    logger := logrus.New()
    logger.SetOutput(os.Stdout)
    logger.SetFormatter(&logrus.JSONFormatter{
        TimestampFormat: "2006-01-02 15:04:05",
    })
    return logger
}

// ✅ GOOD: Log with context
func handleRequest(logger *logrus.Logger, userID string) {
    logger.WithFields(logrus.Fields{
        "user_id": userID,
        "action":  "login",
        "ip":      "192.168.1.1",
    }).Info("User logged in")
}
```
Good: ELK Stack
Docker Compose
```yaml
# ✅ GOOD: ELK Stack setup
version: '3.8'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.0.0
    environment:
      - discovery.type=single-node
      - xpack.security.enabled=false
    ports:
      - "9200:9200"
  logstash:
    image: docker.elastic.co/logstash/logstash:8.0.0
    volumes:
      - ./logstash.conf:/usr/share/logstash/pipeline/logstash.conf
    ports:
      - "5000:5000"
  kibana:
    image: docker.elastic.co/kibana/kibana:8.0.0
    ports:
      - "5601:5601"
```
Logstash Configuration
```conf
# ✅ GOOD: Logstash configuration
input {
  tcp {
    port => 5000
    codec => json
  }
}

filter {
  if [type] == "go-app" {
    mutate {
      add_field => { "[@metadata][index_name]" => "go-app-%{+YYYY.MM.dd}" }
    }
  }
}

output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    index => "%{[@metadata][index_name]}"
  }
}
```
Good: Sending Logs
Logstash Output
```go
// ✅ GOOD: Send logs to Logstash
import (
    logrustash "github.com/bshuster-repo/logrus-logstash-hook"
    "github.com/sirupsen/logrus"
)

func setupLogstashLogging() *logrus.Logger {
    logger := logrus.New()
    // NewHook dials the Logstash TCP input and tags events as "go-app"
    // (hook API as of the v0.x releases of this package).
    hook, err := logrustash.NewHook("tcp", "localhost:5000", "go-app")
    if err != nil {
        panic(err)
    }
    logger.AddHook(hook)
    return logger
}
```
Best Practices
1. Use Structured Logging
```go
// ✅ GOOD: Structured format
logger.WithFields(logrus.Fields{
    "user_id": userID,
    "action":  "login",
}).Info("User logged in")

// ❌ BAD: Unstructured
log.Printf("User %s logged in", userID)
```
2. Include Context
```go
// ✅ GOOD: Include relevant context
logger.WithFields(logrus.Fields{
    "request_id": requestID,
    "user_id":    userID,
    "duration":   duration,
}).Info("Request completed")
```
3. Use Appropriate Log Levels
```go
// ✅ GOOD: Appropriate levels
logger.Debug("Debug information")
logger.Info("General information")
logger.Warn("Warning message")
logger.Error("Error occurred")
logger.Fatal("Fatal error") // logs, then exits the process with os.Exit(1)
```
Resources
- ELK Stack: https://www.elastic.co/what-is/elk-stack
- Splunk: https://www.splunk.com/
- Logrus: https://github.com/sirupsen/logrus
Summary
Centralized logging enables efficient debugging and monitoring. Use structured logging with JSON format, send logs to ELK Stack or Splunk, and include relevant context. Proper logging practices make troubleshooting production issues much easier.