Logging to files on 10 servers? Here is a centralized solution in 10 minutes.
## Stack: Loki + Promtail + Grafana
Loki is "Prometheus for logs": it indexes only labels rather than log content, so it stays lightweight, and it integrates natively with Grafana. Promtail is the agent that ships log files to it.
## Docker Compose
```yaml
services:
  loki:
    image: grafana/loki:latest
    ports:
      - "3100:3100"
  promtail:
    image: grafana/promtail:latest
    volumes:
      - /var/log:/var/log
      - ./promtail.yml:/etc/promtail/config.yml
  grafana:
    image: grafana/grafana:latest
    ports:
      - "3000:3000"
```
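To skip clicking through the Grafana UI, the Loki data source can be provisioned from a file. A sketch, assuming you mount it into the Grafana container at `/etc/grafana/provisioning/datasources/` (the filename is arbitrary):

```yaml
# provisioning/datasources/loki.yml — auto-registers Loki on Grafana startup
apiVersion: 1
datasources:
  - name: Loki
    type: loki
    url: http://loki:3100   # the compose service name resolves inside the network
    access: proxy
    isDefault: true
```

With this in place, the Explore tab works immediately after `docker compose up`.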
## Promtail Config
```yaml
server:
  http_listen_port: 9080
clients:
  - url: http://loki:3100/loki/api/v1/push
scrape_configs:
  - job_name: system
    static_configs:
      - targets: [localhost]
        labels:
          job: varlogs
          __path__: /var/log/*.log
```
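The static labels you attach here are what you filter on later in Grafana. A sketch of a second scrape job that tags an application's logs with environment and service labels (the path and label values are illustrative, not from the original setup):

```yaml
scrape_configs:
  - job_name: app
    static_configs:
      - targets: [localhost]
        labels:
          job: app
          environment: production   # illustrative values — set per host
          service: api
          __path__: /var/log/app/*.log
```

Keep label values low-cardinality (environment, service, host), not per-request data — each unique label combination creates a separate stream in Loki.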
## Structured Logging
```js
// JSON logs instead of plain text
console.log(JSON.stringify({
  level: "error",
  msg: "request failed",
  status: 500,
  path: "/api/users",
  duration_ms: 1234
}));
```
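In practice you wrap this in a small helper so every log line has the same shape. A minimal sketch (the function name and `ts` field are my additions, not part of the original snippet):

```javascript
// Minimal structured logger: one JSON object per line, which is
// exactly what Loki's `| json` parser expects.
function logEvent(level, msg, fields = {}) {
  const entry = {
    ts: new Date().toISOString(), // timestamp for ordering
    level,
    msg,
    ...fields, // arbitrary structured context
  };
  console.log(JSON.stringify(entry));
  return entry; // returned so callers/tests can inspect it
}

// Usage:
logEvent("error", "request failed", {
  status: 500,
  path: "/api/users",
  duration_ms: 1234,
});
```

Because every entry is a flat JSON object on one line, `| json` in LogQL turns each field into a queryable label at query time.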
## Grafana Queries
```logql
{job="varlogs"} |= "error"
{job="app"} | json | status >= 500
```
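Beyond plain filters, LogQL also supports metric queries, which is what you use for dashboards and alerts. Two sketches (the `app` job and the time windows are illustrative):

```logql
# 5xx rate per path over the last 5 minutes
sum by (path) (rate({job="app"} | json | status >= 500 [5m]))

# how many error lines in the last hour
count_over_time({job="varlogs"} |= "error" [1h])
```

The first query is the usual starting point for an "error rate spiked" alert rule.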
## Tip
Log in structured JSON, and tag every stream with environment and service labels so you can filter per deployment. Set retention to 30 days so disk usage stays bounded.
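Retention is configured on the Loki side. A sketch for a 30-day window, assuming a single-binary Loki with filesystem storage (exact keys vary between Loki versions, so check the docs for yours):

```yaml
# loki config fragment (illustrative paths)
limits_config:
  retention_period: 720h   # 30 days
compactor:
  working_directory: /loki/compactor
  retention_enabled: true  # the compactor is what actually deletes expired chunks
```

Without `retention_enabled` on the compactor, the `retention_period` limit alone does not delete anything.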