node.js · express · winston · grafana-loki · promtail

Promtail with Cloud Grafana Loki and winston JS log fields not detected


I have set up Promtail (Docker image) to read my .log files. Those logs are sent to Loki in Grafana Cloud.

I receive those logs successfully, but the fields are not detected. Example of a log line: {"env":"development","file":"myFile","level":"info","message":"myMessage","service":"myApp"}

Here is what I can see in Loki:

[screenshot of the log line as displayed in Loki]

As you can see, the log line is fully ingested, but I only get two fields (the labels); the fields inside my JSON log are not detected. I don't know how to set up automatic detection of the fields contained in the log line.

Here is my scrape_configs:

    scrape_configs:
      - job_name: system
        static_configs:
          - targets:
              - localhost
            labels:
              job: varlogs
              __path__: /var/log/sms-meeting/*.log

This is my winston logger config:

    const logger = winston.createLogger({
      level: "info",
      defaultMeta: { service: "myApp", env: process.env.NODE_ENV },
      transports: [
        new winston.transports.File({ filename: "logs/error.log", level: "error" }),
        new winston.transports.File({ filename: "logs/combined.log" }),
      ],
    });

Solution

  • As a best practice, you shouldn't set up auto-detection of JSON fields during log ingestion. You want to primarily use static labels, use dynamic labels sparingly, and never use unbounded labels. Refer to the official Loki Label Best Practices for further information on each point.

    You should instead leverage LogQL parsers to extract your JSON fields and make them available at query time. See the documentation for LogQL parser expressions; a query sketch is included at the end of this answer.

    To give you an example from the public Play instance of Grafana:

    JSON Logs with no LogQL Parser (Open Demo)
    The only fields available are the ones applied during ingestion as indexed labels.

    JSON Logs WITH a LogQL json Parser (Open Demo)
    As you can see, all of the JSON fields are now available as fields within Grafana, and can be used in the remainder of your query as if they were part of the index.


    It's very important to be mindful of your cardinality when choosing which labels to send with your logs during ingestion. That said, if there is something you really must extract from your log line during ingestion and apply as a label, refer to the labels pipeline stage in your scrape config; a sketch is included below.
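
    To make this concrete, here is a minimal query sketch, assuming the job="varlogs" label from the scrape config in the question. The json parser with no arguments extracts every top-level field of the log line, and the extracted fields (such as level and message from the example log line) can then be filtered or formatted later in the query:

        {job="varlogs"} | json

        {job="varlogs"} | json | level="error" | line_format "{{.message}}"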
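
    And if you really do need to promote a field to a label at ingestion time, here is a minimal sketch of the json and labels pipeline stages added to the scrape config from the question. level is used here purely as an illustration because it is low-cardinality; avoid promoting unbounded fields such as message:

        scrape_configs:
          - job_name: system
            pipeline_stages:
              # Parse the JSON log line and put the "level" field into the extracted map
              - json:
                  expressions:
                    level: level
              # Promote the extracted "level" value to an indexed label
              - labels:
                  level:
            static_configs:
              - targets:
                  - localhost
                labels:
                  job: varlogs
                  __path__: /var/log/sms-meeting/*.log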