The application writes its logs to the Docker container's stdout:
{"message": "gateway 200 <== assistant PATCH /admin/info-banners/1/", "service_name": "НКЗ API Gateway", "level": "INFO", "timestamp": "2024-04-16T09:27:08.514573", "logger": "app.adapters.base_adapter", "filepath": "/api_gateway/app/adapters/base_adapter.py", "func": "_request_service"}
Kubernetes then writes the log to a .log file on the node, where the line looks like this:
2024-04-16T12:27:08.515010847+03:00 stdout F {"message": "gateway 200 <== assistant PATCH /admin/info-banners/1/", "service_name": "НКЗ API Gateway", "level": "INFO", "timestamp": "2024-04-16T09:27:08.514573", "logger": "app.adapters.base_adapter", "filepath": "/api_gateway/app/adapters/base_adapter.py", "func": "_request_service"}
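Just to make the structure of that line visible (this is only an illustration, not Filebeat's parsing): three space-separated tokens are prepended, and everything after them is the original application line.

import json

# Payload shortened here; in the real file it is the full JSON record.
line = ('2024-04-16T12:27:08.515010847+03:00 stdout F '
        '{"message": "gateway 200 <== assistant PATCH /admin/info-banners/1/", "level": "INFO"}')

timestamp, stream, tag, payload = line.split(" ", 3)
print(timestamp)                       # 2024-04-16T12:27:08.515010847+03:00
print(stream)                          # stdout
print(tag)                             # F
print(json.loads(payload)["message"])  # the application's own message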
Question 1: What is stdout here? Is it the place the log was taken from?
Question 2: What is F?
The logs in those .log files are then consumed by Filebeat.
Question 3: How does Filebeat decide how to parse 2024-04-16T12:27:08.515010847+03:00, stdout, and F? I mean their types (datetime, string, string). And how does Filebeat decide to parse the JSON?
There is another related answer here. The Kubernetes log line is basically in the CRI log format: an RFC 3339 timestamp with nanosecond precision, the stream the line was read from (stdout or stderr), a tag (F for a full line, P for a partial line that the runtime had to split), and then the original log message.
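Below is a rough Python model of what a CRI-aware consumer does with each such line. It is not Filebeat's actual implementation: in Filebeat the container input (or the filestream input with its container parser) strips the prefix and converts the timestamp, and decoding the JSON message body is a separate, configurable step (for example the decode_json_fields processor).

import json
import re
from datetime import datetime
from typing import Iterator

# CRI line: <RFC3339Nano timestamp> <stdout|stderr> <P|F> <message>
CRI_LINE = re.compile(r"^(\S+) (stdout|stderr) ([PF]) (.*)$")

def parse_timestamp(ts: str) -> datetime:
    # Truncate nanoseconds to microseconds so datetime.fromisoformat accepts it.
    return datetime.fromisoformat(re.sub(r"(\.\d{6})\d+", r"\1", ts))

def read_cri(lines: Iterator[str]) -> Iterator[dict]:
    # Yield one decoded record per logical log line. 'P' marks a partial
    # fragment, 'F' marks the final one, so fragments are buffered until
    # an 'F' arrives.
    buffer = []
    for line in lines:
        match = CRI_LINE.match(line.rstrip("\n"))
        if not match:
            continue
        ts, stream, tag, message = match.groups()
        buffer.append(message)
        if tag == "F":
            full_message = "".join(buffer)
            buffer = []
            yield {
                "@timestamp": parse_timestamp(ts),  # datetime
                "stream": stream,                   # string
                "log": json.loads(full_message),    # the application's JSON
            }

if __name__ == "__main__":
    sample = [
        '2024-04-16T12:27:08.515010847+03:00 stdout F '
        '{"message": "gateway 200 <== assistant PATCH /admin/info-banners/1/", "level": "INFO"}',
    ]
    for record in read_cri(sample):
        print(record["@timestamp"], record["stream"], record["log"]["message"])

So the "types" are fixed by the format itself: the first token is always a timestamp, the second and third are fixed enumerations (stdout/stderr and P/F), and the rest is the opaque message, which may or may not be JSON depending on the application.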