I have a GKE cluster with some applications running on it. Those applications log to stdout for performance reasons. I know GKE comes with Fluent Bit for logging. I was wondering if there is a way to get the application logs into a Memorystore for Redis instance that I am running. I am asking because Fluent Bit doesn't come with a Redis output plugin.
I guess with Fluent Bit you have to use a custom plugin to push to Redis.
Also note that Memorystore is meant for caching, not for logging; BigQuery would be a better fit if you are planning analytics or search over the logs.
You can use this custom plugin: https://github.com/majst01/fluent-bit-go-redis-output. In that case you have to deploy your own DaemonSet of Fluent Bit.
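With that plugin loaded into your own Fluent Bit image (via `-e /path/to/out_redis.so` or a `plugins_file`), the output section could look roughly like this. This is a sketch based on the plugin's README; the parameter names (`Hosts`, `DB`, `Key`) and the Memorystore IP/port are placeholders you should verify against the plugin version you actually deploy:

```
[OUTPUT]
    Name    redis
    Match   *
    # Private IP of your Memorystore for Redis instance (placeholder)
    Hosts   10.0.0.5:6379
    DB      0
    # Redis list key the log records are pushed onto (placeholder)
    Key     gke-logs
```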
If you can't use the Fluent Bit custom plugin, a workaround is to go through a Cloud Storage bucket or Pub/Sub.
Create a custom log sink that routes the logs to a bucket or to Pub/Sub, as needed. From there, parse/format the entries and push them to Memorystore using Cloud Run or a Cloud Function, depending on whether you need batch processing or a scheduled job.
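For the Pub/Sub route, a Cloud Function with a Pub/Sub trigger could do the parse-and-push step. A minimal sketch, assuming a standard Pub/Sub event envelope carrying a Cloud Logging `LogEntry` as base64-encoded JSON; the `REDIS_HOST` environment variable, the `gke-logs` list key, and the selected fields are illustrative, not part of any fixed API:

```python
import base64
import json
import os


def decode_log_entry(event: dict) -> dict:
    """Decode the base64-encoded LogEntry JSON from a Pub/Sub event."""
    payload = base64.b64decode(event["data"]).decode("utf-8")
    return json.loads(payload)


def handle_log(event, context):
    """Cloud Function entry point: format the LogEntry and push it to Redis."""
    import redis  # redis-py; imported lazily so the module loads without it

    entry = decode_log_entry(event)
    # Keep only the fields you care about; adjust to your log format.
    record = json.dumps({
        "timestamp": entry.get("timestamp"),
        "severity": entry.get("severity"),
        "message": entry.get("textPayload") or entry.get("jsonPayload"),
    })
    # REDIS_HOST would be the Memorystore instance's private IP.
    client = redis.Redis(host=os.environ["REDIS_HOST"], port=6379)
    client.rpush("gke-logs", record)
```

Note that the function must run with Serverless VPC Access (or an equivalent connector) so it can reach the Memorystore instance's private IP.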
Fluent Bit can also push to GCP Pub/Sub via a custom plugin, with some record transformation to JSON if required.