I have a small question about how the Spring Kafka batch listener works. As I understand it, if we set this property
factory.setBatchListener(true);
then we will be able to consume messages in batches. But will we still be able to consume messages one by one? Recently we had a problem on our project: our Kafka listener stopped consuming messages, and restarting our pods in k8s didn't help. It was a batch listener. I checked the max.poll.records property and it was set to 200, while as far as I could see there were barely 4 messages in the topic in the test environment, so my guess was that the consumer wasn't consuming because it didn't have enough messages and was waiting for 200 of them. I set setBatchListener to false; after that we ran into a deserialization problem, so I set it back to true, reduced max.poll.records to 1, and it started working. Now I have set this property to 250 and it is still working, but from what I can see there are now quite a lot of messages coming into our topic in the test environment.
So is it possible that the problem was related to the batch listener, or was it something else? Will a batch listener consume messages one by one if there aren't many of them, or will it still wait for a bigger batch?
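For reference, a simplified sketch of roughly how our factory is configured (the bean names, bootstrap server, group id and String key/value types below are placeholders, not our exact setup):

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;

@Configuration
public class KafkaConsumerConfig {

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "my-group");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        // the property we suspected: an upper bound on how many records a single poll() returns
        props.put(ConsumerConfig.MAX_POLL_RECORDS_CONFIG, 200);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory(
            ConsumerFactory<String, String> consumerFactory) {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory);
        factory.setBatchListener(true); // the setting in question
        return factory;
    }
}
```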
Most likely the issue is not connected with the batch listener.
KafkaListener relies heavily on the kafka-clients lib, and a KafkaConsumer is created only once for your listener (by default).
In other words, all the "batching" behavior depends only on the KafkaConsumer object and on what kafka-clients and Kafka itself provide to it.
Polling is done in a container thread; you can see the source code here.
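To illustrate with plain kafka-clients (a simplified sketch of what the container thread effectively does, not the actual container code): poll() returns whatever records are already available, up to max.poll.records, it does not wait for that many records to accumulate.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class PollLoopSketch {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "demo-group");              // placeholder group
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.MAX_POLL_RECORDS_CONFIG, 200); // an upper bound, not a minimum

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("my-topic")); // placeholder topic
            while (true) {
                // returns as soon as records are available (or the timeout expires);
                // with only 4 messages in the topic it returns those 4, it never waits for 200
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                System.out.println("poll() returned " + records.count() + " record(s)");
            }
        }
    }
}
```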
Spring "batch" listener just gives you another method signature (in your case), that's it.Batch listener is expected to catch single messages from container -> consumer.
PS: I had a project on Spring Kafka 3.0.7, but I updated it to 3.3.7 and checked the behavior on both.
I don't know which versions you're using; there is even a 4.0.0 release that uses kafka-clients 4.0.0, but as far as I know Boot doesn't support it yet.