apache-kafka, spring-kafka, kafka-producer-api, failover

Kafka producer posting messages to secondary cluster


Description of proposed cluster setup

What we want to achieve is the following: say Producer 1 posts a message asynchronously to Cluster 1 but receives a negative acknowledgment (after all retries have timed out). This is easily detected in the producer callback method.

On receiving this failure, the producer uses another KafkaTemplate (configured with the details of Cluster 2) and tries posting the same message to Cluster 2. [This applies the other way round as well: if Producer 2 is unable to post locally, it will send the message to Cluster 1.] A sketch of this callback-based failover is shown below.
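For illustration, a minimal sketch of this callback-based failover with Spring Kafka, assuming two pre-built KafkaTemplate beans (the names primaryTemplate and secondaryTemplate are placeholders) and the CompletableFuture returned by send() in recent Spring Kafka versions:

```java
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

@Component
public class FailoverProducer {

    private final KafkaTemplate<String, String> primaryTemplate;   // points at Cluster 1
    private final KafkaTemplate<String, String> secondaryTemplate; // points at Cluster 2

    public FailoverProducer(KafkaTemplate<String, String> primaryTemplate,
                            KafkaTemplate<String, String> secondaryTemplate) {
        this.primaryTemplate = primaryTemplate;
        this.secondaryTemplate = secondaryTemplate;
    }

    public void send(String topic, String payload) {
        // Asynchronous send to the primary cluster; the callback fires only after
        // the producer has exhausted its retries (or hit a metadata/timeout failure).
        primaryTemplate.send(topic, payload).whenComplete((result, ex) -> {
            if (ex != null) {
                // Negative acknowledgment / timeout: fail over to the secondary cluster.
                secondaryTemplate.send(topic, payload).whenComplete((result2, ex2) -> {
                    if (ex2 != null) {
                        // Both clusters failed; log or persist the record for later replay.
                    }
                });
            }
        });
    }
}
```

On older Spring Kafka versions send() returns a ListenableFuture instead, so the equivalent hook would be addCallback(...) rather than whenComplete(...).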

The advantage that we get here is

One downside we see is that we are handling the failover logic ourselves, producing to the secondary cluster in the exception-handling block on either a metadata fetch timeout or a negative acknowledgment.

I could not find anything on the net describing a similar setup. Is there something fundamentally wrong with this approach?



Solution

  • Sure; just configure two sets of infrastructure beans - producer and consumer factories, container factories, templates.

    You can't use Boot's auto-configuration for that, but you can define the beans yourself; see the sketch below.
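
    For example, a minimal producer-side sketch of such a configuration (the broker addresses cluster1:9092 and cluster2:9092 and the bean names are placeholders; consumer factories and container factories would be declared the same way, once per cluster):

    ```java
    import java.util.HashMap;
    import java.util.Map;

    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.common.serialization.StringSerializer;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.kafka.core.DefaultKafkaProducerFactory;
    import org.springframework.kafka.core.KafkaTemplate;
    import org.springframework.kafka.core.ProducerFactory;

    @Configuration
    public class DualClusterKafkaConfig {

        // Shared producer properties; only the bootstrap servers differ per cluster.
        private Map<String, Object> producerProps(String bootstrapServers) {
            Map<String, Object> props = new HashMap<>();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
            return props;
        }

        @Bean
        public ProducerFactory<String, String> primaryProducerFactory() {
            return new DefaultKafkaProducerFactory<>(producerProps("cluster1:9092"));
        }

        @Bean
        public ProducerFactory<String, String> secondaryProducerFactory() {
            return new DefaultKafkaProducerFactory<>(producerProps("cluster2:9092"));
        }

        @Bean
        public KafkaTemplate<String, String> primaryTemplate() {
            return new KafkaTemplate<>(primaryProducerFactory());
        }

        @Bean
        public KafkaTemplate<String, String> secondaryTemplate() {
            return new KafkaTemplate<>(secondaryProducerFactory());
        }
    }
    ```

    The failover code can then inject both templates (e.g. by bean name) and choose which one to use in its callback, as sketched in the question.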