Assume we have a Kafka producer that sends messages to a topic, which is read by a Kafka Streams application that puts these messages into a state store for later processing.
Using a Confluent schema registry, is it possible to set the compatibility on this topic to Forward?
When a new mandatory field is added, the messages already in the state store won't have it. Can some kind of default value be defined for reading this new field?
The described compatibility levels describe an upstream producer and downstream consumer scenario, where in the end the question is whether you need to upgrade your upstream producer or your downstream consumer first.
For Kafka Streams with state stores, it's more complicated, because Kafka Streams is a consumer and a producer at the same time. It reads not only from the input topic but also from the store's changelog topic, and it writes into the store and its changelog.
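To illustrate, here is a minimal sketch (with assumed topic and store names) of such a topology: the table read from the input topic is materialized into a local state store, and Kafka Streams backs that store with its own changelog topic, which in turn gets its own subject in the schema registry, separate from the input topic's subject.

```java
import org.apache.kafka.common.utils.Bytes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.state.KeyValueStore;

public class StoreTopology {
    public static Topology build() {
        StreamsBuilder builder = new StreamsBuilder();
        // Reading the input topic into a table materializes it into a local
        // state store; Kafka Streams also writes every update to a changelog
        // topic named "<application.id>-order-store-changelog".
        builder.table("orders",  // input topic name is an assumption
                Materialized.<String, String, KeyValueStore<Bytes, byte[]>>as("order-store"));
        return builder.build();
    }
}
```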
Using a Confluent schema registry, is it possible to set the compatibility on this topic to Forward?
This is certainly possible, but it would not ensure that the Kafka Streams application would use the new field. You can set the compatibility to Forward to force the producer to add the new field, but the downstream Kafka Streams app would ignore this new field as long as it is not updated itself.
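For reference, a minimal sketch of setting Forward compatibility on the input topic's value subject via the Confluent Schema Registry Java client; the registry URL and the topic name ("orders") are assumptions. The equivalent REST call is a PUT to /config/orders-value with the body {"compatibility": "FORWARD"}.

```java
import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;

public class SetForwardCompatibility {
    public static void main(String[] args) throws Exception {
        SchemaRegistryClient client =
                new CachedSchemaRegistryClient("http://localhost:8081", 100);
        // Compatibility is configured per subject; with the default
        // TopicNameStrategy the value subject is "<topic>-value".
        client.updateCompatibility("orders-value", "FORWARD");
    }
}
```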
When a new mandatory field is added, the messages already in the state store won't have it. Can some kind of default value be defined for reading this new field?
Correct. This implies that the Kafka Streams app itself requires backward compatibility: Kafka Streams needs to be able to read the old schema and insert the default value for the new field on-read. Hence, the schema registered for the changelog topic must be backward compatible, and thus cannot be the same as your input topic schema (only optional fields support default values; mandatory fields don't).
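A minimal sketch of what that default-on-read looks like in plain Avro, using hypothetical old and new "Order" schemas: a record written with the old schema is read with the new, backward-compatible schema, and the added optional field is filled with its default value. A mandatory field (one without a default) would make this schema resolution fail.

```java
import java.io.ByteArrayOutputStream;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;

public class DefaultOnRead {
    // Hypothetical schemas: the new one adds an optional "status" field with a default.
    static final Schema OLD = new Schema.Parser().parse(
        "{\"type\":\"record\",\"name\":\"Order\",\"fields\":["
      + "{\"name\":\"id\",\"type\":\"string\"}]}");
    static final Schema NEW = new Schema.Parser().parse(
        "{\"type\":\"record\",\"name\":\"Order\",\"fields\":["
      + "{\"name\":\"id\",\"type\":\"string\"},"
      + "{\"name\":\"status\",\"type\":\"string\",\"default\":\"UNKNOWN\"}]}");

    public static void main(String[] args) throws Exception {
        // Write a record with the old schema (no "status" field).
        GenericRecord oldRecord = new GenericData.Record(OLD);
        oldRecord.put("id", "order-1");
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(OLD).write(oldRecord, encoder);
        encoder.flush();

        // Read it back with the new schema: "status" gets its default on-read.
        BinaryDecoder decoder =
                DecoderFactory.get().binaryDecoder(out.toByteArray(), null);
        GenericRecord upgraded =
                new GenericDatumReader<GenericRecord>(OLD, NEW).read(null, decoder);
        System.out.println(upgraded); // {"id": "order-1", "status": "UNKNOWN"}
    }
}
```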