Does the Confluent Kafka JDBC Sink Connector allow you to pull data from a topic and then define which tables in an Oracle database to distribute that data to?
i.e.
A single topic record will carry a lot of data that needs to be distributed to multiple tables in our CRM, which runs on an Oracle database. Is this feasible?
Is it possible to define which nodes in the payload are destined for which tables, as in the example below?
Using ETL Middleware:
Another approach, building on @OneCricketeer's answer, is to introduce ETL (Extract, Transform, Load) middleware that consumes the Kafka topic, processes the payload, and then writes the processed data to the respective tables in the Oracle database.
Here are the steps involved:
Kafka Consumer: The ETL tool consumes records from the Kafka topic; each record carries a complex JSON payload.
Data Parsing: The ETL tool parses the JSON payload.
Data Transformation: The ETL tool extracts the nodes relevant to each target table and reshapes them to match the table schemas.
Load into Oracle Database: The ETL tool writes the extracted and transformed data into the respective tables in the Oracle database, as sketched below.
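A minimal sketch of such a middleware in plain Java (kafka-clients, Jackson, and JDBC) is shown below. The topic name `crm-events`, the `customer`/`orders` payload nodes, the `CRM_CUSTOMERS`/`CRM_ORDERS` tables, and the connection details are all hypothetical placeholders for illustration:

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.time.Duration;
import java.util.List;
import java.util.Properties;

public class CrmEtlMiddleware {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    public static void main(String[] args) throws Exception {
        // 1. Kafka Consumer: plain consumer subscribed to the source topic.
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "crm-etl");
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
             Connection db = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//dbhost:1521/CRM", "etl_user", "etl_pass")) {

            consumer.subscribe(List.of("crm-events"));
            db.setAutoCommit(false); // commit per record so both tables stay consistent

            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    // 2. Data Parsing: read the raw JSON value into a tree.
                    JsonNode root = MAPPER.readTree(record.value());

                    // 3. Data Transformation: pick the nodes destined for each table.
                    JsonNode customer = root.path("customer");
                    JsonNode orders = root.path("orders");

                    // 4. Load into Oracle: route each node to its own table.
                    try (PreparedStatement ps = db.prepareStatement(
                            "INSERT INTO CRM_CUSTOMERS (ID, NAME) VALUES (?, ?)")) {
                        ps.setLong(1, customer.path("id").asLong());
                        ps.setString(2, customer.path("name").asText());
                        ps.executeUpdate();
                    }
                    for (JsonNode order : orders) {
                        try (PreparedStatement ps = db.prepareStatement(
                                "INSERT INTO CRM_ORDERS (ORDER_ID, CUSTOMER_ID, AMOUNT) "
                                        + "VALUES (?, ?, ?)")) {
                            ps.setLong(1, order.path("orderId").asLong());
                            ps.setLong(2, customer.path("id").asLong());
                            ps.setBigDecimal(3, order.path("amount").decimalValue());
                            ps.executeUpdate();
                        }
                    }
                    db.commit();
                }
            }
        }
    }
}
```

Committing once per consumed record keeps the writes to both tables atomic, which a plain JDBC Sink Connector cannot guarantee across tables.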
Tools: Stream-processing and integration frameworks such as Kafka Streams, Apache Flink, or Apache NiFi are common choices for this middleware layer; a Kafka Streams variant is sketched below.
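If you want to stay inside the Kafka ecosystem, a common pattern is to have a Kafka Streams application split the original record into one derived topic per target table, then run a standard JDBC Sink Connector per derived topic. A rough sketch, reusing the hypothetical `customer`/`orders` payload and topic names from above:

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

import java.util.ArrayList;
import java.util.List;
import java.util.Properties;

public class CrmTopicSplitter {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "crm-topic-splitter");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> source = builder.stream("crm-events");

        // Route each JSON node to its own per-table topic.
        source.mapValues(v -> extract(v, "customer")).to("crm-customers");
        source.flatMapValues(v -> extractArray(v, "orders")).to("crm-orders");

        new KafkaStreams(builder.build(), props).start();
    }

    // Pull a single JSON subtree out of the record value.
    private static String extract(String json, String field) {
        try {
            return MAPPER.readTree(json).path(field).toString();
        } catch (Exception e) {
            throw new IllegalArgumentException("Bad payload", e);
        }
    }

    // Flatten a JSON array node into one output value per element.
    private static List<String> extractArray(String json, String field) {
        List<String> out = new ArrayList<>();
        try {
            for (JsonNode n : MAPPER.readTree(json).path(field)) {
                out.add(n.toString());
            }
        } catch (Exception e) {
            throw new IllegalArgumentException("Bad payload", e);
        }
        return out;
    }
}
```

`flatMapValues` turns the `orders` array into one record per order, so each element lands in the `crm-orders` topic as its own message.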
Note: Using ETL middleware is a robust solution for complex data-handling needs such as distributing a single Kafka topic record across multiple tables in an Oracle database. It allows custom data transformations and ensures that the data is routed to the correct tables.
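For completeness: a single JDBC Sink Connector instance maps each topic to exactly one table (via `table.name.format`), which is why the record has to be split upstream. A sketch of the sink config for one of the derived topics, with placeholder connection details:

```json
{
  "name": "crm-orders-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "topics": "crm-orders",
    "connection.url": "jdbc:oracle:thin:@//dbhost:1521/CRM",
    "connection.user": "etl_user",
    "connection.password": "etl_pass",
    "table.name.format": "CRM_ORDERS",
    "insert.mode": "insert",
    "auto.create": "false"
  }
}
```

One such connector per derived topic completes the pipeline: the Streams application (or the middleware above) does the routing, and each sink only has to write its own table.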