oracle-database, apache-kafka, apache-kafka-connect

Confluent Kafka: Data from Topic (single payload) to multiple tables


Does the Confluent Kafka JDBC sink connector allow you to pull data from a topic and then define which tables in an Oracle database to distribute that data to?

i.e.

A single topic record will contain a lot of data that needs to be distributed across multiple tables in our CRM, which runs on an Oracle database. Is this feasible?

Is it possible to define which nodes in the payload are routed to which tables, as in the example below?
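For illustration, a hypothetical payload might look like the following (node and field names here are assumptions, chosen to match the member, address, and contact tables described in the answer below):

```json
{
  "member":  { "id": 1001, "firstName": "Jane", "lastName": "Doe" },
  "address": { "street": "1 Main St", "city": "Cork", "postcode": "T12 ABC1" },
  "contact": { "email": "jane.doe@example.com", "phone": "+353-1-555-0100" }
}
```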


Solution

  • Using ETL Middleware:

    Another approach, building on @OneCricketeer's answer, is to introduce ETL (Extract, Transform, Load) middleware that consumes the Kafka topic, processes the payload, and writes the processed data to the respective tables in the Oracle database.

    Here are the steps involved (a minimal end-to-end sketch follows the list):

    1. Kafka Consumer: The ETL tool consumes records from the Kafka topic; each record carries a complex JSON payload.

    2. Data Parsing: The ETL tool parses the JSON payload.

    3. Data Transformation:

      • Table 1 ("Members Table"): Extract fields related to member information and load them into the members table.
      • Table 2 ("Address Table"): Extract fields related to address information and load them into the address table.
      • Table 3 ("Contactable Info Table"): Extract contact details and load them into the contactable_info table.
    4. Load into Oracle Database: The ETL tool writes the extracted and transformed data into the respective tables in the Oracle database.
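
    Putting the four steps together, here is a minimal sketch assuming Python with the confluent-kafka consumer and the python-oracledb driver. The topic name, broker address, credentials, JSON node names, and column names are all hypothetical placeholders; only the three target table names (members, address, contactable_info) come from the steps above:

    ```python
    # Minimal ETL sketch: consume a Kafka topic, split each JSON payload,
    # and load the pieces into three Oracle tables. Topic name, connection
    # settings, node names, and columns are assumed placeholders.
    import json

    import oracledb                      # python-oracledb driver
    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",   # assumed broker address
        "group.id": "crm-etl",
        "enable.auto.commit": False,             # commit offsets manually
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["crm-events"])           # assumed topic name

    conn = oracledb.connect(user="crm", password="secret",
                            dsn="dbhost:1521/ORCLPDB1")  # assumed DSN

    try:
        while True:
            msg = consumer.poll(1.0)             # step 1: consume a record
            if msg is None:
                continue
            if msg.error():
                print(f"Kafka error: {msg.error()}")
                continue

            record = json.loads(msg.value())     # step 2: parse the JSON payload

            member = record["member"]            # step 3: split the payload
            address = record["address"]          # (node names are assumptions)
            contact = record["contact"]

            with conn.cursor() as cur:           # step 4: load into Oracle
                cur.execute(
                    "INSERT INTO members (member_id, first_name, last_name) "
                    "VALUES (:1, :2, :3)",
                    [member["id"], member["firstName"], member["lastName"]],
                )
                cur.execute(
                    "INSERT INTO address (member_id, street, city, postcode) "
                    "VALUES (:1, :2, :3, :4)",
                    [member["id"], address["street"], address["city"],
                     address["postcode"]],
                )
                cur.execute(
                    "INSERT INTO contactable_info (member_id, email, phone) "
                    "VALUES (:1, :2, :3)",
                    [member["id"], contact["email"], contact["phone"]],
                )
            conn.commit()                        # commit all three inserts together
            consumer.commit(msg)                 # then commit the Kafka offset
    finally:
        consumer.close()
        conn.close()
    ```

    Committing the Oracle transaction before committing the Kafka offset gives at-least-once delivery, so the inserts should be made idempotent (for example via MERGE or a unique key on member_id) to tolerate replays after a failure.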

    Ref - https://medium.com/@mariusz_kujawski/building-an-efficient-etl-elt-process-for-data-delivery-9ee775375418


    Note: Using ETL middleware is a robust solution for complex data-handling needs, such as distributing a single Kafka topic record across multiple tables in an Oracle database. It allows for custom data transformations and ensures that the data is routed to the correct tables.