I want to read 5+ million events from an Azure SQL DB table and bulk insert them into Cassandra. The table has 2 columns.

For consuming from Azure SQL DB, I see the SQL component is available: https://camel.apache.org/components/3.7.x/sql-component.html

For producing to Cassandra, there is the CQL component: https://camel.apache.org/components/3.7.x/cql-component.html

Can I use Camel for this use case?
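For context, here is a minimal sketch of the kind of route I have in mind, wiring the two components together. The table name, column names, hosts, and keyspace are all assumptions, not real values:

```java
// Hypothetical sketch only: endpoint URIs, table/column names, host and
// keyspace are placeholders. Requires camel-sql and camel-cassandraql.
import org.apache.camel.builder.RouteBuilder;

public class SqlToCassandraRoute extends RouteBuilder {
    @Override
    public void configure() {
        // Stream the result set instead of loading 5M+ rows into memory,
        // then split it so each row becomes one exchange.
        from("sql:select id, payload from events?outputType=StreamList")
            .split(body()).streaming()
            // The CQL component binds the row values as statement parameters.
            .to("cql://cassandra-host/mykeyspace"
                + "?cql=insert into events(id, payload) values (?, ?)");
    }
}
```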
For each table in your Azure database, we recommend exporting the data to a CSV file. There are many tools and methods for doing this; for example, see Import and export data from Azure SQL Database.
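As one possible approach, the bcp utility that ships with the SQL Server tools can export a query result straight to CSV. The server, database, table, column names, and credentials below are placeholders:

```shell
# Hypothetical export with bcp: -c writes character data, -t ',' sets the
# field terminator so the output is comma-separated. All names are placeholders.
bcp "SELECT id, payload FROM dbo.events" queryout events.csv \
    -S myserver.database.windows.net -d mydb \
    -U myuser -P 'mypassword' \
    -c -t ','
```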
Once you have exported your data to CSV, you can use the DataStax Bulk Loader tool (DSBulk) to bulk load it to a Cassandra table.
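A minimal invocation might look like the following; the keyspace, table, and contact point are assumptions you would replace with your own:

```shell
# Hypothetical DSBulk load: -url points at the CSV, -k/-t name the target
# keyspace and table, -h gives a contact point. -header false matches a CSV
# exported without a header row. All names are placeholders.
dsbulk load -url events.csv -k mykeyspace -t events \
    -h cassandra-host -header false
```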
The DSBulk documentation includes worked examples to help you get started quickly, and since DSBulk is open-source, it's free to use. Cheers!