apache-kafka, configuration, kafka-topic, mongodb-kafka-connector

MongoDB Kafka Connect can't send large kafka messages


I am trying to send large JSON documents (more than 1 MB) from MongoDB through the Kafka source connector. It works well for small documents, but I get the following error with large ones:

```
[2022-09-27 11:13:48,290] ERROR [source_mongodb_connector|task-0] WorkerSourceTask{id=source_mongodb_connector-0} Task threw an uncaught and unrecoverable exception. Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:195)
org.apache.kafka.connect.errors.ConnectException: Unrecoverable exception from producer send callback
    at org.apache.kafka.connect.runtime.WorkerSourceTask.maybeThrowProducerSendException(WorkerSourceTask.java:290)
    at org.apache.kafka.connect.runtime.WorkerSourceTask.sendRecords(WorkerSourceTask.java:351)
    at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:257)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:188)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:243)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.kafka.common.errors.RecordTooLargeException: The message is 2046979 bytes when serialized which is larger than 1048576, which is the value of the max.request.size configuration.
```

I tried to configure the topic; here is its description:

```
hadoop@vps-data1 ~/kafka $ bin/kafka-configs.sh --bootstrap-server 192.168.13.80:9092,192.168.13.81:9092,192.168.13.82:9092 --entity-type topics --entity-name prefix.large.topicData --describe
Dynamic configs for topic prefix.large.topicData are:
  max.message.bytes=1280000 sensitive=false synonyms={DYNAMIC_TOPIC_CONFIG:max.message.bytes=1280000, STATIC_BROKER_CONFIG:message.max.bytes=419430400, DEFAULT_CONFIG:message.max.bytes=1048588}
```
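Note that the topic's dynamic `max.message.bytes=1280000` (~1.22 MB) is itself still below the 2046979-byte record in the error. As a sketch, the topic limit could be raised with the same tool (4 MB here is an example value, not a recommendation):

```
bin/kafka-configs.sh --bootstrap-server 192.168.13.80:9092 \
  --entity-type topics --entity-name prefix.large.topicData \
  --alter --add-config max.message.bytes=4194304
```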

I also configured the producer, consumer, and server properties files, but the same problem persists.

Any help will be appreciated.


Solution

  • The solution is to raise the size limit at every hop, not just on the topic. The `RecordTooLargeException` cites `max.request.size` (default 1048576 bytes), which is a setting of the producer embedded in the Kafka Connect worker, so it must be changed in the Connect worker configuration. The topic's `max.message.bytes` (and, cluster-wide, the broker's `message.max.bytes`) must also exceed the record size, and downstream consumers may need `max.partition.fetch.bytes` / `fetch.max.bytes` raised accordingly.
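
A minimal sketch of the worker-side change, assuming a distributed worker started from `connect-distributed.properties` (the 4 MB value is an example; size it above your largest expected record):

```
# connect-distributed.properties (or connect-standalone.properties)
# "producer."-prefixed settings are passed to the worker's embedded producer
producer.max.request.size=4194304
```

Alternatively, the limit can be overridden per connector instead of per worker, assuming the worker allows client overrides:

```
# In the worker config, permit per-connector client overrides:
connector.client.config.override.policy=All

# Then in the connector's own configuration:
producer.override.max.request.size=4194304
```

Restart the worker (or recreate the connector, for the override variant) after changing these, since worker properties are read at startup.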