Tags: python, rabbitmq, celery, kombu

Celery: enqueuing multiple (100-1000) tasks at the same time via send_task?


We often need to enqueue many messages at once (we chunk them into groups of 1000) using Celery (backed by RabbitMQ). Does anyone know a way to do this? We're essentially trying to "batch" a large group of messages into one send_task call.

If I had to guess, we would need to go a step "deeper" and hook into Kombu or even py-amqp.

Regards,
Niklas


Solution

  • There is no need to "go deeper" and use Kombu directly. There are a few solutions suitable for different use-cases: