python, rabbitmq, celery, amqp

Celery and custom consumers


To my knowledge, Celery acts as both the producer and consumer of messages. This is not what I want to achieve. I want Celery to act as the consumer only, to fire certain tasks based on messages that I send to my AMQP broker of choice. Is this possible?

Or do I need to make soup by adding carrot to my stack?


Solution

  • A Celery broker acts as a message store and hands messages to one or more workers that subscribe to it,

    so: Celery publishes messages to a broker (RabbitMQ, Redis, the Django database, etc.); the broker holds on to them (usually they are persistent, though that may depend on your broker), and a worker retrieves them following the broker's protocol and executes them.
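    The flow above can be sketched with a plain in-memory queue standing in for the broker (illustration only; Celery handles all of this for you, across processes and machines):

```python
# Pure-Python sketch of the producer -> broker -> worker flow.
# A real broker (RabbitMQ, Redis) persists messages between processes;
# here a Queue stands in for it within a single process.
from queue import Queue

broker = Queue()                           # stands in for RabbitMQ/Redis
registry = {"add": lambda x, y: x + y}     # the "registered" tasks

def publish(task_name, args):
    """Producer side: put a task message on the broker."""
    broker.put({"task": task_name, "args": args})

def worker_step():
    """Consumer side: pull one message and run the matching task."""
    msg = broker.get()
    return registry[msg["task"]](*msg["args"])

publish("add", (1, 2))
result = worker_step()  # 3
```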

    Task results are produced on the worker that executed the task; you can configure where those results are stored (the result backend) and retrieve them later, for example with the get() method of the AsyncResult that the call returns.
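    For example, a result backend can be configured like this (a sketch using Celery's old-style setting names; the Redis URL and expiry value are assumptions, not from the question):

```python
# Store task results in Redis so they can be fetched after the task
# has finished; expire them after an hour to keep the backend small.
CELERY_RESULT_BACKEND = "redis://localhost:6379/0"
CELERY_TASK_RESULT_EXPIRES = 3600
```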

    You publish tasks with Celery by passing parameters to your "receiver function" (the task you define; the documentation has some examples). Usually you do not want to pass big objects here (say, a queryset), only the minimal information that lets you retrieve what you need when the task executes.

    One easy example:

    You register a task

    @task
    def add(x, y):
        return x + y
    

    and you call it from another module with:

    from mytasks import add

    metadata1 = 1
    metadata2 = 2
    myasyncresult = add.delay(metadata1, metadata2)
    myasyncresult.get()  # == 3
    

    EDIT

    After your edit I see that you probably want to construct messages from sources other than Celery. Have a look at the message format in the docs: messages default to pickled objects that follow that format, so if you post messages of that shape to the right queue on your RabbitMQ broker, your workers are good to go and will pick them up.
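    A sketch of what such a message body looks like, built with the stdlib only (field names follow Celery's task message format; JSON is used here instead of the default pickle for readability, which requires the worker to accept JSON content):

```python
# Build a Celery-style task message body by hand. Publish the
# resulting JSON to the task queue (default name "celery") on your
# RabbitMQ broker, e.g. with pika or kombu, and a worker that has the
# task registered will execute it.
import json
import uuid

def make_task_message(task_name, args=(), kwargs=None):
    """Minimal task message following Celery's message format."""
    return {
        "task": task_name,        # dotted name the worker has registered
        "id": str(uuid.uuid4()),  # unique task id
        "args": list(args),
        "kwargs": kwargs or {},
        "retries": 0,
    }

# Matches the mytasks.add example above:
body = make_task_message("mytasks.add", args=(1, 2))
payload = json.dumps(body)        # this string goes on the wire
```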