Tags: python, events, twisted, message-queue

Events framework for Python?


I'm building a system that works with web clients (Django) and remote APIs (probably as a standalone daemon). It seems easier to coordinate their work with an events framework, like in JavaScript. Unfortunately, Django signals are synchronous, so slow handlers would make replies to the clients very slow. Also, I might want to migrate the daemon, or part of it, to a separate machine and still have everything work the same way (not RPC, just triggering an event or sending a message). (This might sound like Erlang's approach.)
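To illustrate the problem with synchronous signals, here is a minimal sketch (the signal and handler names are made up): a Django signal receiver runs inline, inside the request/response cycle, so any slow work in it delays the HTTP response.

import time
from django.dispatch import Signal, receiver
from django.http import HttpResponse

data_received = Signal()  # a custom signal, named for illustration

@receiver(data_received)
def slow_handler(sender, **kwargs):
    time.sleep(5)  # stands in for any slow work, e.g. an external HTTP call

def my_view(request):
    data_received.send(sender=None)  # blocks here for ~5 s
    return HttpResponse('done')      # the client waits the whole time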

Is there a framework that would use proven and reliable ways to communicate between processes (say, RabbitMQ), and require minimum boilerplate?

As for Twisted, which André Paramés suggested, I'd prefer simpler code. Is something like this doable in Twisted?

from events_framework import subscribe, trigger
import requests  # just a sample HTTP client (django.http has no Client class)

@subscribe('data_received')
def reply(data):
    requests.post('http://www.example.com', data=data)
    trigger('data_resent', data)
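The local, in-process part of such an API is easy to sketch myself; here is a minimal version (no network transport, no error handling) of what subscribe/trigger could look like. What I'm after is the networked, asynchronous part.

from collections import defaultdict

_subscribers = defaultdict(list)

def subscribe(event_name):
    def decorator(func):
        _subscribers[event_name].append(func)
        return func
    return decorator

def trigger(event_name, payload):
    for handler in _subscribers[event_name]:
        handler(payload)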

Here are more details. There is a Django views file that uses some models and notifies others of events, and there is a standalone daemon script that runs indefinitely and reacts to those events.

This is just pseudocode; I only mean to show how easy it should be.

# django_project/views.py (a Django views file)
from events_framework import publish, subscribe
from annoying.decorators import render_to

from .models import Settings  # hypothetical user settings model

@subscribe('settings_updated')
def _on_settings_update(event):  # listens to the settings_updated event and saves the data
    Settings.objects.filter(user__id=event.user_id).update(**event.new_settings)

@render_to('form.html')
def show_form(request):  # triggers the 'form_shown' event
    publish('form_shown', {'user_id': request.user.id, 'form_data': request.GET})
    return {...}


# script.py (a standalone script)
import requests

from events_framework import publish, subscribe

@subscribe('form_shown')
def on_form_shown(event):  # listens to the form_shown event and triggers another event
    result = requests.get('third party url', params=event.form_data)
    publish('third_party_requested', {'result': result.text})

Again, this couldn't be done with Django signals alone: some events need to be published over the network, while others should stay local but run asynchronously.
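For comparison, this is roughly the boilerplate such a framework would hide. A sketch of the networked publish/consume side using pika (1.x API), assuming a RabbitMQ server on localhost and a topic exchange named 'events':

import json
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
channel = connection.channel()
channel.exchange_declare(exchange='events', exchange_type='topic')

def publish(event_name, payload):
    channel.basic_publish(exchange='events', routing_key=event_name,
                          body=json.dumps(payload))

# consumer side: bind a private queue to one event name and block on it
queue = channel.queue_declare(queue='', exclusive=True).method.queue
channel.queue_bind(exchange='events', queue=queue, routing_key='form_shown')

def on_message(ch, method, properties, body):
    event = json.loads(body)
    print('form_shown:', event)  # a real consumer would dispatch to a handler

channel.basic_consume(queue=queue, on_message_callback=on_message, auto_ack=True)
channel.start_consuming()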

It may be necessary to instantiate something, like

from events_framework import Environment
env = Environment()  # connects to the default RabbitMQ server from settings
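Such an Environment could be a thin wrapper over the broker connection. A sketch of what I have in mind (pika and the RABBITMQ_URL setting are made-up assumptions):

import pika
from django.conf import settings

class Environment:
    def __init__(self, url=None):
        url = url or getattr(settings, 'RABBITMQ_URL',
                             'amqp://guest:guest@localhost:5672/%2F')
        self._connection = pika.BlockingConnection(pika.URLParameters(url))
        self.channel = self._connection.channel()
        self.channel.exchange_declare(exchange='events', exchange_type='topic')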

Solution

  • I decided that Celery with RabbitMQ is the most mature software combination, and I will stick with them. Celery does not just deliver events; it also allows flexible specialization via queue routing, and parallelization.
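    For reference, a rough sketch of how the pseudocode above could map onto Celery tasks (the broker URL, queue name, and module name are illustrative):

    # tasks.py
    import requests
    from celery import Celery

    app = Celery('events', broker='amqp://guest:guest@localhost:5672//')
    app.conf.task_routes = {'tasks.on_form_shown': {'queue': 'forms'}}

    @app.task
    def on_form_shown(user_id, form_data):
        result = requests.get('third party url', params=form_data)
        on_third_party_requested.delay(result.text)

    @app.task
    def on_third_party_requested(result):
        pass  # react to the third-party response here

    # in django_project/views.py, instead of publish('form_shown', ...):
    # on_form_shown.delay(request.user.id, dict(request.GET))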