django celery celery-beat

Celery beat creating multiple instances of the same task (Django)


This has been asked quite a few times, but I couldn't find a satisfactory solution, so I am asking for help after searching for several hours.

I have defined the Celery broker with RabbitMQ and the result backend with Redis. Earlier the broker was also Redis, but it was producing multiple instances of tasks even without Celery beat, which is a known issue.

When I use Celery beat, the issue resurfaces. This is my Celery beat schedule in the settings.py of my Django project. It calls the async task update_database:

CELERY_BEAT_SCHEDULE = {
    'update-every-minute': {
        'task': 'apiresult.tasks.update_database',
        'schedule': 60.0,
    },
}

I am calling the async task from apps.py of the apiresult app:

from django.apps import AppConfig

class ApiresultConfig(AppConfig):
    default_auto_field = 'django.db.models.BigAutoField'
    name = 'apiresult'

    def ready(self):
        from apiresult.tasks import update_database
        update_database.delay()

tasks.py in the apiresult app contains the update_database function:

@shared_task
def update_database():
    ...  # body omitted
I have read that we can use distributed locks to achieve this, but is there a simpler way to ensure only one instance of a task runs? I have tried celery-redbeat, but that's not helping either. Any help here is appreciated.


Solution

  • Your issue seems to be that you are calling the task twice: the schedule runs the task every 60 seconds, and then, when the application has initialised, ready() calls the task to run again.

    You will need to remove the call from the ready function.
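    With that call removed, apps.py would look something like the sketch below. The beat schedule alone then triggers update_database every 60 seconds, and nothing fires an extra run at startup:

    ```python
    from django.apps import AppConfig

    class ApiresultConfig(AppConfig):
        default_auto_field = 'django.db.models.BigAutoField'
        name = 'apiresult'
        # No ready() override calling update_database.delay() -- the
        # CELERY_BEAT_SCHEDULE entry already runs the task periodically.
    ```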