django, celery, background-process, redis-server

Celery connecting to rabbitmq-server instead of redis-server


I have a Django application for which I want to configure Celery to run background tasks.

Packages:

  1. celery==4.2.1

  2. Django==2.1.3

  3. Python==3.5

  4. Redis-server==3.0.6

The Celery configuration in the settings.py file is:

from celery.schedules import crontab

CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Asia/Kolkata'
CELERY_BEAT_SCHEDULE = {
    'task-number-one': {
            'task': 'app.tasks.task_number_one',
            'schedule': crontab(minute='*/1'),
    },
}

And the celery.py file:

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings.prod')

app = Celery('project')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)


@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))

When I run: celery -A project worker -l info -B -E

It points to the RabbitMQ server instead of the Redis server, as shown below:

 -------------- celery@user-desktop v4.2.1 (windowlicker)
---- **** ----- 
--- * ***  * -- Linux-4.15.0-39-generic-x86_64-with-Ubuntu-18.04-bionic 2018-11-21 12:04:51
-- * - **** --- 
- ** ---------- [config]
- ** ---------- .> app:         project:0x7f8b80f78d30
- ** ---------- .> transport:   amqp://guest:**@localhost:5672//
- ** ---------- .> results:     redis://localhost:6379/
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: ON
--- ***** ----- 
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery


[tasks]
  . app.tasks.task_number_one
  . project.celery.debug_task

[2018-11-21 12:04:51,741: INFO/Beat] beat: Starting...

The same happens in the production environment. In production I have deployed the Django application with Gunicorn and Nginx, and now I want to implement some method to run background tasks, since the django-crontab package is not working.

Problem:

  1. What is the problem with the Celery configuration?

  2. Could anyone please recommend a method to run periodic background tasks?

Note: I have tried implementing Supervisor, but it seems Supervisor is not compatible with Python 3, and therefore I could not configure it.


Solution

  • The CELERY_BROKER_URL setting is never read, because celery.py calls app.config_from_object('django.conf:settings') without namespace='CELERY'. In that case Celery looks for BROKER_URL, finds nothing, and falls back to its default amqp://guest:**@localhost:5672// transport. Either rename the setting to BROKER_URL, or pass namespace='CELERY' so the existing CELERY_*-prefixed settings are picked up.
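
For illustration, a minimal sketch of the namespace option, assuming the same project layout as in the question (the settings module path project.settings.prod is taken from the original post):

# project/celery.py -- sketch assuming the layout from the question
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings.prod')

app = Celery('project')

# With namespace='CELERY', Celery strips the CELERY_ prefix and maps
# CELERY_BROKER_URL -> broker_url, CELERY_RESULT_BACKEND -> result_backend,
# CELERY_BEAT_SCHEDULE -> beat_schedule, and so on.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Looks for a tasks.py module in every installed Django app.
app.autodiscover_tasks()

With either change, the transport line in the startup banner should show the Redis URL instead of amqp://guest:**@localhost:5672//, and the existing celery -A project worker -l info -B -E command will also run the CELERY_BEAT_SCHEDULE entries as periodic tasks, since the -B flag starts the embedded beat scheduler.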