Tags: python, django, celery, flower

Celery tasks succeeding in Django but not showing in Flower or the Celery logs


I have a Django project based on cookiecutter's template, and I have been trying to get Celery working with it. I did the setup according to the Celery docs, and when I run a task, Django's output shows it as succeeded, but the terminal running Celery doesn't even log that the task was received. I am running the worker with celery -A proj_name worker -l DEBUG; I also tried -l INFO, with the same result. The tasks also don't show up on the Flower dashboard, and although I am using django-celery-results, neither the Redis nor the Postgres result backend gets any results populated.

I am not really sure what's going on, but as far as I can tell Celery is not receiving the tasks at all, despite what Django's logs show. Also, when I print the task's state using AsyncResult, it is always PENDING, even though Django again says the task succeeded.

Here's my celery.py:

import os

from celery import Celery

# Make sure the Django settings module is set before the app is configured.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "proj_name.config.settings.local")

app = Celery("proj_name")

# Pull all CELERY_-prefixed settings from Django's settings.
app.config_from_object("django.conf:settings", namespace="CELERY")

# Auto-discover tasks.py modules in all installed apps.
app.autodiscover_tasks()


@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
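
For completeness, here is proj_name/__init__.py, which follows the standard snippet from the Celery/Django integration docs so the app is loaded when Django starts:

# Ensure the Celery app is imported whenever Django starts, so that
# shared_task decorators and autodiscovery use this app.
from .celery import app as celery_app

__all__ = ("celery_app",)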

and my Celery-related settings:

if USE_TZ:
    CELERY_TIMEZONE = TIME_ZONE
CELERY_BROKER_URL = env("CELERY_BROKER_URL", default="redis://localhost:6379/0")
# CELERY_RESULT_BACKEND = f"db+postgresql://{env('POSTGRES_DB_USER')}:{env('POSTGRES_DB_PWD')}@localhost/{env('POSTGRES_DB_NAME')}"
CELERY_RESULT_BACKEND = CELERY_BROKER_URL
CELERY_CACHE_BACKEND = 'django-cache'
CELERY_TASK_TRACK_STARTED = True
CELERY_RESULT_EXTENDED = True
CELERY_RESULT_BACKEND_ALWAYS_RETRY = True
CELERY_RESULT_BACKEND_MAX_RETRIES = 10
CELERY_ACCEPT_CONTENT = ["application/json"]
CELERY_TASK_SERIALIZER = "json"
CELERY_RESULT_SERIALIZER = "json"
CELERY_TASK_TIME_LIMIT = 5 * 60
CELERY_TASK_SOFT_TIME_LIMIT = 60
CELERY_BEAT_SCHEDULER = "django_celery_beat.schedulers:DatabaseScheduler"
CELERY_WORKER_SEND_TASK_EVENTS = True
CELERY_TASK_SEND_SENT_EVENT = True
CELERY_TASK_ALWAYS_EAGER = True
CELERY_TASK_EAGER_PROPAGATES = True
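
As a sanity check, the worker can be pinged and its registered tasks listed from a Django shell (python manage.py shell) — a minimal sketch, assuming the app object from the celery.py above:

from proj_name.celery import app

# Ask running workers to respond; a reachable worker answers {'ok': 'pong'}.
print(app.control.inspect().ping())

# List the tasks each worker has registered.
print(app.control.inspect().registered())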

And here's how my task is defined:

@app.task(bind=True)
def send_email_task(self, email_action, recipient_email="", context={}, attachments=[],
                    is_sender_email_dynamic=False, dynamic_cc_emails=[]):
    print('before task', self.request.id, self.request)
    print(self.AsyncResult(self.request.id).state)
    # insert async func call to send email
    print('after task', self.AsyncResult(self.request.id).state, self.request)
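
I trigger it from a view roughly like this (a simplified sketch; the action string and email address are placeholder values):

from celery.result import AsyncResult

from email_services.tasks import send_email_task

# Dispatch the task to the queue (or so I thought).
result = send_email_task.delay("some_action", recipient_email="user@example.com")

# Looking the id up in the result backend always reports PENDING,
# even though Django's log below says the task succeeded.
print(AsyncResult(result.id).state)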

and a sample Django log entry for the task:

INFO 2023-07-24 03:01:54,640 trace 7804 140679160931904 Task email_services.tasks.send_email_task[5fbbc289-f337-415c-8fad-3100042c422a] succeeded in 0.08931779300019116s: None

and the output of celery -A proj_name worker -l DEBUG:

(venv_name) ➜  proj_name git:(main) ✗ celery -A proj_name worker -l DEBUG
[2023-07-24 02:39:31,265: DEBUG/MainProcess] | Worker: Preparing bootsteps.
[2023-07-24 02:39:31,266: DEBUG/MainProcess] | Worker: Building graph...
[2023-07-24 02:39:31,266: DEBUG/MainProcess] | Worker: New boot order: {Beat, Timer, Hub, Pool, Autoscaler, StateDB, Consumer}
[2023-07-24 02:39:31,268: DEBUG/MainProcess] | Consumer: Preparing bootsteps.
[2023-07-24 02:39:31,268: DEBUG/MainProcess] | Consumer: Building graph...
[2023-07-24 02:39:31,274: DEBUG/MainProcess] | Consumer: New boot order: {Connection, Events, Mingle, Gossip, Heart, Agent, Tasks, Control, event loop}

 -------------- celery@DESKTOP-SGH5F1FL v5.3.1 (emerald-rush)
--- ***** -----
-- ******* ---- Linux-5.10.16.3-microsoft-standard-WSL2-x86_64-with-glibc2.35 2023-07-24 02:39:31
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         proj_name:0x7fe89adf1180
- ** ---------- .> transport:   redis://localhost:6379/0
- ** ---------- .> results:     redis://localhost:6379/0
- *** --- * --- .> concurrency: 20 (prefork)
-- ******* ---- .> task events: ON
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery


[tasks]
  . celery.accumulate
  . celery.backend_cleanup
  . celery.chain
  . celery.chord
  . celery.chord_unlock
  . celery.chunks
  . celery.group
  . celery.map
  . celery.starmap
  . email_services.tasks.send_email_task
  . proj_name.celery.debug_task

[2023-07-24 02:39:31,280: DEBUG/MainProcess] | Worker: Starting Hub
[2023-07-24 02:39:31,280: DEBUG/MainProcess] ^-- substep ok
[2023-07-24 02:39:31,280: DEBUG/MainProcess] | Worker: Starting Pool
[2023-07-24 02:39:32,539: DEBUG/MainProcess] ^-- substep ok
[2023-07-24 02:39:32,539: DEBUG/MainProcess] | Worker: Starting Consumer
[2023-07-24 02:39:32,539: DEBUG/MainProcess] | Consumer: Starting Connection
[2023-07-24 02:39:32,541: WARNING/MainProcess] /home/<name>/.pyenv/versions/3.10.10/envs/<proj_name>/lib/python3.10/site-packages/celery/worker/consumer/consumer.py:498: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine
whether broker connection retries are made during startup in Celery 6.0 and above.
If you wish to retain the existing behavior for retrying connections on startup,
you should set broker_connection_retry_on_startup to True.
  warnings.warn(

[2023-07-24 02:39:32,545: INFO/MainProcess] Connected to redis://localhost:6379/0
[2023-07-24 02:39:32,545: DEBUG/MainProcess] ^-- substep ok
[2023-07-24 02:39:32,545: DEBUG/MainProcess] | Consumer: Starting Events
[2023-07-24 02:39:32,545: WARNING/MainProcess] /home/<name>/.pyenv/versions/3.10.10/envs/proj_name/lib/python3.10/site-packages/celery/worker/consumer/consumer.py:498: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine
whether broker connection retries are made during startup in Celery 6.0 and above.
If you wish to retain the existing behavior for retrying connections on startup,
you should set broker_connection_retry_on_startup to True.
  warnings.warn(

[2023-07-24 02:39:32,546: DEBUG/MainProcess] ^-- substep ok
[2023-07-24 02:39:32,546: DEBUG/MainProcess] | Consumer: Starting Mingle
[2023-07-24 02:39:32,546: INFO/MainProcess] mingle: searching for neighbors
[2023-07-24 02:39:33,552: INFO/MainProcess] mingle: all alone
[2023-07-24 02:39:33,552: DEBUG/MainProcess] ^-- substep ok
[2023-07-24 02:39:33,553: DEBUG/MainProcess] | Consumer: Starting Gossip
[2023-07-24 02:39:33,555: DEBUG/MainProcess] ^-- substep ok
[2023-07-24 02:39:33,555: DEBUG/MainProcess] | Consumer: Starting Heart
[2023-07-24 02:39:33,557: DEBUG/MainProcess] ^-- substep ok
[2023-07-24 02:39:33,557: DEBUG/MainProcess] | Consumer: Starting Tasks
[2023-07-24 02:39:33,560: DEBUG/MainProcess] ^-- substep ok
[2023-07-24 02:39:33,560: DEBUG/MainProcess] | Consumer: Starting Control
[2023-07-24 02:39:33,562: DEBUG/MainProcess] ^-- substep ok
[2023-07-24 02:39:33,562: DEBUG/MainProcess] | Consumer: Starting event loop
[2023-07-24 02:39:33,562: DEBUG/MainProcess] | Worker: Hub.register Pool...
[2023-07-24 02:39:33,562: INFO/MainProcess] celery@DESKTOP-SGH5F1FL ready.
[2023-07-24 02:39:33,563: DEBUG/MainProcess] basic.qos: prefetch_count->80

Solution

  • I found the issue: we are using cookiecutter's Django template, and its settings/local.py specifies

    CELERY_TASK_ALWAYS_EAGER = True
    CELERY_TASK_EAGER_PROPAGATES = True
    

    which tells Celery not to use the queue at all and instead run the task synchronously in the calling process, blocking until it finishes (essentially as if you were not using Celery). That is why Django logs the task as succeeded while the worker, Flower, and the result backends never see it. It also explains the PENDING state: with eager execution the result is not written to the result backend by default, so looking the id up via AsyncResult finds nothing and falls back to PENDING. See the Celery documentation for task_always_eager.
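
    A minimal sketch of the fix in settings/local.py: remove or disable the eager settings so tasks actually go through the broker. The commented alternative assumes Celery >= 5.1:

    # settings/local.py
    # Send tasks through the broker so the worker, Flower, and the
    # result backend all see them.
    CELERY_TASK_ALWAYS_EAGER = False
    CELERY_TASK_EAGER_PROPAGATES = False

    # Alternative: keep eager execution for local development but still
    # store results in the backend (task_store_eager_result, Celery >= 5.1):
    # CELERY_TASK_STORE_EAGER_RESULT = True

    Eager mode itself is still useful in unit tests, where you usually want tasks to run inline and exceptions to propagate.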