Tags: python, django, celery, django-celery, celery-task

Time Limit not Working for Django Celery 5.3.1


I'm having trouble getting the time_limit and soft_time_limit features to work in my Celery and Django app.

import os

from celery import Celery

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'server.settings')

app = Celery('server')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
app.conf.update(
    task_soft_time_limit=1,
    task_time_limit=1,
)

# Load task modules from all registered Django apps.
app.autodiscover_tasks()


@app.task(bind=True, ignore_result=True)
def debug_task(self):
    print(f'Request: {self.request!r}')

This is my Celery configuration. I'm trying to apply the time limit settings to a Celery task in Django, and then I run the following task:

from celery import shared_task
from celery.exceptions import SoftTimeLimitExceeded
from celery.signals import task_success, task_failure, task_prerun
import time
from server.celery import app

@shared_task
def scrape(domain: str, route: str = None):
    try:
        time.sleep(5)
        
    except SoftTimeLimitExceeded:
        print("Time Limit")

@task_prerun.connect(sender=scrape)
def scrape_prerun_notifier(task_id=None, task=None, *args, **kwargs):
    print("From task_prerun_notifier ==> Running just before add() executes")

@task_success.connect(sender=scrape)
def scrape_success_notifier(sender=None, result=None, **kwargs):
    print("Success")

    task_id = sender.request.id

@task_failure.connect(sender=scrape)
def scrape_failure_notifier(task_id=None, exception=None, *args, **kwargs):
    print("Failure")

However, the scrape task always succeeds without raising any exception. I've also tried setting the limits directly on the task decorator, but that doesn't work either.

@app.task(soft_time_limit=1, time_limit=1)

I also tried adding this to my Django settings, to no effect:

CELERYD_TASK_SOFT_TIME_LIMIT = 1

or

CELERY_TASK_SOFT_TIME_LIMIT = 1

This is how I start Celery from the command line:

celery -A server worker -l info --pool=threads

or

celery -A server worker -l info --pool=solo

I haven't tried the prefork pool because I don't know how to enable logging for it. Is there something wrong with my configuration that makes time_limit and soft_time_limit not work? Any help is really appreciated.


Solution

  • The docs on task_time_limit say the following:

    Task hard time limit in seconds. The worker processing the task will be killed and replaced with a new one when this is exceeded.

    With a solo pool, this would mean the worker killing itself, so although it is not explicitly stated, I believe time limits are simply not supported for the solo pool.

    Since threads cannot be forcibly stopped in Python, time limits are most likely not supported for the threads pool either. The default prefork pool does support them; see the sketch below.
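    For what it's worth, both limits do take effect under the prefork pool, and logging is enabled there exactly the same way; the -l info flag is independent of the pool choice. Here is a minimal sketch, assuming the same server project as in the question, with the soft limit set below the hard limit so the SoftTimeLimitExceeded handler has a chance to run before the process is killed:

    celery -A server worker -l info --pool=prefork

    from celery import shared_task
    from celery.exceptions import SoftTimeLimitExceeded
    import time

    # soft_time_limit < time_limit: the worker raises SoftTimeLimitExceeded
    # inside the task first, and only kills the child process if the hard
    # limit is also exceeded.
    @shared_task(soft_time_limit=2, time_limit=5)
    def scrape(domain: str, route: str = None):
        try:
            time.sleep(10)  # exceeds the 2-second soft limit
        except SoftTimeLimitExceeded:
            print("Time Limit")  # reached under prefork

    With this worker, scrape.delay("example.com") prints "Time Limit" instead of finishing silently. Note that the original configuration set both limits to 1 second; keeping the soft limit strictly below the hard limit avoids the hard kill racing the soft exception.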