python, dockerfile, celery, windows-subsystem-for-linux, celerybeat

Beat scheduler not working automatically but does work when manually triggered


I am learning to use Celery. I want to automate API triggers using beat. The following code works if I manually call increment_value.delay() from the shell, but nothing runs automatically. Oddly, Redis still seems to get updated at every interval; it is just the task function that never gets called.

config.py

from datetime import timedelta

# REDIS_IP is defined elsewhere in my project
broker_url = f'redis://{REDIS_IP}:6379/0'
result_backend = f'redis://{REDIS_IP}:6379/0'
task_serializer = 'json'
result_serializer = 'json'
accept_content = ['json']
timezone = 'UTC'

CELERY_BEAT_SCHEDULE = {
    'run-every-15-seconds': {
        'task': 'scheduler.increment_value',
        'schedule': timedelta(seconds=15),
    },
}

scheduler.py

import logging

import requests
from celery import Celery

app = Celery()
app.config_from_object('config')


@app.task
def increment_value():
    # FASTAPI_IP is defined elsewhere in my project
    response = requests.post(url=f"http://{FASTAPI_IP}:8000/redis/first")

    logging.info("Task Has Been Triggered")

    return response.json()

Dockerfile

FROM python:3.8-slim

WORKDIR /app

# Copy the application code (including requirements.txt)
COPY ./requirements.txt /app/requirements.txt
COPY . /app

# Install dependencies
RUN pip install --no-cache-dir -r requirements.txt

# Start Celery worker and beat in the background using a shell script
COPY ./start.sh /app/start.sh
RUN chmod +x /app/start.sh

CMD ["/app/start.sh"]

start.sh

#!/bin/sh
celery -A scheduler worker --loglevel=info &
celery -A scheduler beat --loglevel=info
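
As a side note on start.sh: backgrounding the worker with & leaves it unsupervised, and shutdown signals sent to the container reach only the shell. One minimal alternative for development (Celery's docs advise against it in production) is the embedded beat scheduler, which the worker can run in-process via -B, so only a single foreground process is needed:

```shell
#!/bin/sh
# Development-only sketch: run beat inside the worker process (-B), and
# use exec so Celery becomes the container's PID 1 and receives signals
# directly instead of the wrapping shell swallowing them.
exec celery -A scheduler worker -B --loglevel=info
```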

Both the worker and beat start when I run the application.

I should also mention that I am working on WSL2.

I would really appreciate it if someone could point out what I am doing wrong.


Solution

  • You are setting the beat schedule under a name Celery never reads. CELERY_BEAT_SCHEDULE is the Django-namespaced form of the setting, so plain Celery silently ignores it. Since config.py uses the new lowercase setting names (broker_url, result_backend, ...), the matching key is beat_schedule; the old uppercase equivalent would be CELERYBEAT_SCHEDULE, but old and new names should not be mixed in one config module. Rename CELERY_BEAT_SCHEDULE to beat_schedule and see if it works. Also, as @Shivam suggested, you can use supervisor instead of running the worker in the background, for better signal handling.
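
A sketch of the corrected config.py under that assumption (REDIS_IP is hard-coded here only to keep the fragment self-contained; substitute your actual Redis host):

```python
# config.py -- all settings in the new lowercase style
from datetime import timedelta

REDIS_IP = 'localhost'  # assumption for this sketch

broker_url = f'redis://{REDIS_IP}:6379/0'
result_backend = f'redis://{REDIS_IP}:6379/0'
task_serializer = 'json'
result_serializer = 'json'
accept_content = ['json']
timezone = 'UTC'

# With lowercase settings the key is beat_schedule. (CELERYBEAT_SCHEDULE is
# the legacy uppercase name; CELERY_BEAT_SCHEDULE only exists under Django's
# CELERY_-prefixed settings namespace.)
beat_schedule = {
    'run-every-15-seconds': {
        'task': 'scheduler.increment_value',
        'schedule': timedelta(seconds=15),
    },
}
```

The 'task' value must exactly match the name the worker registers, i.e. the task's module path plus function name when loaded via celery -A scheduler.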