Initially, I was using Redis installed locally without any password, and Celery tasks worked fine. Later, I switched to a Redis Docker container and set a password (PASSWORD) using the --requirepass flag. Since then, my Celery tasks have been failing to authenticate with the Redis container.
celery.py
import os
from celery import Celery
from dotenv import load_dotenv

load_dotenv()

# Redis password (currently hardcoded for testing)
redis_password = "PASSWORD"

# Debugging
print(f"Redis password: {redis_password}")

app = Celery(
    'connectors',
    broker=f'redis://:{redis_password}@localhost:6380/0',
    backend=f'redis://:{redis_password}@localhost:6380/0',
    include=['connectors.tasks.cricket_tasks']
)

# Load custom configuration
app.config_from_object('connectors.tasks.celeryconfig')

if __name__ == '__main__':
    app.start()
celeryconfig.py
from celery.schedules import crontab
from dotenv import load_dotenv

load_dotenv()

redis_password = "PASSWORD"

# Debugging
print(f"config password: {redis_password}")

broker_url = f'redis://:{redis_password}@localhost:6380/0'
result_backend = f'redis://:{redis_password}@localhost:6380/0'

accept_content = ['json']
result_accept_content = ['json']
task_serializer = 'json'
enable_utc = False
timezone = 'Asia/Kolkata'
task_time_limit = 300

task_annotations = {
    '*': {'rate_limit': '20/s'},
    'tasks.add': {'rate_limit': '10/s', 'time_limit': 60},
}

beat_schedule = {
    'run-daily-match-scheduler': {
        'task': 'connectors.tasks.cricket_tasks.run_match_scraper',
        'schedule': crontab(hour=8, minute=0),
    },
    'run-daily-table': {
        'task': 'connectors.tasks.cricket_tasks.schedule_today_table',
        'schedule': crontab(minute=30, hour=23),
    },
    'run-daily-mvp': {
        'task': 'connectors.tasks.cricket_tasks.schedule_today_mvp',
        'schedule': crontab(minute=30, hour=23),
    },
    'run-daily-scorecard': {
        'task': 'connectors.tasks.cricket_tasks.schedule_today_scorecard',
        'schedule': crontab(minute=30, hour=23),
    },
    'run-daily-btb': {
        'task': 'connectors.tasks.cricket_tasks.schedule_today_btb',
        'schedule': crontab(minute=30, hour=23),
    },
}
docker-compose.yml
services:
  qdrant:
    image: qdrant/qdrant:latest
    restart: always
    container_name: qdrant
    ports:
      - 6333:6333
      - 6334:6334
    expose:
      - 6333
      - 6334
      - 6335
    configs:
      - source: qdrant_config
        target: /qdrant/config/production.yaml
    volumes:
      - ./qdrant_data:/qdrant/storage
  redis:
    image: redis:latest
    hostname: redis
    ports:
      - "6379:6380"
    command: ["redis-server", "--requirepass", "PASSWORD"]

configs:
  qdrant_config:
    content: |
      log_level: INFO
I tried running my Celery worker using the command:
celery -A connectors.tasks.cricket_tasks worker --loglevel=info
It shows the Redis password being printed correctly:
Redis password: PASSWORD
config password: PASSWORD
And successfully detects the tasks:
[tasks]
. connectors.tasks.cricket_tasks.run_match_scraper
. connectors.tasks.cricket_tasks.schedule_today_btb
. connectors.tasks.cricket_tasks.schedule_today_matches
. connectors.tasks.cricket_tasks.schedule_today_mvp
. connectors.tasks.cricket_tasks.schedule_today_scorecard
. connectors.tasks.cricket_tasks.schedule_today_table
. connectors.tasks.cricket_tasks.test_task
However, I get the following error:
[2025-04-20 12:32:30,349: ERROR/MainProcess] consumer: Cannot connect to redis://localhost:6379/0: Connection closed by server..
Trying again in 2.00 seconds... (1/100)
So, I tried setting a password on my local Redis instance and retried, but still received the same error.
After reading this Stack Overflow post, I switched from local Redis to using a Dockerized Redis instance, and to avoid port conflicts, I mapped container port 6379 to host port 6380. Still, I ran into the same issue — Celery workers were unable to authenticate.
To ensure Redis was running and accepting connections, I tested it manually:
docker exec -it backend-redis-1 redis-cli
127.0.0.1:6379> auth PASSWORD
OK
127.0.0.1:6379> ping
PONG
Redis responded correctly, confirming it is running and the password is correct.
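Note that docker exec runs redis-cli inside the container, so this test bypasses the published-port mapping that Celery actually uses from the host. To also check the host side, I used a quick stdlib-only script (just a diagnostic sketch, not part of my project) to see whether anything is listening on the ports my broker URL targets:

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Ports from my setup: 6380 is what the Celery broker URL targets,
    # 6379 is where a local (non-Docker) Redis would listen.
    print("host port 6380 open:", port_open("127.0.0.1", 6380))
    print("host port 6379 open:", port_open("127.0.0.1", 6379))
```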
Despite all this, Celery still fails to authenticate.
In your docker-compose.yml you are binding ports as "6379:6380". Docker Compose port mappings are host:container, so this publishes host port 6379 to container port 6380. But Redis listens on port 6379 inside the container, so connections through this mapping never reach it. Change the mapping so the container side is 6379 (for example "6379:6379", or "6380:6379" if you want to keep host port 6380 to avoid conflicts with a local Redis) and point Celery at the matching host port. With a "6379:6379" mapping, the Celery config becomes:
app = Celery(
    'connectors',
    broker=f'redis://:{redis_password}@localhost:6379/0',
    backend=f'redis://:{redis_password}@localhost:6379/0',
    include=['connectors.tasks.cricket_tasks']
)
Also, if you ever run the Celery worker as a service in the same docker-compose file, give that service a depends_on entry so it starts after Redis:

depends_on:
  - redis
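Putting this together, a corrected redis service would look like the following (a sketch based on your compose file; PASSWORD stays a placeholder):

```yaml
  redis:
    image: redis:latest
    hostname: redis
    ports:
      - "6379:6379"   # host:container; Redis listens on 6379 inside the container
    command: ["redis-server", "--requirepass", "PASSWORD"]
```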