Introduction
Web applications often need to perform time-consuming tasks such as sending emails, processing payments, generating reports, or handling real-time notifications. Running these tasks synchronously can slow down your app and hurt user experience.
This is where Celery and Redis come in. Celery is a distributed task queue for running tasks asynchronously, while Redis serves as a message broker (or sometimes result backend). Together, they allow Django apps to handle background tasks efficiently, freeing up the main request/response cycle.
Why Use Celery with Django?
- Asynchronous execution: Offload heavy tasks to background workers.
- Scalability: Handle thousands of tasks concurrently.
- Retry mechanism: Automatically re-execute failed tasks.
- Scheduling: Run periodic tasks (like cron jobs).
How Celery Works
- Client (Django app) → Sends task to broker.
- Broker (Redis) → Queues the task.
- Workers (Celery processes) → Pick tasks from the queue and execute them.
- Result Backend (optional, Redis/Database) → Stores results.
Step 1: Install Celery and Redis
pip install celery redis
Make sure the Redis server is running. On macOS (Homebrew):
brew install redis
redis-server
On Debian/Ubuntu:
sudo apt install redis-server
sudo systemctl start redis-server
Verify it responds:
redis-cli ping   # should print PONG
Step 2: Configure Celery in Django
Create a celery.py file inside your project folder:
# project/celery.py
import os
from celery import Celery
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings')
app = Celery('project')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
Update __init__.py in the project folder:
# project/__init__.py
from .celery import app as celery_app
__all__ = ('celery_app',)
Step 3: Configure Redis as Broker in settings.py
CELERY_BROKER_URL = "redis://127.0.0.1:6379/0"
CELERY_RESULT_BACKEND = "redis://127.0.0.1:6379/0"
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
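Because config_from_object was called with namespace='CELERY' in Step 2, any Celery setting can be placed in settings.py by upper-casing it and adding the CELERY_ prefix. A few commonly useful extras (the values below are illustrative, not requirements):

```python
# settings.py -- optional extras; adjust values to your project
CELERY_TIMEZONE = "UTC"            # keep scheduled tasks aligned with your TIME_ZONE
CELERY_TASK_TRACK_STARTED = True   # report a STARTED state, not just PENDING/SUCCESS
CELERY_TASK_TIME_LIMIT = 300       # hard-kill any task running longer than 5 minutes
```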
Step 4: Create a Task
Inside any Django app, create a tasks.py file:
# app/tasks.py
from celery import shared_task
from time import sleep
@shared_task
def send_welcome_email(user_id):
    from django.contrib.auth.models import User
    from django.core.mail import send_mail

    user = User.objects.get(id=user_id)
    send_mail(
        "Welcome!",
        f"Hello {user.username}, thank you for registering!",
        "admin@example.com",
        [user.email],
        fail_silently=False,
    )
    return f"Email sent to {user.email}"
Step 5: Calling a Task
Instead of calling the function directly (which would run synchronously in the request), queue it:
send_welcome_email.delay(user.id)
- .delay() serializes the call and queues the task in Redis.
- A Celery worker picks it up and executes it asynchronously.
Step 6: Running Celery Worker
Start the worker process:
celery -A project worker -l info
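The worker command accepts flags to control parallelism and which queues it consumes (the queue names below are illustrative):

```shell
# run 4 worker processes and consume only the listed queues
celery -A project worker -l info --concurrency=4 -Q default,emails
```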
For scheduled tasks (periodic jobs), also run:
celery -A project beat -l info
Step 7: Example Use Cases in Django
- Sending Emails – Confirmation emails, newsletters, or password resets.
- Report Generation – Exporting CSV/PDF in background.
- Image/Video Processing – Resize images or transcode videos.
- Notifications – Push notifications, SMS, in-app alerts.
- Scheduled Jobs – Daily data cleanup, weekly summaries.
Step 8: Monitoring Celery Tasks
- Use Flower (a Celery monitoring tool):
pip install flower
celery -A project flower
Access at http://localhost:5555 to view tasks in real-time.
Best Practices
- Use retry policies for failed tasks:
@shared_task(bind=True, max_retries=3)
def process_payment(self, order_id):
    try:
        # process payment logic
        pass
    except Exception as e:
        raise self.retry(exc=e, countdown=60)
- Avoid blocking tasks inside Celery workers.
- Use task chaining & groups for workflows.
- Scale workers horizontally with multiple queues.
Conclusion
By integrating Celery with Redis, Django applications can handle asynchronous tasks efficiently, improve performance, and scale with growing user demand.