Run background jobs in Python with Celery and Redis



Hi folks! When I was working with Rails or Node.js, I was used to Sidekiq or Bull for background jobs. When I recently started using Python, I came across use cases where our backend system needed background processing. In this blog, I will discuss how we solved this with the help of Redis and Celery.


What is Celery?

Celery is an open-source, distributed task queue system for handling asynchronous and periodic tasks in Python applications.

What is Redis?

Redis is primarily an in-memory data store that can hold different data structures, such as lists and queues. Here, Redis acts as a message broker, distributing task messages between the Celery application and the Celery workers.


How do tasks get enqueued and executed under the hood?

  1. When you enqueue a task in Celery with apply_async (or one of the other calling methods), the Celery application serializes the task and sends its metadata and arguments as a message to a specific Redis queue.
  2. The Celery worker process continuously polls the Redis queue for new tasks. When it detects a new task message, it pulls and deserializes that message, reads the task metadata and arguments, and then executes the task.
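The two steps above can be sketched in plain Python. This is a simplified illustration only, not Celery's actual wire format (real messages carry extra headers, routing info, and use Redis list commands like LPUSH/BRPOP); the function and variable names here are made up for the sketch.

```python
import json
import uuid

# Step 1 sketch: serialize the task metadata and arguments into a JSON
# message and append it to a queue (a Redis list in real Celery).
def enqueue_sketch(queue, task_name, args, kwargs):
    message = {
        "id": str(uuid.uuid4()),   # unique task id
        "task": task_name,         # name the worker uses to look up the function
        "args": args,
        "kwargs": kwargs,
    }
    queue.append(json.dumps(message))

# Step 2 sketch: the worker pulls a message, deserializes it, looks up
# the registered task function, and executes it with the stored arguments.
def worker_sketch(queue, registry):
    raw = queue.pop(0)
    message = json.loads(raw)
    func = registry[message["task"]]
    return func(*message["args"], **message["kwargs"])

queue = []
registry = {"tasks.add": lambda a, b: a + b}
enqueue_sketch(queue, "tasks.add", [2, 3], {})
print(worker_sketch(queue, registry))  # 5
```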

Setting up the Celery application

We will write our code in a file called task.py and install these requirements in the environment:

celery
redis

settings.py

import os

CELERY_BROKER_URL = os.getenv('CELERY_BROKER_URL')  # Use Redis as the message broker
CELERY_ACCEPT_CONTENT = ["json"]
CELERY_TASK_SERIALIZER = "json"
CELERY_BROKER_CONNECTION_RETRY_ON_STARTUP = True
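The CELERY_BROKER_URL environment variable should hold a Redis URL. A typical value, assuming Redis on its default local port and database 0, has this shape (a quick stdlib check of the parts):

```python
from urllib.parse import urlparse

# Typical Redis broker URL shape (assumed local defaults).
broker_url = "redis://localhost:6379/0"
parts = urlparse(broker_url)
print(parts.scheme, parts.hostname, parts.port)  # redis localhost 6379
```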

celery.py

import os
from celery.schedules import crontab
from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "project.settings")

app = Celery("project_name")

# Load task modules from all registered Django app configs
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()

app.conf.beat_schedule = {
    "name_of_task": {
        "task": "task_function_path",
        "schedule": crontab(hour=0, minute=0),  # Runs every 24 hours at midnight
    },
}
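Besides the midnight schedule above, crontab supports other patterns; a few examples of the documented signature, shown as a config fragment:

```python
from celery.schedules import crontab

crontab(minute="*/15")                      # every 15 minutes
crontab(minute=0, hour="*/3")               # every 3 hours, on the hour
crontab(hour=7, minute=30, day_of_week=1)   # every Monday at 7:30
```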

task.py

import json

import redis
from celery import shared_task

redis_client = redis.StrictRedis(host="localhost", port=PORT, db=0, decode_responses=True)

@shared_task
def task_function_name():
    # Your task logic here
    redis_key = "any title"
    your_data = data  # whatever your task computed
    redis_client.set(redis_key, json.dumps(your_data))
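The task stores its result in Redis as a JSON string, which another process can read back later. The round trip can be sketched without a live Redis server by standing in a fake client with the same set/get shape (FakeRedis is a test double made up here; the real code uses redis-py):

```python
import json

# Stand-in for redis_client so the caching pattern is testable without
# a running Redis server (assumption: real code uses redis-py's set/get).
class FakeRedis:
    def __init__(self):
        self._store = {}

    def set(self, key, value):
        self._store[key] = value

    def get(self, key):
        return self._store.get(key)

cache = FakeRedis()
payload = {"status": "ok", "count": 3}
cache.set("daily_report", json.dumps(payload))

# A later reader deserializes the cached JSON back into Python objects.
restored = json.loads(cache.get("daily_report"))
print(restored["count"])  # 3
```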

The next step is to run the Celery worker on our server or system. For this, run the following command in the terminal:

celery -A project_name worker --loglevel=info --pool=solo
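Note that the beat_schedule defined in celery.py only fires if a beat scheduler process is running alongside the worker; it is started as a separate process with Celery's beat command:

```shell
# Start the periodic-task scheduler (separate process from the worker)
celery -A project_name beat --loglevel=info
```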
