Celery 5.2.6 with Django Server Configuration for Ubuntu 20.04

Configuration tends to change with every major Celery release, so setup instructions go stale quickly.

In this tutorial I am going to set up Celery 5.2.6 with Django 4.0.

Considering this project structure:

- proj/
  - manage.py
  - proj/
    - __init__.py
    - settings.py
    - urls.py

Let's install Celery and its Django companions first:

pip install celery
pip install django-celery-beat
pip install django-celery-results

Run the migrations (note: the tables for django_celery_beat and django_celery_results are only created once those apps are added to INSTALLED_APPS below, so re-run this command after updating settings.py):

python3 manage.py migrate

Add a celery.py file in the inner proj folder, next to settings.py:

# proj/celery.py
from __future__ import absolute_import, unicode_literals  # only needed on Python 2

import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
# this is also used in manage.py
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

# Get the base Redis URL, falling back to Redis' default local URL
BASE_REDIS_URL = os.environ.get('REDIS_URL', 'redis://localhost:6379')

app = Celery('proj')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

app.conf.broker_url = BASE_REDIS_URL

# this allows you to schedule items in the Django admin.
app.conf.beat_scheduler = 'django_celery_beat.schedulers.DatabaseScheduler'

from celery.schedules import crontab

app.conf.beat_schedule = {
    'fetch-locations-every-fifteen-minutes': {
        'task': 'fetch_locations_ip',
        'schedule': crontab(minute='*/15'),
    },
}
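Because we passed namespace='CELERY' to config_from_object, the broker and scheduler options above could equally live in settings.py under a CELERY_ prefix. A sketch, assuming the local Redis URL from earlier:

```python
# settings.py -- equivalent to setting app.conf directly in celery.py
CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_BEAT_SCHEDULER = 'django_celery_beat.schedulers.DatabaseScheduler'
```

Keeping everything in settings.py is handy when the values differ per environment.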

The beat schedule above registers the fetch_locations_ip task to run every fifteen minutes.

Now add these lines to __init__.py in the same proj directory so the Celery app is loaded when the project starts:

from .celery import app as celery_app

__all__ = ('celery_app',)

Update settings.py with the following changes:

INSTALLED_APPS += [
    'django_celery_beat',
    'django_celery_results',
]

CELERY_RESULT_BACKEND = "django-db"

Now let's write a tasks.py in any installed app's directory, with the function behind the fetch_locations_ip task:

import json
from celery import shared_task


@shared_task(name="fetch_locations_ip")
def fetch_locations():
    # rest of the function goes here
    pass
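The body is left up to you. As a hedged sketch, the task might fetch a geolocation API response and hand it to a plain helper like this; the field names 'query', 'country', and 'city' are assumptions about the provider's JSON, so adjust them for your API:

```python
import json


def parse_location(payload: str) -> dict:
    """Pick the fields we care about out of a geolocation API response."""
    data = json.loads(payload)
    return {
        "ip": data.get("query"),
        "country": data.get("country"),
        "city": data.get("city"),
    }


# A fabricated response purely for illustration:
sample = '{"query": "203.0.113.7", "country": "Pakistan", "city": "Lahore"}'
print(parse_location(sample)["ip"])  # 203.0.113.7
```

Keeping the parsing in a plain function makes it testable without a running worker; the shared_task body just performs the HTTP call and delegates to it.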

Now we can check the Celery worker and Celery beat by running the following commands:

celery -A proj worker -l INFO # For deeper logs use DEBUG
celery -A proj beat -l INFO

Now let's deploy our code to an Ubuntu 20.04 machine where Django is already configured behind Nginx.
There are many ways to daemonize the Celery worker and beat, but I will use systemd because it's built in.

Let's first create a new user and group for Celery:

sudo useradd celery -d /home/celery -s /bin/bash
sudo mkhomedir_helper celery

sudo mkdir /var/log/celery
sudo chown -R celery:celery /var/log/celery
sudo chmod -R 755 /var/log/celery

sudo mkdir /var/run/celery
sudo chown -R celery:celery /var/run/celery
sudo chmod -R 755 /var/run/celery
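One caveat: on Ubuntu, /var/run is a tmpfs and is wiped on every reboot, so the directory created above disappears. A sketch of one way to recreate it automatically, using systemd-tmpfiles (which ships with Ubuntu 20.04); the file path is your choice:

```
# /etc/tmpfiles.d/celery.conf
d /var/run/celery 0755 celery celery -
d /var/log/celery 0755 celery celery -
```

systemd applies this at boot, so the PID directory exists before the services start.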

Next, create the Celery configuration file:

sudo nano /etc/default/celeryd

#   most people will only start one node:
CELERYD_NODES="worker1"
# but you can also start multiple and configure settings
# for each in CELERYD_OPTS
#CELERYD_NODES="worker1 worker2 worker3"
# alternatively, you can specify the number of nodes to start:
#CELERYD_NODES=10

# Absolute or relative path to the 'celery' command:
CELERY_BIN="/home/ubuntu/project/env/bin/celery"
#CELERY_BIN="/virtualenvs/def/bin/celery"

# App instance to use
# comment out this line if you don't use an app
CELERY_APP="proj"
# or fully qualified:
#CELERY_APP="proj.tasks:app"

# Where to chdir at start.
CELERYD_CHDIR="/home/ubuntu/project/OneCard/"

# Extra command-line arguments to the worker
CELERYD_OPTS="--time-limit=300 --concurrency=8"
# Configure node-specific settings by appending node name to arguments:
#CELERYD_OPTS="--time-limit=300 -c 8 -c:worker2 4 -c:worker3 2 -Ofair:worker1"

# Set logging level to DEBUG
#CELERYD_LOG_LEVEL="DEBUG"

# %n will be replaced with the first part of the node name,
# %I with the pool process index (keeps child processes from sharing one log).
CELERYD_LOG_FILE="/var/log/celery/%n%I.log"
CELERYD_PID_FILE="/var/run/celery/%n.pid"

# Workers should run as an unprivileged user.
# You need to create this user manually (or you can choose
# a user/group combination that already exists, e.g., nobody).
CELERYD_USER="celery"
CELERYD_GROUP="celery"
CELERYD_LOG_LEVEL="INFO"
# If enabled PID and log directories will be created if missing,
# and owned by the userid/group configured.
CELERY_CREATE_DIRS=1
# Options for Celery Beat
CELERYBEAT_PID_FILE="/var/run/celery/beat.pid"
CELERYBEAT_LOG_FILE="/var/log/celery/beat.log"
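One pitfall with this file: it must use plain ASCII double quotes. Rich-text editors often paste in curly quotes, which the shell treats as literal characters, so Celery ends up looking for an app literally named “proj”. A quick illustration, using Python's shlex as a stand-in for the shell's own word splitting:

```python
import shlex

good = 'CELERY_APP="proj"'
bad = 'CELERY_APP=\u201cproj\u201d'  # curly quotes pasted from a word processor

print(shlex.split(good))  # ['CELERY_APP=proj'] -- quotes stripped, as intended
print(shlex.split(bad))   # the curly quotes survive inside the value
```

If the worker fails with an "invalid app" error right after editing this file, check the quoting first.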

Now let's create the Celery worker service:

sudo nano /etc/systemd/system/celery.service

[Unit]
Description=Celery Service
After=network.target

[Service]
Type=forking
User=celery
Group=celery

EnvironmentFile=/etc/default/celeryd
WorkingDirectory=/home/ubuntu/project/OneCard
ExecStart=/bin/sh -c '${CELERY_BIN} -A ${CELERY_APP} multi start ${CELERYD_NODES} \
    --pidfile=${CELERYD_PID_FILE} --logfile=${CELERYD_LOG_FILE} \
    --loglevel="${CELERYD_LOG_LEVEL}" $CELERYD_OPTS'
ExecStop=/bin/sh -c '${CELERY_BIN} multi stopwait ${CELERYD_NODES} \
    --pidfile=${CELERYD_PID_FILE}'
ExecReload=/bin/sh -c '${CELERY_BIN} -A ${CELERY_APP} multi restart ${CELERYD_NODES} \
    --pidfile=${CELERYD_PID_FILE} --logfile=${CELERYD_LOG_FILE} \
    --loglevel="${CELERYD_LOG_LEVEL}" $CELERYD_OPTS'

[Install]
WantedBy=multi-user.target

Now the same for Celery beat:

sudo nano /etc/systemd/system/celerybeat.service

[Unit]
Description=Celery Beat Service
After=network.target

[Service]
Type=simple
User=celery
Group=celery
EnvironmentFile=/etc/default/celeryd
WorkingDirectory=/home/ubuntu/project/OneCard
ExecStart=/bin/sh -c '${CELERY_BIN} -A ${CELERY_APP} beat \
    --pidfile=${CELERYBEAT_PID_FILE} \
    --logfile=${CELERYBEAT_LOG_FILE} \
    --loglevel=${CELERYD_LOG_LEVEL} \
    --schedule=/home/celery/celerybeat-schedule'
Restart=always

[Install]
WantedBy=multi-user.target

Now reload the systemd daemon, which is needed whenever you change unit files:

sudo systemctl daemon-reload

Enable both services so they start on boot:

sudo systemctl enable celery
sudo systemctl enable celerybeat

Now start:

sudo systemctl start celery
sudo systemctl start celerybeat

If there is any error, you can check the status:

sudo systemctl status celery
sudo systemctl status celerybeat

or check the log files configured in /etc/default/celeryd.

Cheers!!


I’m a Lead Full Stack Engineer who enjoys building things on internet, fuelled by instant coffee and driven by passion. https://saaadmirza.net
