
How to Configure Celery within Django

Celery is an open source asynchronous task queue/job queue based on distributed message passing[1].
Within this article we will provide the configuration steps required to install and configure Celery within a Django project.

Example

Broker

Within this example we will use Redis as the broker. Celery uses a broker to pass messages between your application and Celery worker processes[2].
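As a rough sketch (plain Python, not Celery itself), the broker pattern works like this: the client enqueues a task message instead of calling the function directly, and a worker consumes the message and executes the task.

```python
from queue import Queue

# Toy illustration of a broker: it holds task messages, a worker pops
# them off and executes them. Celery/Redis do this across processes.
broker = Queue()

def do_something():
    return 1 + 1

# "Client" side: enqueue a task message (name, args, kwargs).
broker.put(("do_something", (), {}))

# "Worker" side: consume the message and run the matching task.
name, args, kwargs = broker.get()
result = {"do_something": do_something}[name](*args, **kwargs)
print(result)  # 2
```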

Layout

The layout in this example is based on a project called projectx and an app called appy:

    |-- projectx
    |   |-- __init__.py
    |   |-- celery.py
    |   |-- settings.py
    |   |-- urls.py
    |   |-- views.py
    |   |-- wsgi.py
    |-- manage.py
    `-- appy
        |-- __init__.py
        |-- base.py
        |-- main.py
        `-- tasks.py

Install

First of all we install our packages.

yum install redis-server
pip install celery[redis]
pip install django-celery

Project Configuration

Next we configure the necessary files within our project.

projectx/__init__.py

This import ensures that the Celery app is loaded each time Django starts.

from __future__ import absolute_import
from .celery import app as celery_app

projectx/settings.py

We then add the necessary Celery settings to Django's settings file.

BROKER_URL = 'redis://localhost:6379/0'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'

INSTALLED_APPS += ("djcelery", )
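The serializer settings above mean that task arguments and results must be JSON-serializable. A quick stdlib check (not Celery-specific) shows what survives a JSON round trip:

```python
import json
import datetime

# With CELERY_TASK_SERIALIZER = 'json', anything passed to a task must
# survive a JSON round trip; primitives, lists and dicts are fine.
payload = {"count": 3, "items": ["a", "b"]}
assert json.loads(json.dumps(payload)) == payload

# Objects such as datetimes are not JSON-serializable and raise TypeError,
# so convert them (e.g. to ISO strings) before passing them to a task.
try:
    json.dumps(datetime.datetime.now())
except TypeError:
    print("not JSON-serializable")
```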

projectx/celery.py

Next we create a file called celery.py, in which we define our Celery app.

from __future__ import absolute_import
   
import os
from celery import Celery
  
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'projectx.settings')

from django.conf import settings

app = Celery('projectx')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

App Configuration

appy/tasks.py

Within our app we create a file called tasks.py. Within tasks.py a simple function is created using the @shared_task decorator. This is the 'task' that will be sent to the task queue.

from celery import shared_task

@shared_task()
def do_something():
    return 1 + 1

Running the Worker

We then start a Celery worker, pointing it at our app:

/opt/pythonenv/projectx/bin/python2.7 /opt/pythonenv/projectx/bin/celery --app=projectx.celery:app worker --loglevel=INFO

Test

Now that everything is configured, we can test that it is working. We import our task and send it to the queue.

>>> from appy.tasks import do_something
>>> do_something.delay()
<AsyncResult: 99e35cf7-0a2d-4781-b446-2e3dd45a5c16>

Reference

[1] https://django-celery.readthedocs.org/en/2.4/introduction.html

[2] http://michal.karzynski.pl/blog/2014/05/18/setting-up-an-asynchronous-task-queue-for-django-using-celery-redis/

Tags: Python, Django, Celery, Redis

About the Author


R Donato

Rick Donato is the Founder and Chief Editor of Fir3net.com. He currently works as a Principal Network Security Engineer and has a keen interest in automation and the cloud.

You can find Rick on Twitter @f3lix001