
Auto-reload Development Mode — For celery worker using docker-compose and Django management commands.

Last updated: Jan 17, 2019

If you are using docker-compose for Django projects with Celery workers, I can feel your frustration, and here is a possible solution to that problem.

Celery provided auto-reload support up to version 3.1, but the feature was dropped because the maintainers ran into issues and had a hard time implementing it reliably, as you can read in the related GitHub issue. So now we are on our own.

Syncing Code with Docker Volumes

Docker provides a feature called volumes, which lets us mount our source code into the container. With the code mounted, we can use the `python manage.py runserver` command and avoid rebuilding the image or restarting the container every time the code changes. The following is a common way to do it:

web:
  build: .
  restart: always
  env_file:
    - .env
  entrypoint: ./docker-entrypoint-local.sh
  volumes:
    - .:/code
  ports:
    - "8082:8000"
  links:
    - postgres
    - rabbitmq

Docker Entrypoint Configuration

As you can see, I have used `volumes` to sync my current directory with the `/code` directory of the container, so the container always runs the latest code. The entrypoint file is as follows:

#!/bin/sh

./wait-for-it.sh db:5432
python3 manage.py makemigrations
python3 manage.py migrate
python3 my_celery_project/setup_db.py
python3 manage.py collectstatic --noinput

python3 manage.py runserver 0.0.0.0:8000

So now the code is in continuous sync, we can use the `runserver` command to make our development process faster, and we can apply the same trick to auto-reload the Celery worker. You can see Django's `runserver` command file on GitHub.

Django also provides a way to create custom management commands for our apps. To create a command, you will need to create a `management` package with a `commands` package inside it, as sketched below.

Django apps package hierarchy for `celery` command
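
In other words, the hierarchy looks roughly like this; the app name `my_app` is just a placeholder, but the `management` and `commands` packages (each with an `__init__.py`) are the layout Django expects for custom commands:

my_celery_project/
    my_app/
        management/
            __init__.py
            commands/
                __init__.py
                celery.py    # the management command shown below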

Creating a Celery Management Command

Now, you will need to create a `celery.py` file in the `commands` package, as you can see in the image, and copy and paste the following code.

import shlex
import subprocess

from django.core.management.base import BaseCommand
from django.utils import autoreload


def restart_celery(*args, **kwargs):
    # Kill any running worker, then start a fresh one.
    # (pkill is provided by the procps package on Debian-based images.)
    kill_worker_cmd = 'pkill -9 celery'
    subprocess.call(shlex.split(kill_worker_cmd))
    start_worker_cmd = 'celery -A my_celery_project worker -l info'
    subprocess.call(shlex.split(start_worker_cmd))


class Command(BaseCommand):

    def handle(self, *args, **options):
        self.stdout.write('Starting celery worker with autoreload...')
        # Django's autoreloader watches the project files and calls
        # restart_celery whenever one of them changes (Django <= 2.1 API).
        autoreload.main(restart_celery, args=None, kwargs=None)
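
Note that `autoreload.main()` exists only in Django 2.1 and earlier; the autoreloader was rewritten in Django 2.2 and the entry point is now `autoreload.run_with_reloader()`. On newer Django versions, a minimal sketch of the same command could look like this:

import shlex
import subprocess

from django.core.management.base import BaseCommand
from django.utils import autoreload


def restart_celery():
    # Kill any running worker, then start a fresh one.
    subprocess.call(shlex.split('pkill -9 celery'))
    subprocess.call(shlex.split('celery -A my_celery_project worker -l info'))


class Command(BaseCommand):

    def handle(self, *args, **options):
        self.stdout.write('Starting celery worker with autoreload...')
        # run_with_reloader() re-runs restart_celery whenever a watched file changes.
        autoreload.run_with_reloader(restart_celery)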

Configuring Celery Auto-Restart in Docker Compose

As you can see, we make use of the same auto-reload functionality that Django built to restart its local development server. You can use this command in the docker-compose file as follows:

worker:
  build: .
  restart: always
  env_file:
    - .env
  command: python3 manage.py celery
  volumes:
    - .:/code
  links:
    - rabbitmq
    - postgres

The worker will now restart itself whenever you change the code.


Authors

Aniket Patel

Software Engineer
I’m a Software Engineer with a passion for building efficient, scalable solutions. I focus on writing clean, maintainable code and solving complex technical challenges. I’m always learning and exploring new tools to improve my craft and deliver high-quality software.

