Celery is a great tool for running asynchronous Django tasks, but it can be complicated to configure. One use case I often face is running multiple web applications on the same server, each with their own Celery daemon.
The web apps are typically completely unrelated and may be running different versions of Django, Celery and other packages in separate virtualenvs. For this reason, I also want to keep their Celery backends separated.
There are quite a few ways to do it:
To use a local Redis server as the Celery backend, all you need in Django's settings.py is this:
BROKER_URL = 'redis://'
CELERY_DEFAULT_QUEUE = 'myapp'
Another web application would then use a different name for the queue:
BROKER_URL = 'redis://'
CELERY_DEFAULT_QUEUE = 'anotherapp'
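Nothing else changes in the application code: with CELERY_DEFAULT_QUEUE set, every task is published to that queue unless you explicitly route it elsewhere. As a minimal sketch (the add task and its module are just hypothetical examples, using the old-style task decorator from the django-celery era):

# myapp/tasks.py -- nothing queue-specific is needed here;
# CELERY_DEFAULT_QUEUE in settings.py decides where the task ends up.
from celery.task import task

@task
def add(x, y):
    return x + y

Calling add.delay(2, 3) from within myapp will push the message onto the myapp queue.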
When you run the Celery daemon processes, each just needs to know which queue it's watching:
manage.py celeryd -Q myapp
manage.py celeryd -Q anotherapp
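Since, as mentioned above, each application lives in its own virtualenv, make sure each daemon is started with the right interpreter; the full invocations might look something like this (the paths are hypothetical):

/srv/myapp/env/bin/python /srv/myapp/manage.py celeryd -Q myapp
/srv/anotherapp/env/bin/python /srv/anotherapp/manage.py celeryd -Q anotherapp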
How is all this stored in the Redis database? If you list the keys, you will see a new entry appear:
1) "_kombu.binding.celery"
That key is a SET; its members (use the Redis SMEMBERS command to list them) are the names of all the configured queues, something like this:
1) "celery\x06\x16\x06\x16myapp"
2) "celery\x06\x16\x06\x16anotherapp"
Whenever a new task is created, a Redis key temporarily appears for its queue:
2) "myapp"
As the Celery daemon picks it up, the key is immediately deleted, so you won't usually see it unless you stop the daemon.
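If you do stop the daemon and create a task, you can check that the queue key is an ordinary Redis list holding the pending messages; a sketch, again with redis-py:

import redis

r = redis.Redis()
print(r.type('myapp'))           # should report 'list' while a task is waiting
print(r.llen('myapp'))           # number of queued messages
print(r.lrange('myapp', 0, -1))  # the raw serialized task messages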
Note that the queue names are used as top-level Redis keys without any prefix, so choose them wisely: a generic name like 'tasks' could easily collide with a key used by something else in the same database.