Using Django Async from the command line

Created 23rd September, 2014 05:12 (UTC), last edited 23rd September, 2014 11:28 (UTC)


To run jobs on the queue you can use the flush_queue management command.

python manage.py flush_queue

flush_queue will run once through the jobs that are scheduled to run at that time, but will exit early if any job raises an exception. Normally you would use it from an external script that simply keeps re-running the command.

while :; do ( python manage.py flush_queue && sleep 10 ) ; done

Jobs are executed in priority order first (higher numbers execute earlier), then by scheduled time (only jobs whose scheduled time has arrived will run, and unscheduled jobs go last), and finally in ID order (which should be the order they were added). A failed job is re-scheduled for later execution.
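The ordering described above can be sketched as a Python sort key. This is an illustrative model only, not django-async's actual implementation; the field names are made up for the example.

```python
# Sketch of the flush_queue ordering: priority descending, then scheduled
# time ascending with unscheduled jobs last, then ID ascending.
# (Illustrative only -- field names and representation are assumptions.)
from datetime import datetime

def job_sort_key(job):
    return (
        -job["priority"],                   # higher priority runs earlier
        job["scheduled"] is None,           # unscheduled jobs sort after scheduled ones
        job["scheduled"] or datetime.max,   # earlier scheduled time runs earlier
        job["id"],                          # finally, insertion order
    )

jobs = [
    {"id": 1, "priority": 0, "scheduled": datetime(2014, 9, 23, 5, 0)},
    {"id": 2, "priority": 5, "scheduled": None},
    {"id": 3, "priority": 0, "scheduled": None},
]
ordered = sorted(jobs, key=job_sort_key)
# the priority-5 job runs first, then the scheduled job, then the unscheduled one
```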

Limiting the number of jobs in one execution

To limit problems caused by potential memory leaks (for example, running with DEBUG=True, which makes Django keep a record of every query), the number of jobs executed in one run is capped, by default at 300.

python manage.py flush_queue -j 300
python manage.py flush_queue --jobs 300

Multiple workers

It is also possible to run more than one queue processor, with each one taking a different block of jobs to execute. You need to tell each run which block of jobs it should choose and how many workers are in use.

python manage.py flush_queue -w 1 -o 2
python manage.py flush_queue -w 2 -o 2
python manage.py flush_queue --which 1 --outof 2
python manage.py flush_queue --which 2 --outof 2

Jobs are allocated to workers based on their job IDs.
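One plausible way to picture this allocation is a modulus over the job ID, so that each job belongs to exactly one worker. The exact formula django-async uses may differ; this is a sketch under that assumption.

```python
# Illustrative sketch only: assumes jobs are split by job ID modulo the
# worker count, which may not match django-async's actual scheme.
def jobs_for_worker(job_ids, which, outof):
    """Job IDs the 1-based worker `which` of `outof` would claim."""
    return [jid for jid in job_ids if jid % outof == which - 1]

all_ids = list(range(1, 11))
worker1 = jobs_for_worker(all_ids, 1, 2)  # one half of the IDs
worker2 = jobs_for_worker(all_ids, 2, 2)  # the other half
# together the two workers cover every job exactly once
```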


Queue health can be checked via the queue_health command:

python manage.py queue_health