For Windows users, you can get a Redis executable from here
After installing, check whether it is working correctly.
It should respond with
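The standard check is to ping the Redis server from the command line:

```shell
# Ping the local Redis server; a healthy installation replies with PONG
redis-cli ping
```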
Also install the Python client package for Redis
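Assuming you are using pip inside your virtualenv, the client is installed with:

```shell
pip install redis
```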
First Steps with Django
Now that you have successfully installed the packages, let’s get our hands on the Django project
Add the following settings configuration to your settings.py
Make sure you replace YOUR_TIMEZONE with your own timezone. You can find your timezone from here
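As a minimal sketch, the settings might look like this (the broker and result-backend URLs assume a Redis instance running locally on the default port, and the timezone value is a placeholder):

```python
# settings.py (sketch) -- assumes a local Redis on the default port 6379
CELERY_BROKER_URL = "redis://localhost:6379/0"
CELERY_RESULT_BACKEND = "redis://localhost:6379/0"
CELERY_TIMEZONE = "YOUR_TIMEZONE"  # replace with your timezone, e.g. "Asia/Kolkata"
```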
Create a celery.py file in your main Django project directory
Add the following code to the celery.py module. This module is used to define the Celery instance.
Make sure you replace the placeholder (<your project name>) with your Django project’s name
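The standard Celery setup module for a Django project looks like this (keep <your project name> as a placeholder for your project package):

```python
# celery.py -- defines the Celery application instance for the project
import os

from celery import Celery

# Point Celery at the Django settings module before creating the app
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "<your project name>.settings")

app = Celery("<your project name>")

# Read any CELERY_* settings from settings.py
app.config_from_object("django.conf:settings", namespace="CELERY")

# Auto-discover tasks.py modules in all installed apps
app.autodiscover_tasks()
```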
Then we need to import the app defined in celery.py into the __init__.py of your main project directory. This ensures that the app is loaded when the Django project starts
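This is the conventional two-line import in the project package’s __init__.py:

```python
# __init__.py of the main project package
from .celery import app as celery_app

__all__ = ("celery_app",)
```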
Now let’s create some tasks
Create a new tasks.py file in any app registered in INSTALLED_APPS
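For illustration, a minimal tasks.py might contain a single task; the `add` function here is just a hypothetical example:

```python
# tasks.py -- example task; `add` is an illustrative placeholder
from celery import shared_task


@shared_task
def add(x, y):
    # A trivial task: the worker computes and stores the sum
    return x + y
```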
Starting Worker Process
Open a NEW terminal, change to your main project directory (i.e. the directory where the manage.py file is placed), activate your virtualenv (if you created one), and run the following command to start the Celery worker instance.
Replace the placeholder project name with your own project’s name
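The worker is started with the `celery` command, pointed at your project with `-A`:

```shell
# Start a Celery worker (replace the placeholder with your project name)
celery -A <your project name> worker -l info
```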
You should see output like this
NOTE: Check the [tasks] section above; it should contain the name of the task you created in the tasks.py module!
For more detailed logs, you can also run the worker instance in DEBUG mode
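This is the same worker command with the log level raised to debug:

```shell
# Same worker, but with debug-level logging for more detail
celery -A <your project name> worker -l debug
```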
NOTE: DO NOT CLOSE THIS TERMINAL, IT SHOULD REMAIN OPEN!!
Testing the Task
Now let’s run the tasks from the Django shell
Open up your Django shell
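The Django shell is started through manage.py:

```shell
python manage.py shell
```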
And run the function with delay().
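For example, with the hypothetical `add` task from earlier (the app name `your_app` is an assumption; use the app where you created tasks.py):

```python
# Inside the Django shell; `your_app` is a placeholder for your app's name
from your_app.tasks import add

# delay() queues the task on the broker and returns an AsyncResult immediately
result = add.delay(4, 5)
```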
When you check the second terminal, where your Celery worker instance is running, you will see output like this, showing that your tasks have been received and completed successfully
We often need to run tasks periodically in a Django project. Celery fulfills this need with Celery Beat, which is nothing but a scheduler: it kicks off its target tasks at regular intervals, and schedules can be defined both implicitly and explicitly.
Please ensure that only a single scheduler is running for a schedule at any time; otherwise, you’d end up with duplicate tasks
Set the timezone in settings.py according to your time zone, which we have already done earlier in this tutorial.
We can create periodic tasks in two ways: either by manually adding the scheduler code in celery.py, or by installing the django-celery-beat package, which allows us to create schedules in the Django admin
1. Writing scheduler manually
Add the following schedule configuration to your celery.py file
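As a sketch, a beat schedule attached to the `app` object from celery.py might look like this (the task path and timing below are assumptions; point them at your own task):

```python
# celery.py (continued) -- sample beat schedule; task path and timing are placeholders
from celery.schedules import crontab

app.conf.beat_schedule = {
    "add-every-minute": {
        "task": "your_app.tasks.add",  # placeholder dotted path to your task
        "schedule": crontab(),         # crontab() with no arguments runs every minute
        "args": (4, 5),
    },
}
```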
Open a NEW terminal and run the following command
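The beat process is started much like the worker:

```shell
# Start the Celery beat scheduler (replace the placeholder with your project name)
celery -A <your project name> beat -l info
```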
Make sure the worker process is running in a separate terminal
You will see output in the terminal where you started the celery beat process
And now, if you look at the worker process terminal, you will find the tasks running periodically!
2. Using django-celery-beat
Let’s now do the same thing with django-celery-beat
1. Install django-celery-beat
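Assuming pip inside your virtualenv:

```shell
pip install django-celery-beat
```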
2. Add it to INSTALLED_APPS
Add the django_celery_beat module to INSTALLED_APPS in your Django project’s settings.py:
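The entry goes alongside your existing apps:

```python
# settings.py
INSTALLED_APPS = [
    # ... your existing apps ...
    "django_celery_beat",
]
```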
3. Run the Django migrations
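This creates the database tables that django-celery-beat uses to store schedules:

```shell
python manage.py migrate
```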
Note: The database scheduler won’t reset when timezone-related settings change, so you must do this manually:
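The reset documented by django-celery-beat is to clear each periodic task’s last run time, e.g. from the Django shell:

```python
# Run inside `python manage.py shell` after changing timezone settings
from django_celery_beat.models import PeriodicTask

PeriodicTask.objects.update(last_run_at=None)
```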
Now go to your Django admin and create a Periodic Task as follows
Choose any name, select the task you created, and create a crontab schedule according to your needs.
Please refer to the guide or some of the crontab examples from here.
Run the celery beat process in a new terminal with the --scheduler option
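The option points beat at the database-backed scheduler provided by django-celery-beat:

```shell
# Use the database scheduler so tasks created in the Django admin are picked up
celery -A <your project name> beat -l info --scheduler django_celery_beat.schedulers:DatabaseScheduler
```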
Make sure the worker process is running in a separate terminal, alongside the Django server and the celery beat process
Check the output logs in both of the terminals.
Running on Production
Now that Celery is running perfectly locally, the last thing we need to take care of is production. A question arises here: how can we keep both of these processes (beat and worker) running all the time on our production server? Both are needed to keep Celery working.
This is where Supervisor comes in handy: it helps run both of the instances separately.
Supervisor is a client/server system that lets its users control processes and keep them running on any UNIX-like operating system, so we can use it to run the Celery processes.
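As a sketch, a Supervisor configuration for the two processes might look like this (all paths, program names, and the config file location below are assumptions; adjust them for your server):

```ini
; /etc/supervisor/conf.d/celery.conf (sketch) -- all paths and names are placeholders
[program:celery_worker]
command=/path/to/venv/bin/celery -A <your project name> worker -l info
directory=/path/to/your/project
autostart=true
autorestart=true

[program:celery_beat]
command=/path/to/venv/bin/celery -A <your project name> beat -l info
directory=/path/to/your/project
autostart=true
autorestart=true
```

With `autorestart=true`, Supervisor restarts either process if it crashes, which is exactly the always-on behavior production needs.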