If your main app process has been successfully deployed on Sevalla, adding a background task worker only requires a few additional steps. For this example, we use Celery and Redis; however, the same approach can be applied to other task queues, such as Huey and RQ. To start, add both celery and redis to your dependencies.
pip install celery redis
pip freeze > requirements.txt
Then, create a file called celery.py in your project directory. The following is the same starter code used by the Celery documentation.
# celery.py
import os

from celery import Celery

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'example.settings')

app = Celery('example')

# Read Celery configuration from Django settings; all Celery-related
# settings keys use the CELERY_ prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django apps.
app.autodiscover_tasks()

@app.task(bind=True, ignore_result=True)
def debug_task(self):
    print(f'Request: {self.request!r}')
The project name needs to be updated from example to your project name in both os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'example.settings') and app = Celery('example'). In the __init__.py file in the same directory, you’ll need to import the celery app object.
# __init__.py
from .celery import app as celery_app

__all__ = ('celery_app',)
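The app.autodiscover_tasks() call in celery.py picks up a tasks.py module from each app listed in INSTALLED_APPS, so your own tasks can live alongside the rest of your app code. As a minimal sketch (the myapp name and the add task are hypothetical, not part of this guide's project):
# myapp/tasks.py
from celery import shared_task

@shared_task
def add(x, y):
    # Runs on the background worker, not in the web process
    return x + y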
In your settings.py file, you’ll need to add your broker URL to the CELERY_BROKER_URL setting. Since Redis is used in this example, you can use the REDIS_URL environment variable as the value.
CELERY_BROKER_URL = env('REDIS_URL') 
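The env helper above assumes your settings module already reads environment variables with a utility such as django-environ. If it doesn't, a minimal sketch of that setup might look like the following (django-environ is an assumption here; the standard library's os.environ works just as well):
# settings.py
import environ

env = environ.Env()

CELERY_BROKER_URL = env('REDIS_URL')
# Or, with the standard library only:
# import os
# CELERY_BROKER_URL = os.environ['REDIS_URL']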
Within Sevalla, you can create a Redis database and connect it to your app. Make sure the REDIS_URL environment variable is added to your app when you connect the Redis service. To start the Celery worker, create a new background worker process with the following start command:
celery -A example worker -c 1 -l INFO
The -A example flag references your project name, so update it accordingly. Concurrency (-c) is set to one here to avoid overusing your resources; you can adjust the value to match the needs of your app. Once you deploy your app, you'll see Celery startup information in your logs.
To test the debug task defined above, open the web terminal and trigger it manually: activate the virtual environment and run the task from the Django shell.
. /opt/venv/bin/activate
python manage.py shell
Inside the shell:
from example.celery import debug_task
debug_task.delay()
Adding .delay() to the debug_task call sends the task to the Celery broker instead of running it directly; the worker picks it up, and you can see the output message in the runtime logs.
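The same pattern applies to your own tasks. For example, assuming the hypothetical add task from the tasks.py sketch above:
from myapp.tasks import add

result = add.delay(2, 3)   # returns an AsyncResult immediately; the worker runs the task
print(result.id)           # the task ID; the return value appears in the worker's runtime logs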