The built-in FastAPI background tasks functionality works with Sevalla, but it is designed primarily for lightweight tasks because it runs in the same process as your app. If you need to handle background tasks outside of your app process, you can create a worker to manage them for you. If your main app has been successfully deployed on Sevalla, adding a background task worker only requires a few additional steps.

For this example, we use Celery and Redis; the same approach applies to other task queues, such as Huey and RQ. Add both celery and redis to your dependencies:
pip install celery redis
pip freeze > requirements.txt
Create a new file called app/worker.py. This file holds the Celery object and its tasks. The debug_task below is an example that demonstrates Celery is working properly.
# app/worker.py
from celery import Celery
from app.core.config import settings

celery = Celery(__name__)
celery.config_from_object(settings, namespace='CELERY')

@celery.task(bind=True)
def debug_task(self):
    print(f'Request: {self.request!r}')
Celery loads its configuration from the app’s settings object, so you’ll need to update the settings model in config.py. Since Redis will serve as the task broker, set the broker configuration value to REDIS_URL.
# app/core/config.py
from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env",)

    SQLALCHEMY_DATABASE_URI: str = ""
    REDIS_URL: str = ""

    @property
    def CELERY_BROKER_URL(self) -> str:
        return self.REDIS_URL

settings = Settings()
If you want to run Celery in your local environment, add the following to your .env.
# .env
REDIS_URL=redis://localhost:6379/0

Deploy on Sevalla

Within Sevalla, create a Redis database and connect it to your app. Make sure you add the REDIS_URL value to your environment variables.

To start the Celery worker, create a new background worker with the following start command:

celery -A app.worker worker -c 1 -l INFO

Celery is started by referencing the module that holds the Celery app (app.worker). The concurrency is set to one here to avoid overusing your resources; you can adjust the value to match the needs of your app.

Once you deploy your app, you'll see Celery startup information in your logs. To test the debug task defined above, go to the web terminal and trigger it manually. Inside the web terminal, activate the virtual environment and start a Python REPL:
. /opt/venv/bin/activate
python
Inside the shell, run the following:
from app.worker import debug_task
debug_task.delay()
Calling debug_task with delay() sends the task to the Celery broker instead of running it directly. You can view the task's output message in the runtime logs.