Refer to the examples repository for the complete example’s source code.
Overview
This example will build out:

- `@prefect.task` definitions representing the work you want to run in the background
- A `fastapi` application providing API endpoints to:
  - Receive task parameters via `POST` request and submit the task to Prefect with `.delay()`
  - Allow polling for the task's status via a `GET` request using its `task_run_id`
- A `Dockerfile` to build a multi-stage image for the web app, Prefect server and task worker(s)
- A `compose.yaml` to manage lifecycles of the web app, Prefect server and task worker(s)
You can also use `uv` to bootstrap your own new project:
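For example (the package name `foo` and the dependency list here are illustrative, not prescribed by the example):

```bash
uv init --lib foo   # creates a project with a src/foo package layout
cd foo
uv add prefect fastapi uvicorn marvin
```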
This example's code lives in a `src/foo` directory for portability and organization.
This example does not require:
- Prefect Cloud
- creating a Prefect Deployment
- creating a work pool
Useful things to remember
- You can call any Python code from your task definitions (including other flows and tasks!)
- Prefect Results allow you to save/serialize the `return` value of your task definitions to your result storage (e.g. a local directory, S3, GCS, etc.), enabling caching and idempotency.
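As a small illustration (not taken from the example itself), result persistence can also be enabled per task rather than via the environment-level setting used later in this example:

```python
from prefect import task


@task(persist_result=True)  # write the return value to the configured result storage
def add(x: int, y: int) -> int:
    return x + y
```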
Defining the background task
The core of the background processing is a Python function decorated with `@prefect.task`. This marks the function as a unit of work that Prefect can manage (e.g. observe, cache, retry, etc.).
src/foo/task.py
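The sketch below shows the general shape of this file, consistent with the details listed next; the real file in the example repository uses marvin and may differ in its exact signature and output type.

```python
# Illustrative sketch of src/foo/task.py; see the example repository for the real file.
import marvin
from prefect import task
from prefect.cache_policies import INPUTS, TASK_SOURCE
from prefect.task_worker import serve


@task(cache_policy=INPUTS + TASK_SOURCE)
def create_structured_output(data: str, instructions: str) -> dict:
    """Cast unstructured text into structured output using marvin."""
    return marvin.cast(data, target=dict, instructions=instructions)


def main():
    # Start a task worker that executes newly delay()ed runs of this task
    serve(create_structured_output)


if __name__ == "__main__":
    main()
```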
- `@task`: Decorator to define the task we want to run in the background.
- `cache_policy`: Caching based on `INPUTS` and `TASK_SOURCE`.
- `serve(create_structured_output)`: This function starts a task worker subscribed to newly `delay()`ed task runs.
Building the FastAPI application
The FastAPI application provides API endpoints to trigger the background task and check its status.

src/foo/api.py
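A simplified sketch of the shape of this file is shown below, assuming a small pydantic request model and the `get_task_result` helper covered in the next section; the endpoint paths and payloads are illustrative.

```python
# Simplified sketch of src/foo/api.py; endpoint paths and payloads are illustrative.
from uuid import UUID

from fastapi import FastAPI
from pydantic import BaseModel

from foo._internal._prefect import get_task_result
from foo.task import create_structured_output

app = FastAPI()


class TaskSubmission(BaseModel):
    data: str
    instructions: str


@app.post("/tasks")
async def submit_task(submission: TaskSubmission) -> dict:
    # .delay() submits the task run to Prefect and returns immediately with a future
    future = create_structured_output.delay(
        submission.data, instructions=submission.instructions
    )
    return {"task_run_id": str(future.task_run_id)}


@app.get("/tasks/{task_run_id}")
async def read_task(task_run_id: UUID) -> dict:
    # Poll the Prefect API for the task run's current state (and result, if finished)
    return await get_task_result(task_run_id)
```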
Checking Task Status with the Prefect Client
The `get_task_result` helper function (in `src/foo/_internal/_prefect.py`) uses the Prefect Python client to interact with the Prefect API:

src/foo/_internal/_prefect.py
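A hedged sketch of what this helper might look like is shown below; the response shape is illustrative, and result retrieval may need adjusting for your Prefect version (on some versions `state.result()` is synchronous).

```python
# Simplified sketch of src/foo/_internal/_prefect.py; the real helper may differ.
from typing import Any
from uuid import UUID

from prefect.client.orchestration import get_client


async def get_task_result(task_run_id: UUID) -> dict[str, Any]:
    """Read a task run from the Prefect API and report its state (and result, if any)."""
    async with get_client() as client:
        task_run = await client.read_task_run(task_run_id)

    state = task_run.state
    response: dict[str, Any] = {
        "task_run_id": str(task_run_id),
        "status": state.type.value if state else "UNKNOWN",
    }
    if state and state.is_completed():
        # Result retrieval may need adjusting depending on your Prefect version
        response["result"] = await state.result(raise_on_failure=False)
    elif state and state.is_failed():
        response["error"] = state.message
    return response
```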
This function fetches the `TaskRun` object from the API and checks its state to determine if it's `Completed`, `Failed`, or still `Pending`/`Running`. If completed, it attempts to retrieve the result using `task_run.state.result()`. If failed, it tries to get the error message.

Building the Docker Image
A multi-stage `Dockerfile` is used to create optimized images for each service (Prefect server, task worker, and web API). This approach helps keep image sizes small and separates build dependencies from runtime dependencies.
Dockerfile
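The real Dockerfile lives in the example repository; a trimmed-down sketch of the multi-stage layout described below might look roughly like this (the base image, module paths, and commands are assumptions).

```dockerfile
# Trimmed-down multi-stage sketch; see the example repository for the real Dockerfile.
FROM python:3.12-slim AS base
COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/
WORKDIR /app
# Install dependencies first so Docker can cache this layer
COPY pyproject.toml uv.lock ./
RUN uv sync --frozen --no-install-project
# Then copy and install the project source
COPY src/ ./src/
RUN uv sync --frozen

FROM base AS server
CMD ["uv", "run", "prefect", "server", "start"]

FROM base AS task
CMD ["uv", "run", "python", "src/foo/task.py"]

FROM base AS api
CMD ["uv", "run", "uvicorn", "foo.api:app", "--host", "0.0.0.0", "--port", "8000"]
```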
Dockerfile Key Details
- Base Stage (`base`): Sets up Python and `uv`, installs all dependencies from `pyproject.toml` into a base layer to make use of Docker caching, and copies the source code.
- Server Stage (`server`): Builds upon the `base` stage. Sets the default command (`CMD`) to start the Prefect server.
- Task Worker Stage (`task`): Builds upon the `base` stage. Sets the `CMD` to run the `src/foo/task.py` script, which is expected to contain the `serve()` call for the task(s).
- API Stage (`api`): Builds upon the `base` stage. Sets the `CMD` to start the FastAPI application using `uvicorn`.
The `compose.yaml` file then uses the `target` build argument to specify which of these final stages (`server`, `task`, `api`) to use for each service container.

Declaring the application services
We use `compose.yaml` to define and run the multi-container application, managing the lifecycles of the FastAPI web server, the Prefect API server, database and task worker(s).
compose.yaml
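The full file is in the example repository; a trimmed-down sketch consistent with the key service configurations described below might look like this (healthchecks and the `develop`/watch configuration are omitted).

```yaml
# Trimmed-down sketch; see the example repository for the full compose.yaml.
services:
  prefect-server:
    build:
      context: .
      target: server
    ports:
      - "4200:4200"
    volumes:
      - prefect-data:/root/.prefect
    environment:
      - PREFECT_SERVER_API_HOST=0.0.0.0

  task:
    build:
      context: .
      target: task
    depends_on:
      - prefect-server
    environment:
      - PREFECT_API_URL=http://prefect-server:4200/api
      - PREFECT_LOCAL_STORAGE_PATH=/task-storage
      - PREFECT_RESULTS_PERSIST_BY_DEFAULT=true
      - PREFECT_LOGGING_LOG_PRINTS=true
      - OPENAI_API_KEY=${OPENAI_API_KEY}
    volumes:
      - task-storage:/task-storage

  api:
    build:
      context: .
      target: api
    depends_on:
      - prefect-server
      - task
    ports:
      - "8000:8000"
    environment:
      - PREFECT_API_URL=http://prefect-server:4200/api
      - PREFECT_LOCAL_STORAGE_PATH=/task-storage
    volumes:
      - task-storage:/task-storage

volumes:
  prefect-data:
  task-storage:
```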
You can adapt this setup to your needs; for example, you might:

- write a `Dockerfile` for each service
- add a `postgres` service and configure it as the Prefect database
- remove the hot-reloading configuration in the `develop` section
Key Service Configurations
- `prefect-server`: Runs the Prefect API server and UI.
  - `build`: Uses a multi-stage `Dockerfile` (not shown here, but present in the example repo) targeting the `server` stage.
  - `ports`: Exposes the Prefect API/UI on port `4200`.
  - `volumes`: Uses a named volume `prefect-data` to persist the Prefect SQLite database (`/root/.prefect/prefect.db`) across container restarts.
  - `PREFECT_SERVER_API_HOST=0.0.0.0`: Makes the API server listen on all interfaces within the Docker network, allowing the `task` and `api` services to connect.
- `task`: Runs the Prefect task worker process (executing `python src/foo/task.py`, which calls `serve`).
  - `build`: Uses the `task` stage from the `Dockerfile`.
  - `depends_on`: Ensures the `prefect-server` service is started before this service attempts to connect.
  - `PREFECT_API_URL`: Crucial setting that tells the worker where to find the Prefect API to poll for submitted task runs.
  - `PREFECT_LOCAL_STORAGE_PATH=/task-storage`: Configures the worker to store task run results in the `/task-storage` directory inside the container. This path is mounted to the host using the `task-storage` named volume via `volumes: - ./task-storage:/task-storage` (or just `task-storage:` if using a named volume without a host path binding).
  - `PREFECT_RESULTS_PERSIST_BY_DEFAULT=true`: Tells Prefect tasks to automatically save their results using the configured storage (defined by `PREFECT_LOCAL_STORAGE_PATH` in this case).
  - `PREFECT_LOGGING_LOG_PRINTS=true`: Configures the Prefect logger to capture output from `print()` statements within tasks.
  - `OPENAI_API_KEY=${OPENAI_API_KEY}`: Passes secrets needed by the task code from the host environment (via a `.env` file loaded by Docker Compose) into the container's environment.
- `api`: Runs the FastAPI web application.
  - `build`: Uses the `api` stage from the `Dockerfile`.
  - `depends_on`: Waits for the `prefect-server` (required for submitting tasks and checking status) and optionally the `task` worker.
  - `PREFECT_API_URL`: Tells the FastAPI application where to send `.delay()` calls and status check requests.
  - `PREFECT_LOCAL_STORAGE_PATH`: May be needed if the API itself needs to directly read result files (though typically fetching results via `task_run.state.result()` is preferred).
- `volumes`: Defines named volumes (`prefect-data`, `task-storage`) to persist data generated by the containers.
Running this example
Assuming you have obtained the code (either by cloning the repository or using `uv init` as described previously) and are in the project directory:
- Prerequisites: Ensure Docker Desktop (or equivalent) with `docker compose` support is running.
- Build and Run Services:
This example’s task uses marvin, which (by default) requires an OpenAI API key. Provide it as an environment variable when starting the services:
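For example, assuming the key is available in your shell (or supplied via a `.env` file read by Docker Compose):

```bash
OPENAI_API_KEY=<your-openai-api-key> docker compose up --build --watch
```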
This command will:
- `--build`: Build the container images if they don't exist or if the Dockerfile/context has changed.
- `--watch`: Watch for changes in the project source code and automatically sync/rebuild services (useful for development).
- Add `--detach` or `-d` to run the containers in the background.
- Access Services:
  - If you cloned the existing example, check out the basic htmx UI at http://localhost:8000
  - FastAPI docs: http://localhost:8000/docs
  - Prefect UI (for observing task runs): http://localhost:4200
Cleaning up
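To stop and remove the containers when you are done (and, optionally, the named volumes holding the Prefect database and persisted results):

```bash
docker compose down           # stop and remove the containers
docker compose down --volumes # also remove the prefect-data and task-storage volumes
```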
Next Steps
This example provides a repeatable pattern for integrating Prefect-managed background tasks with any Python web application. You can:

- Explore the background tasks examples repository for more examples.
- Adapt `src/**/*.py` to define and submit your specific web app and background tasks.
- Configure Prefect settings (environment variables in `compose.yaml`) further, for example using different result storage or logging levels.
- Deploy these services to cloud infrastructure using managed container services.