Create a deployment for a flow by calling the serve method.
The simplest way to create a deployment for your flow is by calling its serve method.
The serve method creates a deployment for the flow and starts a long-running process that monitors for work from the Prefect server. When work is found, it is executed within its own isolated subprocess.
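As a minimal sketch (the flow body and the deployment name are illustrative), serving a flow can look like this:

```python
from prefect import flow


@flow
def my_flow(name: str = "world"):
    print(f"Hello, {name}!")


if __name__ == "__main__":
    # Registers a deployment named "my-deployment" for this flow and starts
    # a long-running process that listens for scheduled or triggered runs.
    my_flow.serve(name="my-deployment")
```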
This interface provides the configuration for a deployment (with no strong infrastructure requirements), such as schedules, event triggers, and metadata like tags and a description.
Schedules are auto-paused on shutdown
By default, stopping the process running flow.serve will pause the schedule for the deployment (if it has one).
When running in environments where restarts are expected, use the pause_on_shutdown=False flag to prevent this behavior:
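For example, a sketch of the same serve call with the flag set (the deployment name is a placeholder):

```python
from prefect import flow


@flow
def my_flow(name: str = "world"):
    print(f"Hello, {name}!")


if __name__ == "__main__":
    # Keep the deployment's schedule active even when this serving process
    # stops, e.g. during an expected restart of the host.
    my_flow.serve(name="my-deployment", pause_on_shutdown=False)
```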
The serve method on flows exposes many options for the deployment. Here's how to use some of those options:
- cron: a keyword that allows you to set a cron string schedule for the deployment; see schedules for more advanced scheduling options
- tags: a keyword that allows you to tag this deployment and its runs for bookkeeping and filtering purposes
- description: a keyword that allows you to document what this deployment does; by default the description is set from the docstring of the flow function (if documented)
- version: a keyword that allows you to track changes to your deployment; uses a hash of the file containing the flow by default; popular options include semver tags or git commit hashes
- triggers: a keyword that allows you to define a set of conditions for when the deployment should run; see triggers for more on Prefect Events concepts

Next, add these options to your deployment:
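A sketch of what passing these options might look like (the cron string, tags, description, and version values here are illustrative placeholders):

```python
from prefect import flow


@flow
def my_flow(name: str = "world"):
    print(f"Hello, {name}!")


if __name__ == "__main__":
    # Combine several deployment options in a single serve call.
    my_flow.serve(
        name="my-deployment",
        cron="* * * * *",                      # run every minute
        tags=["testing", "tutorial"],          # for filtering in the UI
        description="Given a name, say hello.",
        version="tutorial/deployments",
    )
```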
Triggers with .serve
See this example that triggers downstream work on upstream events.
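As a rough sketch of that pattern (the upstream flow name, event, and deployment name below are assumptions for illustration), a deployment can be served with an event trigger:

```python
from prefect import flow
from prefect.events import DeploymentEventTrigger


@flow(log_prints=True)
def downstream_flow():
    print("running downstream work")


if __name__ == "__main__":
    # Run this deployment whenever a flow run named "upstream-flow" completes.
    downstream_flow.serve(
        name="downstream-deployment",
        triggers=[
            DeploymentEventTrigger(
                expect={"prefect.flow-run.Completed"},
                match_related={"prefect.resource.name": "upstream-flow"},
            )
        ],
    )
```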
When you rerun the script serving your deployment, you will find an updated deployment in the UI that is actively scheduling work. Stop the script in the CLI using CTRL+C and your schedule automatically pauses.
serve() is a long-running process
To execute remotely triggered or scheduled runs, your script with flow.serve must be actively running.
Serve multiple flows with the same process using the serve utility along with the to_deployment method of flows:
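A sketch of serving two flows from one process (the flow bodies, deployment names, and interval are illustrative; the "sleeper" name matches the exploration steps below):

```python
import time

from prefect import flow, serve


@flow
def slow_flow(sleep: int = 60):
    """Sleepy flow - sleeps the provided amount of time (in seconds)."""
    time.sleep(sleep)


@flow
def fast_flow():
    """Fastest flow this side of the Mississippi."""
    return


if __name__ == "__main__":
    # to_deployment builds deployment objects without starting anything;
    # serve(...) then runs both deployments in one long-lived process.
    slow_deploy = slow_flow.to_deployment(name="sleeper", interval=45)
    fast_deploy = fast_flow.to_deployment(name="fast")
    serve(slow_deploy, fast_deploy)
```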
The behavior and interfaces are identical to the single flow case. A few things to note:
- the flow.to_deployment interface exposes the exact same options as flow.serve; this method produces a deployment object
- the deployments are only registered with the API once serve(...) is called

A few optional steps for exploration include:
- pause and unpause the schedule for the "sleeper" deployment
- use the UI to submit ad-hoc runs for the "sleeper" deployment with different values for sleep
- cancel an active run for the "sleeper" deployment from the UI

Hybrid execution option
Prefect’s deployment interface allows you to choose a hybrid execution model. Whether you use Prefect Cloud or self-host Prefect server, you can run workflows in the environments best suited to their execution. This model enables efficient use of your infrastructure resources while maintaining the privacy of your code and data. There is no ingress required. Read more about our hybrid model.
Just like the .deploy method, the flow.from_source method is used to define how to retrieve the flow that you want to serve.
from_source
The flow.from_source method on Flow objects requires a source and an entrypoint.
source
The source of your deployment can be:
- a local directory, such as path/to/a/local/directory
- a Git repository URL, such as https://github.com/org/repo.git
- a GitRepository object that accepts GitCredentials for private repositories

entrypoint
A flow entrypoint is the path to the file where the flow is located within that source, in the form path/to/file.py:flow_function_name.
For example, the following code will load the hello flow from the flows/hello_world.py file in the PrefectHQ/prefect repository:
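A sketch of that from_source call (assuming, as the text states, that the hello flow is defined in flows/hello_world.py of that repository):

```python
from prefect import flow

# Load the flow definition from the remote Git repository.
my_flow = flow.from_source(
    source="https://github.com/PrefectHQ/prefect.git",
    entrypoint="flows/hello_world.py:hello",
)

if __name__ == "__main__":
    my_flow()
```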
For more ways to store and access flow code, see the Retrieve code from storage page.
You can serve loaded flows
You can serve a flow loaded from remote storage with the same serve method as a local flow:
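For example, a sketch that chains from_source and serve (the repository URL, entrypoint, and deployment name are placeholders):

```python
from prefect import flow

if __name__ == "__main__":
    # Pull the flow from remote storage, then serve it like a local flow.
    flow.from_source(
        source="https://github.com/org/repo.git",
        entrypoint="flows/my_flow.py:my_flow",
    ).serve(name="my-remote-deployment")
```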
When you serve a flow loaded from remote storage, the serving process periodically polls your remote storage for updates to the flow’s code. This pattern allows you to update your flow code without restarting the serving process. Note that if you change metadata associated with your flow’s deployment such as parameters, you will need to restart the serve process.