- Ship production-ready orchestration code with zero boilerplate.
- See live, structured logs without writing any logging boilerplate.
- Understand how the very same Python stays portable from a laptop to Kubernetes (or Prefect Cloud).
Importing Prefect and setting up
We start by importing the essential `flow` decorator from Prefect.
Defining a flow
Prefect takes your Python functions and transforms them into flows with enhanced capabilities. Let's write a simple function that takes a `name` parameter and prints a greeting. To make this function work with Prefect, we just wrap it in the `@flow` decorator.
Running our flow locally and with parameters
Now let's see different ways we can call that flow:
- As a regular call with default parameters
- With custom parameters
- Multiple times to greet different people
What just happened?
When we decorated our function with `@flow`, the function was transformed into a Prefect flow. Each time we called it:
- Prefect registered the execution as a flow run
- It tracked all inputs, outputs, and logs
- It maintained detailed state information about the execution
- It added tags to the flow run to make it easier to find when observing flow runs in the UI
But why does this matter?
This simple example demonstrates Prefect's core value proposition: taking regular Python code and enhancing it with production-grade orchestration capabilities. Let's explore why this matters for real-world data workflows.
You can change the code and run it again
For instance, change the greeting message in the `hello` function to a different message and run the flow again.
You’ll see your changes immediately reflected in the logs.
You can process more data
Add more names to the `crew` list or create a larger data set to process. Prefect will handle each execution and track every input and output.
You can run a more complex flow
The `hello` function is a simple example, but in its place imagine something that matters to you, like:
- ETL processes that extract, transform, and load data
- Machine learning training and inference pipelines
- API integrations and data synchronization jobs
Key Takeaways
Remember that Prefect makes it easy to:
- Transform regular Python functions into production-ready workflows with just a decorator
- Get automatic logging, retries, and observability without extra code
- Run the same code anywhere - from your laptop to production
- Build complex data pipelines while maintaining simplicity
- Track every execution with detailed logs and state information
The `@flow` decorator is your gateway to enterprise-grade orchestration - no complex configuration needed!
For more information about the orchestration concepts demonstrated in this example, see the Prefect documentation.