Build and monitor bulletproof data pipelines
Turn your failing scheduled jobs into resilient, recurring workflows without torturing your code.
Build data pipelines faster
Stop writing boilerplate and ship your workflows to production at speed.
- Pythonic syntax
- Flexible infrastructure model
- Customizable IFTTT-style automations
- Native events and webhooks
Recover from failure, quickly
Build self-healing workflows with automations and retries, or return to normal with insights into failure and granular inspection of individual flows.
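In Prefect, retries are configured declaratively via the documented `@task(retries=..., retry_delay_seconds=...)` options rather than written by hand. As a conceptual sketch (plain Python, not Prefect's actual implementation), this is roughly what those options automate:

```python
import time

def run_with_retries(fn, retries=3, retry_delay_seconds=0):
    """Conceptual sketch of what @task(retries=..., retry_delay_seconds=...)
    automates: re-run a failing task before marking the run as failed."""
    attempt = 0
    while True:
        try:
            return fn()
        except Exception:
            attempt += 1
            if attempt > retries:
                raise  # out of retries: surface the failure
            time.sleep(retry_delay_seconds)

# A flaky task that fails twice, then succeeds on the third attempt.
calls = {"n": 0}

def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient API error")
    return "ok"

result = run_with_retries(flaky_extract, retries=3)
```

With Prefect, the equivalent is simply decorating the function with `@task(retries=3)` and letting the orchestrator handle the retry loop, delays, and state tracking.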
From dev to prod without a hitch
Prefect's architecture means you can promote to production without worry: your code runs as expected on the infrastructure you choose, without changes.
A versatile, dynamic framework
From batch ETL scheduling to complex operational workflows: if Python can write it, Prefect can orchestrate it.
- Events & webhooks
- Automations
- DAG discovery at runtime
```python
from prefect import flow, task

@task
def add_one(x: int):
    return x + 1

@flow
def main():
    for x in [1, 2, 3]:
        first = add_one(x)
        second = add_one(first)
```
Consolidate your scheduling
Stop stitching together fragile workflows that break across multiple tools.
Unify your scheduling and get a complete picture of your system health and granular insights into what broke, when.
![](https://cdn.statically.io/img/www.prefect.io/_next/image?url=https%3A%2F%2Fcdn.sanity.io%2Fimages%2F3ugk85nk%2Fproduction%2F8b04579235f152e767a70af48fc4ca98993d4773-1292x989.png%3Ffit%3Dmax%26auto%3Dformat&w=3840&q=75)
Don't change how you write code
Before: a plain Python ETL script.

```python
import yfinance as yf


def extract(ticker):
    data = yf.download(tickers=ticker, period='10d', interval='1h')
    return data['Close']  # Extract only the 'Close' data

def transform(data):
    sma = data.rolling(48).mean()
    return sma

def load(sma):
    print(sma)

def etl(ticker='SNOW'):
    extracted_data = extract(ticker)
    transformed_data = transform(extracted_data)
    load(transformed_data)


if __name__ == "__main__":
    etl()
```
After: the same script with Prefect decorators and a scheduled deployment.

```python
import yfinance as yf
from prefect import flow, task


@task
def extract(ticker):
    data = yf.download(tickers=ticker, period='10d', interval='1h')
    return data['Close']  # Extract only the 'Close' data

@task
def transform(data):
    sma = data.rolling(48).mean()
    return sma

@task
def load(sma):
    print(sma)

@flow
def etl(ticker='SNOW'):
    extracted_data = extract(ticker)
    transformed_data = transform(extracted_data)
    load(transformed_data)


if __name__ == "__main__":
    etl.serve(name="moving_average_deployment", cron="* * * * *")
```