Creating the DAG
After setting the default arguments, it's time to define your DAG and create the first task that checks the API. This step is essential for automating your data and machine learning workflows. The DAG and PythonOperator classes and the datetime module have already been imported, and a custom check_updates_api function is available. Time to build your DAG!
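For reference, the imports mentioned above might look like the following sketch; the airflow.operators.python path assumes an Airflow 2.x environment, which the exercise does not state explicitly.

from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator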
This exercise is part of the course
Designing Forecasting Pipelines for Production
Exercise instructions
- Define the DAG using the right function.
- Set the schedule to run daily.
- Create the check_api task using a Python operator.
- Provide the check_updates_api function as the callable.
Interactive exercise
Complete the sample code to finish this exercise successfully.
# Define the DAG
with ____(
    'data_pipeline',
    default_args=default_args,
    description='Data pipeline for ETL process',
    # Set the schedule to run daily
    schedule='@____',
    tags=["python", "etl", "forecast"]
) as dag:
    # Create check_api
    check_api = ____(
        task_id='check_api',
        # Use the check_updates_api function
        python_callable=____)
print(f"DAG object created: {dag}")
print(f"PythonOperator for API check created: {check_api}")