
Creating the DAG

After setting the default arguments, it's time to define your DAG and create the first task that checks the API. This step is essential for automating your data and machine learning workflows. The following have already been imported for you: DAG, PythonOperator, and datetime. You also have a custom check_updates_api function available. Time to build your DAG!
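For reference, the imports the exercise assumes would typically look like the following sketch (exact import paths can vary slightly by Airflow version):

from datetime import datetime

# Core Airflow class for defining a workflow
from airflow import DAG
# Operator that runs an arbitrary Python callable as a task
from airflow.operators.python import PythonOperator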

This exercise is part of the course

Designing Forecasting Pipelines for Production


Exercise instructions

  • Define the DAG using the right function.
  • Set the schedule to run daily.
  • Create the check_api task using a Python operator.
  • Provide the check_updates_api function as the callable.

Hands-on interactive exercise

Try this exercise by completing the sample code below.

# Define the DAG
with ____(
    'data_pipeline',
    default_args=default_args,
    description='Data pipeline for ETL process',
    # Set the schedule to run daily
    schedule='@____',
    tags=["python", "etl", "forecast"]
) as dag:
    # Create check_api
    check_api = ____(
        task_id='check_api',
        # Use the check_updates_api function
        python_callable=____)

print(f"DAG object created: {dag}")
print(f"PythonOperator for API check created: {check_api}") 