
Defining a BashOperator task

The BashOperator lets you run any shell command or script as a task in an Airflow workflow. This can be a great first step toward implementing Airflow in your environment.
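For example, a minimal sketch of wrapping a shell command in a BashOperator task might look like the following (the DAG and task names, example_dag and echo_hello_task, are purely illustrative and not part of this exercise):

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# A hypothetical DAG used only to illustrate the pattern
with DAG(dag_id="example_dag", start_date=datetime(2024, 1, 1)) as example_dag:
    # Each BashOperator wraps one shell command as an Airflow task
    echo_hello = BashOperator(
        task_id="echo_hello_task",
        bash_command="echo 'Hello from Airflow'",
    )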

Until now, you've been running some scripts manually to clean data (using a script called cleanup.sh) before delivering it to your colleagues in the Data Analytics group. As more of these tasks are assigned to you, you've realized it's becoming difficult to keep up with running everything manually, let alone handling errors or retries. You'd like to implement a simple script as an Airflow operator.

The Airflow DAG analytics_dag is already defined for you and has the appropriate configurations in place.

This exercise is part of the course Introduction to Apache Airflow in Python.

Exercise instructions

  • Import the BashOperator object.
  • Define a BashOperator called cleanup with the task_id of cleanup_task.
  • Use the command cleanup.sh.

Hands-on interactive exercise

Have a go at this exercise by completing this sample code.

# Import the BashOperator
from airflow.operators.bash import ____

with DAG(dag_id="test_dag", default_args={"start_date": "2024-01-01"}) as analytics_dag:
  # Define the BashOperator 
  cleanup = ____(
      task_id=____,
      # Define the bash_command
      bash_command=____,
  )
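For reference, one possible completion of the scaffold, sketched with the blanks filled in as the instructions describe (the DAG setup is copied from the sample code, which assumes DAG has already been imported in the exercise environment), is:

# Import the BashOperator
from airflow.operators.bash import BashOperator

with DAG(dag_id="test_dag", default_args={"start_date": "2024-01-01"}) as analytics_dag:
    # Define the BashOperator
    cleanup = BashOperator(
        task_id="cleanup_task",
        # Define the bash_command
        bash_command="cleanup.sh",
    )

Outside this exercise environment, note that Airflow treats a bash_command ending in .sh as a Jinja template file, so a trailing space (bash_command="cleanup.sh ") is commonly used to execute the script directly.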