Big Data Test Infrastructure (BDTI)

Use case code (DAG)

A DAG (Directed Acyclic Graph) is the core concept of Airflow: it collects Tasks together and organises them with dependencies and relationships that specify how they should run.

1) Import relevant libraries

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.operators.bash import BashOperator
from datetime import datetime, timedelta
 

2) Define DAG default_args

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 0,
    'retry_delay': timedelta(minutes=1)
}

3) Define the DAG

dag = DAG(
    dag_id='meteo_data_fetching',
    default_args=default_args,
    schedule_interval="@daily",
    start_date=datetime(2022, 1, 1),
    catchup=False
)

4) Define the task to be performed through the BashOperator when the DAG is triggered: executing the data fetching script

t1 = BashOperator(
    task_id='testairflow',
    bash_command='python /opt/airflow/dags/UAT_Scenario_5.py',
    dag=dag
)