Implement an ETL pipeline that extracts data from a free fake API, transforms it, and automatically loads it into BigQuery using Airflow. Apply Agile methodologies, testing, and version-control tools.
Install all requirements in a Python environment using pip.
Create a folder named dags (Airflow's default DAG folder) inside the Airflow home directory, and place etl_dag.py and its dependencies there.
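The task callables inside etl_dag.py could look like the minimal sketch below. Everything here is an assumption for illustration, not taken from the original: the fake API is taken to be JSONPlaceholder's /users endpoint, the field names and the NDJSON staging path are invented, and the actual BigQuery load is only described in a comment.

```python
"""Illustrative ETL callables for etl_dag.py (all names are assumptions).

Assumed fake API: JSONPlaceholder (https://jsonplaceholder.typicode.com).
Rows are staged as newline-delimited JSON; a real DAG would then load
that file into BigQuery with the google-cloud-bigquery client.
"""
import json
from urllib.request import urlopen

API_URL = "https://jsonplaceholder.typicode.com/users"  # assumed endpoint


def extract(url: str = API_URL) -> list[dict]:
    """Fetch raw records from the fake API as a list of dicts."""
    with urlopen(url) as resp:
        return json.load(resp)


def transform(records: list[dict]) -> list[dict]:
    """Keep only the fields to be loaded, flattening the nested company name
    and normalizing emails to lowercase."""
    return [
        {
            "id": r["id"],
            "name": r["name"],
            "email": r["email"].lower(),
            "company": r.get("company", {}).get("name"),
        }
        for r in records
    ]


def load(rows: list[dict], path: str = "/tmp/users.ndjson") -> int:
    """Stage rows as NDJSON and return the row count. A real DAG would
    follow this with google-cloud-bigquery's load_table_from_file (or
    load_table_from_json) against the target table."""
    with open(path, "w") as f:
        for row in rows:
            f.write(json.dumps(row) + "\n")
    return len(rows)
```

In etl_dag.py these callables would typically be wired into a DAG as three PythonOperator tasks (or @task-decorated TaskFlow functions) chained extract >> transform >> load; keeping them as plain functions also makes them easy to unit-test, which fits the testing goal stated above.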
Initialize the Airflow metadata database, then start the webserver and, in a separate terminal, the scheduler (DAGs only run while the scheduler is up):
airflow db init
airflow webserver --port 8080
airflow scheduler
Open a browser and go to http://localhost:8080 to access the Airflow UI.
Cristian MB [email protected]