Unleash the full potential of your data pipelines with dbt and Snowflake!
In this video, we'll walk step by step through building end-to-end ELT pipelines with Apache Airflow, dbt, and Snowflake.
Learn how to extract, load, and transform your data in a scalable and efficient way.
Discover the benefits of using dbt for data transformation and Snowflake for data warehousing, and how Airflow can orchestrate the entire process. Whether you're a data engineer, data scientist, or business analyst, this video is for you!
Get ready to take your data pipelines to the next level!
Video Covers:
- How to install Apache Airflow locally
- How to run Apache Airflow locally
- How to set up and connect dbt to Snowflake (see the profiles.yml sketch after this list)
- How to configure an Airflow DAG to run dbt models (a DAG sketch follows this list)
- How to transform Snowflake data using dbt
- How to use an Airflow DAG to run your dbt models against Snowflake
- How to troubleshoot common Airflow pipeline errors and issues
- How to run and monitor Airflow DAGs for your models (see the local-testing tip below)
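The dbt-to-Snowflake connection boils down to a Snowflake target in dbt's profiles.yml. Here's a minimal sketch of the shape of that file; every name and value below is a placeholder, not the exact profile from the video:

dbt_tutorial:                 # must match the "profile" name in dbt_project.yml
  target: dev
  outputs:
    dev:
      type: snowflake
      account: <your_account_identifier>
      user: <your_user>
      password: <your_password>
      role: <your_role>
      warehouse: <your_warehouse>
      database: <your_database>
      schema: <your_schema>
      threads: 4

Run dbt debug from the project directory to confirm the connection works before wiring it into Airflow.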
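The orchestration itself is a small Airflow DAG. Below is a minimal Python sketch of the idea, not the exact code from the repo linked further down; the dag_id, paths, and schedule are placeholder assumptions you'd adjust to your setup:

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Placeholder paths -- point these at your dbt project and profiles.yml
DBT_PROJECT_DIR = "/opt/airflow/dbt_project"
DBT_PROFILES_DIR = "/opt/airflow/.dbt"

with DAG(
    dag_id="dbt_snowflake_pipeline",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Build the dbt models in Snowflake
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"dbt run --project-dir {DBT_PROJECT_DIR} --profiles-dir {DBT_PROFILES_DIR}",
    )

    # Test the freshly built models
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"dbt test --project-dir {DBT_PROJECT_DIR} --profiles-dir {DBT_PROFILES_DIR}",
    )

    dbt_run >> dbt_test

The BashOperator approach assumes dbt is installed in the same environment as the Airflow workers.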
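For quick troubleshooting, Airflow 2.5+ can also execute a DAG in-process, without the scheduler, which makes stack traces much easier to read. A small addition to the DAG file above:

# Append to the DAG file, then run it directly with: python <dag_file>.py
if __name__ == "__main__":
    dag.test()  # runs all tasks in-process for local debugging

For scheduled runs, trigger and monitor the DAG from the Airflow web UI as shown in the video.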
Useful Videos referenced:
- Install and run Airflow locally with Docker -
https://youtu.be/iN6wEj9AlhA?si=u_UO7_zxY1ZJd9UG
- Set up a dbt Core connection to Snowflake -
https://youtu.be/ZbLzOgAMAwk?si=fXdvIwSi6hNqjzN4
Airflow Source Code:
https://github.com/kalekem/dbt_tutorial/blob/main/airflow/dbt_snowflake_airflow.py
#etl #snowflake #dbt #airflow