Unlocking The Power Of Dynamic Workflows With Metadata In Databricks

In this video, I explore the new workflow controls that let you build dynamic pipelines for orchestrating Databricks workloads. I explain how the 'For each' and 'If/else condition' tasks work and demonstrate how to build end-to-end metadata-based dynamic workflows.

Chapters:
00:00 - Overview of dynamic job controls
03:09 - Basic functionality of the 'For each' iterative task: building the iteration collection
09:31 - Using job parameters as input for the 'For each' task
10:49 - Using another task's output as input for the 'For each' task
16:40 - Basic functionality of the 'If/else condition' task
20:26 - How to create a metadata-based dynamic workflow using iterative and conditional tasks

You can download the notebook demonstrated in the video from this link: https://github.com/fazizov/youtube/blob/main/Data%20engineering%20with%20Databricks/Collect_metadata.dbc

Read more about dynamic workflow capabilities here: https://www.databricks.com/blog/streamlining-repetitive-tasks-databricks-workflows
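
As a rough illustration of the pattern covered around the 10:49 chapter (feeding another task's output into a 'For each' task), here is a minimal sketch. It assumes a collector notebook with an illustrative task key 'Collect_metadata' that publishes a list of table descriptors as a task value, and a per-iteration notebook that receives one descriptor through a job parameter; the task key, parameter name, and table list are my own placeholders, not taken from the video. In the job definition, the 'For each' task's Inputs field would typically reference the collector's output with a dynamic value reference such as {{tasks.Collect_metadata.values.tables}}, and the nested task would pass the current element to the parameter with {{input}}. Both snippets run inside Databricks notebooks, where dbutils is predefined.

```python
# Collector notebook (illustrative task key: "Collect_metadata").
# Publishes the iteration collection as a task value so a downstream
# 'For each' task can reference it, e.g. {{tasks.Collect_metadata.values.tables}}.
import json

# Hypothetical metadata; in practice this could come from a config table
# or an information_schema query.
tables = [
    {"schema": "bronze", "table": "customers"},
    {"schema": "bronze", "table": "orders"},
]

# Task values must be JSON-serializable; a JSON string keeps the shape explicit.
dbutils.jobs.taskValues.set(key="tables", value=json.dumps(tables))
```

```python
# Per-iteration notebook run by the task nested inside 'For each'.
# Assumes the nested task defines a base parameter named "table_spec"
# whose value is set to {{input}}, so each run receives one list element.
import json

table_spec = json.loads(dbutils.widgets.get("table_spec"))
print(f"Processing {table_spec['schema']}.{table_spec['table']}")
```

The same collector output can also drive an 'If/else condition' task, for example by comparing a published task value against a threshold in the condition's left/right fields, so the downstream branch only runs when the metadata says there is something to process.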