How Northern Trust Builds Dynamic Data Pipelines
While there are many ETL/ELT tools for building data pipelines, relying solely on Snowflake features eliminates dependence on external tools and keeps pipeline metadata inside Snowflake, where it can drive dynamic data pipelines. The trade-off is that you must assemble a custom framework from Snowflake's individual features. In this video, Pramod Gupta of Northern Trust discusses how to combine the following Snowflake features into a dynamic data pipeline: Streams, Tasks, SQL (session) variables, the Information Schema, and JavaScript stored procedures. After building a simple data pipeline, you'll learn about features you can add to it, such as logging, error handling, and email notifications.
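To illustrate the pattern, here is a minimal sketch of how Streams, Tasks, and a JavaScript stored procedure can fit together. All object names (`raw_orders`, `curated_orders`, `my_wh`, etc.) are hypothetical placeholders, not Northern Trust's actual framework:

```sql
-- Hypothetical example: a stream captures changes on a source table,
-- and a scheduled task calls a JavaScript stored procedure that merges
-- those changes into a target table.

-- Stream tracks inserts/updates/deletes on the source table.
CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders;

-- JavaScript stored procedure that consumes the stream.
CREATE OR REPLACE PROCEDURE merge_orders()
RETURNS STRING
LANGUAGE JAVASCRIPT
AS
$$
  // Reading the stream in a DML statement advances its offset.
  var stmt = snowflake.createStatement({
    sqlText: `MERGE INTO curated_orders t
              USING orders_stream s ON t.order_id = s.order_id
              WHEN MATCHED THEN UPDATE SET t.amount = s.amount
              WHEN NOT MATCHED THEN INSERT (order_id, amount)
                VALUES (s.order_id, s.amount)`
  });
  stmt.execute();
  return 'Merged ' + stmt.getNumRowsAffected() + ' rows';
$$;

-- Task polls every 5 minutes, but only runs when the stream has new rows.
CREATE OR REPLACE TASK merge_orders_task
  WAREHOUSE = my_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
  CALL merge_orders();

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK merge_orders_task RESUME;
```

A production framework like the one in the video would layer session variables, Information Schema lookups, logging, error handling, and notifications on top of this core loop.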
Watch the webinar "Batch and Streaming Data Ingestion Best Practices with Snowflake" to learn how to leverage COPY, Snowpipe, and Snowpipe Streaming to build batch and streaming data pipelines at scale. Register now: https://tinyurl.com/2cezbtp9