
...

The Snowpipe Streaming Snap uses the role defined in the account's Account properties. If no role is defined, the Snap runs SELECT CURRENT_ROLE() to determine a suitable role; if that also fails, it falls back to the PUBLIC role.
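The fallback order above can be sketched as follows. This is an illustrative model, not the Snap's internals; `run_query` stands in for a Snowflake session, and all names are assumptions.

```python
def resolve_role(configured_role, run_query):
    """Return the role to use, following the documented fallback order."""
    if configured_role:                               # 1. role from the account properties
        return configured_role
    current = run_query("SELECT CURRENT_ROLE()")      # 2. ask the session for its role
    if current:
        return current
    return "PUBLIC"                                   # 3. last-resort fallback

# Example: no configured role, session reports SYSADMIN
print(resolve_role(None, lambda q: "SYSADMIN"))       # SYSADMIN
print(resolve_role(None, lambda q: None))             # PUBLIC
```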

...

Error: Schema name not found

Reason: The schema name is required for Snowpipe Streaming.

Resolution: Provide a schema name in the Snap configuration.
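The check behind this error can be modeled as a simple settings validation. The function and message text below are illustrative, not the Snap's actual implementation.

```python
def validate_settings(settings):
    """Return a list of configuration errors; empty means the settings pass."""
    errors = []
    if not settings.get("schema_name"):  # schema name is mandatory for Snowpipe Streaming
        errors.append("Schema name not found")
    return errors

# A configuration missing the schema name fails validation
print(validate_settings({"table_name": "T1"}))   # ['Schema name not found']
print(validate_settings({"schema_name": "PUBLIC", "table_name": "T1"}))  # []
```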

Examples

Ingest Data from One Table to Another in Snowflake

This example pipeline demonstrates real-time data ingestion from one table to another within Snowflake. As new data is added to the source table, the Snowpipe Streaming Snap detects these changes and then streams the new data directly into the target table without significant delays. Here's the execution flow of the pipeline:

  • Source Table: The table where new data entries are continuously added.

  • Snowpipe Streaming Snap: This Snap monitors the source table for new data and streams the data in real time.

  • Target Table: The destination table where the streamed data is inserted.
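The flow above can be sketched with an in-memory model: rows appearing in a source table are detected past a watermark and copied to a target table. The polling approach and field names are illustrative; the Snap itself streams changes through Snowflake rather than through a loop like this.

```python
def stream_new_rows(source, target, last_seen):
    """Copy rows with an id greater than last_seen from source to target.

    Returns the new watermark (highest id copied so far).
    """
    new_rows = [row for row in source if row["id"] > last_seen]
    target.extend(new_rows)
    return max((row["id"] for row in new_rows), default=last_seen)

source_table = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]
target_table = []
watermark = stream_new_rows(source_table, target_table, last_seen=0)

source_table.append({"id": 3, "name": "c"})       # new data arrives in the source
watermark = stream_new_rows(source_table, target_table, watermark)
print(len(target_table))                          # 3
```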

...

Download this pipeline.

Step 1: Configure the Snowflake - Execute Snap to delete the existing table.

...

Step 2: Configure the Snowflake - Select Snap with the Schema name and Table name fields. Add a second output view to the Snowflake - Select Snap to pass the schema to the downstream Snowflake - Snowpipe Streaming Snap and enable table creation.

[Image: snowflake-select-config.png]

[Image: snowflake-select-second-output.png]

Step 3: Configure the Snowflake - Snowpipe Streaming Snap to ingest the data into the Snowflake database as it becomes available. On validation, the data is successfully inserted into the Snowflake database.

[Image: snowpipe-streaming-config.png]

[Image: snowpipe-streaming-output.png]

Ingest Data from a Source into Snowflake

This example pipeline demonstrates how to use the Snowflake - Snowpipe Streaming Snap to ingest data into the Snowflake database from a source as soon as it is available. Here's the execution flow of the pipeline:

  1. Data generation: The Sequence Snap generates a set number of data records.

  2. Data formatting: The JSON Formatter Snap converts these records into JSON format.

  3. Data Streaming: The Snowflake Snowpipe Streaming Snap ingests the JSON-formatted data into a specified Snowflake table as soon as the data is available.
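The first two steps above can be mimicked in a few lines: generating numbered documents (the Sequence Snap's role) and serializing them to JSON (the JSON Formatter's role). The `seq` field name is an assumption, not the Snaps' actual output schema.

```python
import json

def generate_records(count):
    """Mimic the Sequence Snap: emit `count` numbered documents."""
    return [{"seq": i} for i in range(1, count + 1)]

def format_as_json(records):
    """Mimic the JSON Formatter Snap: serialize each document to a JSON string."""
    return [json.dumps(r) for r in records]

docs = generate_records(10)
payload = format_as_json(docs)
print(len(payload))   # 10
print(payload[0])     # {"seq": 1}
```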

...

Download this pipeline.

Step 1: Configure the Sequence Snap to generate ten documents.

Step 2: Configure the Mapper Snap to pass the values to the downstream Snowflake - Snowpipe Streaming Snap.

...

Step 3: Configure the Snowflake - Execute Snap to create or replace an existing table with the column datatype.
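A statement of the kind the Snowflake - Execute Snap might run in this step is shown below. The database, table, and column names are hypothetical; only the CREATE OR REPLACE pattern is taken from the step description.

```python
# Hypothetical DDL for the create-or-replace step; names are assumptions.
ddl = """
CREATE OR REPLACE TABLE DEMO_DB.PUBLIC.STREAMING_TARGET (
    SEQ NUMBER,
    CREATED_AT TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
)
""".strip()

print(ddl.splitlines()[0])   # CREATE OR REPLACE TABLE DEMO_DB.PUBLIC.STREAMING_TARGET (
```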

...

Step 4: Configure the Mapper Snap to pass the values to the downstream Snowflake - Snowpipe Streaming Snap.

...

Step 5: Configure the Snowflake - Snowpipe Streaming Snap to ingest the data into the Snowflake database as it becomes available.

...

On validation, the data is successfully inserted into the Snowflake database.

...


Snap Pack History


...

Related Content

...