
...

The Snowpipe Streaming Snap uses the role defined in the account's URL properties field. If a role is not defined, the Snap runs SELECT CURRENT_ROLE() to determine a suitable role; if that also fails, it falls back to the PUBLIC role.
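The role-resolution order described above can be sketched as follows. The function and argument names are illustrative only, not part of the Snap's actual implementation:

```python
def resolve_role(configured_role, current_role_result):
    """Pick a Snowflake role in the order the Snap uses:
    1. the role configured in the account's URL properties field,
    2. the value returned by SELECT CURRENT_ROLE(),
    3. the PUBLIC role as a last resort.
    """
    if configured_role:
        return configured_role
    if current_role_result:  # result of SELECT CURRENT_ROLE()
        return current_role_result
    return "PUBLIC"
```

For example, with no configured role and CURRENT_ROLE() returning SYSADMIN, the Snap would use SYSADMIN; with neither available, it would use PUBLIC.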

...

  • A valid Snowflake Key Pair or OAuth 2.0 account.

  • A valid account with the required permissions.

Support for Ultra

...

Tasks

Limitations and Known Issues

...

Field Name

Field Type

Description

Label*

Default Value: Snowflake – Snowpipe Streaming
Example: Stream Data

String

Specify the name for the Snap. You can modify this to be more specific, especially if you have more than one of the same Snap in your pipeline.

Schema name

Default Value: N/A
Example: "PUBLIC"

String/Expression/Suggestion

Specify the database schema name.

Table Name*

Default Value: N/A
Example: "PUBLIC"."SNOWPIPESTREAMING"

String/Expression/Suggestion

Specify the name of the table on which to execute the insert operation.

Create table if not present

Default Value: Deselected

Checkbox

Select this checkbox if you want the table to be automatically created if it does not already exist.

Max client lag*

Default Value: 1000
Example: 1500
Maximum Value: 60,000

Integer/Expression/Suggestion

Specify the client data flush interval in milliseconds. Adjust this value based on the maximum latency your target system can handle (up to 60,000 ms).

This field also accepts inputs in n-second and n-minute formats; in these formats, the maximum value is 10 minutes.

Snap execution

Default Value
Example: Validate & Execute

Dropdown list

Select one of the following three modes in which the Snap executes:

  • Validate & Execute: Performs limited execution of the Snap and generates a data preview during pipeline validation. Subsequently, performs full execution of the Snap (unlimited records) during pipeline runtime.

  • Execute only: Performs full execution of the Snap during pipeline execution without generating preview data.

  • Disabled: Disables the Snap and all Snaps that are downstream from it.
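A sketch of how the Max client lag inputs described above could be normalized to milliseconds. The function name and exact validation behavior are illustrative assumptions, not the Snap's actual implementation; it accepts a plain millisecond integer or n-second / n-minute strings, and rejects values above the documented 10-minute ceiling:

```python
import re

MAX_LAG_MS = 10 * 60 * 1000  # documented maximum for n-second/n-minute inputs: 10 minutes

def parse_max_client_lag(value):
    """Normalize a Max client lag setting to milliseconds.

    Accepts an integer (milliseconds) or a string such as
    "30 second" or "5 minute"; raises ValueError for anything
    unrecognized or above the 10-minute ceiling.
    """
    if isinstance(value, int):
        ms = value
    else:
        match = re.fullmatch(r"\s*(\d+)\s*(second|minute)s?\s*", value)
        if not match:
            raise ValueError(f"unrecognized lag format: {value!r}")
        n, unit = int(match.group(1)), match.group(2)
        ms = n * (1000 if unit == "second" else 60_000)
    if ms > MAX_LAG_MS:
        raise ValueError("Max client lag cannot exceed 10 minutes")
    return ms
```

For example, the default of 1000 stays 1000 ms, while "30 second" normalizes to 30,000 ms.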

...

Error: Schema name not found

Reason: The schema name is required for Snowpipe Streaming.

Resolution: Provide a schema name in the Snap configuration.

Examples

Ingest Data from One Table to Another in Snowflake

This example pipeline demonstrates real-time data ingestion from one table to another within Snowflake. As new data is added to the source table, the Snowpipe Streaming Snap detects these changes and then streams the new data directly into the target table without significant delays. Here's the execution flow of the pipeline:

  • Source Table: The table where new data entries are continuously added.

  • Snowpipe Streaming Snap: This Snap monitors the source table for new data and streams the data in real time.

  • Target Table: The destination table where the streamed data is inserted.
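The source → stream → target flow above can be illustrated with a simplified polling loop. This is only a conceptual sketch of the pattern, not how the Snap is implemented internally:

```python
def stream_new_rows(source, target, last_seen=0):
    """Copy any rows appended to `source` since the last poll into `target`.

    Returns the new high-water mark so that the next poll only
    picks up rows added after this one.
    """
    new_rows = source[last_seen:]
    target.extend(new_rows)
    return last_seen + len(new_rows)
```

Each call picks up only the rows added since the previous poll, which is the essence of streaming new entries into the target without re-copying existing data.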

...

Download this pipeline.

Step 1: Configure the Snowflake - Execute Snap to delete the existing table.

...

Step 2: Configure the Snowflake - Select Snap with the Schema name and Table name fields. Add a second output view to the Snowflake - Select Snap to pass the schema to the downstream Snowflake - Snowpipe Streaming Snap and enable table creation.

[Image: snowflake-select-config.png]

[Image: snowflake-select-second-output.png]

Step 3: Configure the Snowflake - Snowpipe Streaming Snap to ingest the data into the Snowflake database as and when available. On validation, the data is successfully inserted into the Snowflake database.

[Image: snowpipe-streaming-config.png]

[Image: snowpipe-streaming-output.png]

Ingest Data from a Source into Snowflake

This example pipeline demonstrates how to use the Snowflake - Snowpipe Streaming Snap to ingest data into the Snowflake database from a source as and when it is available. Here's the execution flow of the pipeline:

  1. Data generation: The Sequence Snap generates a set number of data records.

  2. Data formatting: The JSON Formatter Snap converts these records into JSON format.

  3. Data streaming: The Snowflake - Snowpipe Streaming Snap ingests the JSON-formatted data into a specified Snowflake table as soon as the data is available.
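The three stages above can be sketched in plain Python. The record shape and field name are made up for illustration; only the generate → format → ingest pattern mirrors the pipeline:

```python
import json

def generate_records(count):
    """Stage 1: mimic the Sequence Snap producing numbered records."""
    return [{"seq": i} for i in range(1, count + 1)]

def format_as_json(records):
    """Stage 2: mimic the JSON Formatter Snap serializing each record."""
    return [json.dumps(r) for r in records]

def ingest(json_docs, table):
    """Stage 3: mimic streaming the JSON documents into a target table."""
    table.extend(json.loads(d) for d in json_docs)

# Run the three stages end to end, as the pipeline does.
table = []
ingest(format_as_json(generate_records(10)), table)
```

After the run, `table` holds the ten ingested records in order, matching the ten documents the Sequence Snap generates in Step 1 below.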

...

Download this pipeline.

Step 1: Configure the Sequence Snap to generate ten documents.

Step 2: Configure the Mapper Snap to pass the values to the downstream Snowflake - Snowpipe Streaming Snap.

...

Step 3: Configure the Snowflake - Execute Snap to create or replace an existing table with the column datatype.

...

Step 4: Configure the Mapper Snap to pass the values to the downstream Snowflake - Snowpipe Streaming Snap.

...

Step 5: Configure the Snowflake - Snowpipe Streaming Snap to ingest the data into the Snowflake database as and when available.

...

On validation, the data is successfully inserted into the Snowflake database.

...


Snap Pack History


...

Related Content

...