...
The Snowpipe Streaming Snap uses the role defined in the account's Account URL properties field. If no role is defined, the Snap runs `SELECT CURRENT_ROLE()` to determine a suitable role; if that query does not return a role, the Snap falls back to the PUBLIC role.
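The role-resolution order above can be sketched as follows. This is a minimal illustration, not the Snap's implementation; `run_query` is a hypothetical stand-in for a Snowflake session query.

```python
# Sketch of the role-resolution order described above (illustrative only;
# `run_query` is a hypothetical stand-in for a Snowflake session query).
def resolve_role(configured_role, run_query):
    """Return the role the Snap would use, in order of precedence."""
    if configured_role:                            # role set in the Account URL properties
        return configured_role
    try:
        role = run_query("SELECT CURRENT_ROLE()")  # ask the session for its role
        if role:
            return role
    except Exception:
        pass                                       # query failed; fall through
    return "PUBLIC"                                # final fallback

# Example: no configured role, session reports SYSADMIN
print(resolve_role(None, lambda sql: "SYSADMIN"))  # → SYSADMIN
```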
...
| Error | Reason | Resolution |
|---|---|---|
| | The schema name is required for Snowpipe Streaming. | Provide a schema name in the Snap configuration. |
Examples
Ingest Data from One Table to Another in Snowflake
This example pipeline demonstrates real-time data ingestion from one table to another within Snowflake. As new data is added to the source table, the Snowpipe Streaming Snap detects these changes and then streams the new data directly into the target table without significant delays. Here's the execution flow of the pipeline:
Source Table: The table where new data entries are continuously added.
Snowpipe Streaming Snap: This Snap monitors the source table for new data and streams the data in real time.
Target Table: The destination table where the streamed data is inserted.
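The source-to-target flow above can be simulated in a few lines. This is an illustrative sketch only, using in-memory lists in place of Snowflake tables; the actual change detection and streaming are handled by the Snap, and the function name here is a hypothetical.

```python
# Illustrative simulation of the source → stream → target flow above.
# In-memory lists stand in for Snowflake tables; the Snap does the real work.
def stream_new_rows(source, target, offset):
    """Append rows added to `source` since `offset` into `target`;
    return the new offset so already-streamed rows are skipped."""
    new_rows = source[offset:]
    target.extend(new_rows)
    return offset + len(new_rows)

source_table = [{"id": 1}, {"id": 2}]
target_table = []
offset = stream_new_rows(source_table, target_table, 0)   # initial load
source_table.append({"id": 3})                            # new data arrives
offset = stream_new_rows(source_table, target_table, offset)
print(target_table)  # → [{'id': 1}, {'id': 2}, {'id': 3}]
```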
...
Step 1: Configure the Snowflake - Execute Snap to delete the existing table.
...
Step 2: Configure the Snowflake - Select Snap with the Schema name and Table name fields. Add a second output view to the Snowflake - Select Snap to pass the schema to the downstream Snowflake - Snowpipe Streaming Snap, enabling table creation.
Step 3: Configure the Snowflake - Snowpipe Streaming Snap to ingest the data into the Snowflake database as soon as it is available. On validation, the data is successfully inserted into the Snowflake database.
Ingest Data from a Source into Snowflake
This example pipeline demonstrates how to use the Snowflake - Snowpipe Streaming Snap to ingest data into the Snowflake database from a source as soon as it is available. Here's the execution flow of the pipeline:
Data generation: The Sequence Snap generates a set number of data records.
Data formatting: The JSON Formatter Snap converts these records into JSON format.
Data streaming: The Snowflake - Snowpipe Streaming Snap ingests the JSON-formatted data into a specified Snowflake table as soon as the data is available.
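The generate, format, and ingest stages above can be sketched as plain functions. This is an illustrative simulation under assumed names; the actual work is done by the Sequence, JSON Formatter, and Snowpipe Streaming Snaps.

```python
import json

# Illustrative sketch of the generate → format → ingest flow above
# (hypothetical helpers; the actual work is done by the Snaps).
def generate_records(n):
    """Mimic the Sequence Snap: emit n numbered records."""
    return [{"seq": i} for i in range(1, n + 1)]

def format_as_json(records):
    """Mimic the JSON Formatter Snap: serialize each record."""
    return [json.dumps(r) for r in records]

def ingest(rows, table):
    """Mimic the streaming insert into a Snowflake table
    (an in-memory list stands in for the table)."""
    table.extend(json.loads(row) for row in rows)

table = []
ingest(format_as_json(generate_records(10)), table)
print(len(table))  # → 10
```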
...
Step 1: Configure the Sequence Snap to generate ten documents.
Step 2: Configure the Mapper Snap to pass the values to the downstream Snowflake - Snowpipe Streaming Snap.
...
Step 3: Configure the Snowflake - Execute Snap to create a new table, or replace an existing one, with the required column data types.
...
Step 4: Configure the Mapper Snap to pass the values to the downstream Snowflake - Snowpipe Streaming Snap.
...
Step 5: Configure the Snowflake - Snowpipe Streaming Snap to ingest the data into the Snowflake database as soon as it is available.
...
On validation, the data is successfully inserted into the Snowflake database.
...
Snap Pack History
...