Snowflake - Execute

Overview

Snap type:

Write

Description:

This Snap allows you to execute arbitrary Snowflake SQL.

The Snap substitutes the valid JSON paths defined in the WHERE clause of queries/statements with the corresponding values from the incoming document. If the incoming document does not provide a substituting value, the document is written to the error view.

If the Snap executes a SELECT query, it merges the query results into the incoming document and overwrites the values of all existing keys; if the query returns no results, the Snap writes the original document. If an output view is available and an UPDATE/INSERT/MERGE/DELETE statement is executed, the original document that was used to create the statement is written to the output view along with the status of the executed statement.
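
For example, a statement like the following (a minimal sketch with a hypothetical table and JSON path) takes the value of $region from each incoming document, such as {"region": "WEST"}, and merges the matching rows into that document:

select id, name, region from sales_reps where region = $region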

Expected upstream Snaps: You can provide values upstream to define the dynamic variables used in the execute query. You can use the document generator Snaps, such as JSON Generator, upstream.
 
Expected downstream Snaps: The Snap produces one output document for every record retrieved; hence, any document-processing Snap can be used downstream. 

The Snowflake Execute Snap is for simple DML (SELECT, INSERT, UPDATE, DELETE) type statements. 

Prerequisites:

You must have the minimum permissions on the database to execute Snowflake Snaps. To check whether you already have them, retrieve your current set of permissions with the following commands:

SHOW GRANTS ON DATABASE <database_name>
 
SHOW GRANTS ON SCHEMA <schema_name>
 
SHOW GRANTS TO USER <user_name>

Security Prerequisites

You should have the following permissions, at a minimum, in your Snowflake account to execute this Snap:

  • Usage (DB and Schema): Privilege to use the database, role, and schema.
  • Create table: Privilege to create a table in the database, role, and schema.

The following commands enable minimum privileges in the Snowflake Console:

grant usage on database <database_name> to role <role_name>;
grant usage on schema <database_name>.<schema_name> to role <role_name>;
 
grant "CREATE TABLE" on database <database_name> to role <role_name>;
grant "CREATE TABLE" on schema <database_name>.<schema_name> to role <role_name>;

For more information on Snowflake privileges, refer to Access Control Privileges.

Internal SQL Commands

The permissions to grant for database usage and table creation depend on the queries you provide in this Snap.
Support and limitations:

Snowflake Execute and Multi-Execute Snaps may break existing Pipelines if the JDBC Driver is updated to a newer version.

With the updated JDBC driver (version 3.12.3), the Snowflake Execute and Multi-Execute Snaps' output displays a Status of "-1" instead of "0" without the Message field upon successfully executing DDL statements. If your Pipelines use these Snaps and downstream Snaps use the Status field's value from these, you must modify the downstream Snaps to proceed on a status value of -1 instead of 0.

This change in the Snap behavior follows from the change introduced in the Snowflake JDBC driver in version 3.8.1:
"Statement.getUpdateCount() and PreparedStatement.getUpdateCount() return the number of rows updated by DML statements. For all other types of statements, including queries, they return -1."

Behavior change
  • In 4.26, when stored procedures were called using the Database Execute Snaps, the queries were treated as write queries instead of read queries, so the output displayed the message and status keys after executing the stored procedure.
    In 4.27, all Database Execute Snaps run stored procedures correctly, that is, the queries are treated as read queries. The output now displays the message key and the OUT parameters of the procedure (if any). The status key is not displayed.
  • If the stored procedure has no OUT parameters, only the message key is displayed, with the value success.

Existing Pipelines that map the status key or rely on the previous behavior will fail, so you might need to revisit your Pipeline design.

Account: 

This Snap uses account references created on the Accounts page of SnapLogic Manager to handle access to this endpoint. See Configuring Snowflake Accounts for information on setting up this type of account.

Views:
Input

This Snap has at most one document input view. If the input view is defined, the WHERE clause can substitute values from incoming documents for a given expression.

Output

This Snap has at most one document output view.

Error

This Snap has at most one error view and produces zero or more documents in the view.

Settings

Label*

Specify the name for the Snap. You can make the name more specific, especially if your Pipeline has more than one of the same Snap.

SQL statement*


Specify the Snowflake SQL statement to execute on the server. Document value substitution is performed on literals starting with '$', for example, $people.name is substituted with its value available in the incoming document.

In DB Execute Snaps, if the Snowflake SQL statement is not an expression, JSON paths, such as $para, are allowed only in the WHERE clause.

If the query statement starts with SELECT (case-insensitive), the Snap regards it as a select-type query and executes once per input document. If not, the Snap regards it as a write-type query and executes in batch mode.

This Snap does not allow you to inject Snowflake SQL identifiers; for example, select * from people where $columnName = abc is not supported.
Only values can be substituted, because the Snap uses prepared statements for execution, which results in, for example, select * from people where address = ?.

Without using expressions

  • email = 'you@example.com' or email = $email 
  • emp=$emp

Using expressions

  • "EMPNO=$EMPNO and ENAME=$EMPNAME"
  • "emp='" + $emp + "'"
  • "EMPNO=" + $EMPNO + " and ENAME='" + $EMPNAME+ "'"

Caution

Using expressions that join strings together to create SQL queries or conditions carries a potential SQL injection risk and is hence unsafe. Ensure that you understand all the implications and risks involved before concatenating strings with the expression toggle (=) enabled.

Note: 

  • If '$' is not part of a JSON path, escape it as "\$" so that it is executed as is, such as SELECT \$2, \$3 FROM mytable.
    If the character before '$' is alphanumeric, there is no need to escape '$'. For example, SELECT metadata$filename ... (see the sketch after this list).
  • When an escape character is an integral part of an expression-enabled query statement, precede it with another escape character.
    For example, if you enable expressions for the SQL statement property, precede the backslash with another backslash when entering such a statement.
  • The '$' sign and identifier characters, such as double quotes ("), single quotes ('), and backquotes (`), are reserved characters and should not be used in comments or for purposes other than their originally intended purpose.
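
The following sketch (hypothetical table and stage names) summarizes the escaping rules from the note above:

-- '$' is not part of a JSON path, so it is escaped:
select \$2, \$3 from mytable
-- '$' follows an alphanumeric character, so no escape is needed:
select metadata$filename, metadata$file_row_number from @my_stage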

Single quotes in values must be escaped

Relational databases (RDBMS) treat the single quote (') as a special symbol, so single quotes in data or values passed through a DML query may cause the Snap to fail when the query is executed. To escape a single quote within such values, pass two consecutive single quotes in its place.

For example:

If String               To pass this value        Use
Has no single quotes    Schaum Series             'Schaum Series'
Contains single quotes  O'Reilly's Publication    'O''Reilly''s Publication'
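
For example, an INSERT statement (hypothetical table) that passes the second value from the table above is written as:

insert into books (title) values ('O''Reilly''s Publication')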

Pass through


Select this checkbox to enable the Snap to pass the input document to the output view under the key named original. This option applies only to Execute Snaps with a SELECT statement.

Default Value: Selected

Ignore empty result


Select this checkbox to prevent the Snap from writing any document to the output view when a SELECT operation produces no results. If this checkbox is not selected and the Pass through checkbox is selected, the input document is passed through to the output view.

Default Value: Not selected

Number of Retries

Specify the maximum number of attempts to be made to receive a response. The request is terminated if the attempts do not result in a response.

Default Value: 0
Example: 3


Retry Interval (seconds)

Specify the time interval between two successive retry requests. A retry happens only when the previous attempt resulted in an exception. 

Default Value: 1
Example: 10

Use Result Query

Select this checkbox to write the query execution result to the Snap's output view after successful execution. The output of the Snap is enclosed within the key Result Query, and the value is the actual output produced by the SQL query. See the example Snowflake Execute with Use Result Query enabled to learn more about this option.

This option lets you track the query's execution by clearly indicating successful execution and the number of records affected, if any.


Handle Timestamp and Date Time Data

Specify how the Snap must handle timestamp and date time data. The available options are:

  • Default Date Time format in UTC Time Zone: The Snowflake date time data are represented in UTC Time Zone.

  • SnapLogic Date Time format in Regional Time Zone: The Snowflake date time data are represented in the same regional time zone value, as provided in the Snowflake account.

Default value: Default Date Time format in UTC Time Zone

Recommendation:

If you use the TIMESTAMP_TZ and TIMESTAMP_LTZ data types in this Snap, we recommend that you use SnapLogic Date Time format in Regional Time Zone to ensure that the Timestamp data output of the target table is in the same format as in the source table.

(Source Table and Target Table screenshots omitted.)

Manage Queued Queries

Select an option from the list to determine whether the Snap should continue or cancel the execution of the queued Snowflake Execute SQL queries when you stop the Pipeline. The available options are:

  • Continue to execute queued queries when pipeline is stopped or if it fails
  • Cancel queued queries when pipeline is stopped or if it fails

If you select Cancel queued queries when pipeline is stopped or if it fails, the read queries under execution are canceled, whereas the write type of queries under execution are not canceled. Snowflake internally determines which queries are safe to be canceled and cancels those queries.

Default value: Continue to execute queued queries when pipeline is stopped or if it fails

Snap Execution

Select one of the three modes in which the Snap executes. Available options are:

  • Validate & Execute: Performs limited execution of the Snap, and generates a data preview during Pipeline validation. Subsequently, performs full execution of the Snap (unlimited records) during Pipeline runtime.
  • Execute only: Performs full execution of the Snap during Pipeline execution without generating preview data.
  • Disabled: Disables the Snap and all Snaps that are downstream from it.

Examples 

Snowflake Execute with Use Result Query enabled

This example Pipeline demonstrates how to insert data into a table using the Snowflake Execute Snap.

First, we configure the Snowflake Execute Snap as follows. Note that we select the Use Result Query checkbox to view the statement result output.
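
The configured statement is not reproduced in this article; a minimal sketch of the kind of INSERT used in this example (hypothetical table and values) is:

insert into "PUBLIC"."EMPLOYEE" (ID, FIRSTNAME) values (1, 'Ann')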


Upon execution, the output is enclosed within the key Result Query.

When the Use Result Query checkbox is disabled, the output preview displays the result without the Result Query wrapper.



Executing the Snowflake SQL query using the Execute Snap

The following example demonstrates the execution of Snowflake SQL query using the Snowflake Execute Snap.

First, we configure the Execute Snap with the query select * from "PRASANNA"."ADOBEDATA", which returns the data from ADOBEDATA.

Upon successful execution, the Snap displays the following output in its data preview.

Snowflake Execute Snap supports UDFs

User-defined functions (UDFs) created in the Snowflake console can be executed using the Snowflake - Execute Snap. In the following example, the SQL statement is defined and the Snap is then executed with that statement.

First, the Snowflake Execute Snap is configured with the user-defined SQL statement; area_of_circle(3.0) is a UDF call here.
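
The UDF itself is created in the Snowflake console and its definition is not shown in this article; a minimal sketch of such a scalar SQL UDF and the statement configured in the Snap might look like this:

create or replace function area_of_circle(radius float)
  returns float
  as 'pi() * radius * radius';

select area_of_circle(3.0) as area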

Then a Mapper Snap is used to define the columns to be picked up from the output of the Snowflake Execute Snap.


See Also

https://docs.snowflake.com/en/developer-guide/udf/sql/udf-sql-scalar-functions.html

https://docs.snowflake.com/en/sql-reference/udf-overview.html

https://docs.snowflake.com/en/user-guide-getting-started.html

Snap Pack History

Release | Snap Pack Version | Type | Updates

4.29 | main15993 | Stable | Upgraded with the latest SnapLogic Platform release.

4.28 Patch | 428patches15236 | Latest
  • Updated the Snowflake - Bulk Upsert Snap for the following:
    • The Snap displayed an incorrect resolution when the length of the value exceeded the value defined in a column.
    • The Snap failed with a NullPointerException when no value was provided for the Error Limit and Error Percentage Limit fields.
    • The Snap displayed an incorrect error message when S3 details were not provided.
    • Added the On Error dropdown list, where you can select an action to perform when the Snap encounters errors in a file.
  • Removed the Username field in the following accounts to allow reusing of accounts among different users:
  • The following enhancements were made to the Snowflake - Bulk Load Snap:
    • The Snap skips AWS account validation if the S3 Storage Integration property is provided.
    • The Snap skips the validate command if the S3 Storage Integration property is not provided.
    • The Snap overrides the storage integration specified in the account settings with the storage integration specified in the Snap settings.
  • Enhanced the Snowflake - Unload Snap to skip AWS account validation if the S3 Storage Integration property is provided.

4.28 | main14627 | Stable

4.27 | 427patches12999 | Latest | Enhanced the Snowflake SCD2 Snap to support Pipeline parameters for Natural key and Cause-historization fields.

4.27 | main12833 | Stable
  • Enhanced the Snowflake S3 Database Account to skip batch execution when the Batch size value is one. When the Batch size is greater than one, the batch is executed.
  • Added the following truncate options to the Snowflake - Bulk Load Snap to truncate existing data before performing data load to more efficiently transfer data where possible.
    • Truncate data: Truncates existing data before performing data load.
    • Truncate columns: Truncates column values that are larger than the maximum column length in the table.
  • Enhanced the Snowflake - Execute Snap to invoke stored procedures.

4.26 Patch | 426patches11469 | Latest | Fixed an issue with the Snowflake Insert and Snowflake Bulk Load Snaps where schema names or database names containing an underscore (_) caused Pipelines to time out.

4.26 | main11181 | Stable
  • Enhanced the Snowflake - Lookup and Snowflake SCD2 Snaps with the Input Date Format field to select from the following two options:
    • Continue to execute the snap with the given input Date format
    • Auto Convert the format to Snowflake default format
  • Added a new account type, Snowflake Google Storage Database, to connect to Google Cloud Storage to load data.
  • Added support for all existing Snowflake Snap accounts to connect to a Snowflake instance hosted on the Google Cloud Platform.

4.25 | 425patches10190 | Latest | Enhanced the Snowflake S3 Database and Snowflake S3 Dynamic accounts with a new field, S3 AWS Token, that allows you to connect to private and protected Amazon S3 buckets.

4.25 | main9554 | Stable | Upgraded with the latest SnapLogic Platform release.

4.24 Patch | 424patches8905 | Latest | Enhanced the Snowflake - Bulk Load Snap to allow transforming data using a new field, Select Query, before loading data into the Snowflake database. This option enables you to query the staged data files by either reordering the columns or loading a subset of table data from a staged file. This Snap supports CSV and JSON file formats for this data transformation.

4.24 | main8556 | Stable | Enhanced the Snowflake - Select Snap to return only the selected output fields or columns in the output schema (second output view) using the Fetch Output Fields In Schema checkbox. If the Output Fields field is empty, all the columns are visible.

4.23 Patch | 423patches7905 | Latest | Fixed the performance issue in the Snowflake - Bulk Load Snap while using External Staging on Amazon S3.

4.23 | main7430 | Stable

4.22 Patch | 422patches7246 | Latest | Fixed an issue with the Snowflake Snaps that failed while repeatedly displaying the error message javax.management.MalformedObjectNameException: Invalid character '=' in value part of property when the Snowflake connection URL contains '=' or ':'.

4.22 Patch | 422patches6849 | Latest

4.22 | main6403 | Stable | Upgraded with the latest SnapLogic Platform release.

4.21 Patch | 421patches6272 | Latest | Fixed the issue where the Snowflake SCD2 Snap generated two output documents despite no changes to the Cause-historization fields with DATE, TIME, and TIMESTAMP Snowflake data types, and with the Ignore unchanged rows field selected.

4.21 Patch | 421patches6144 | Latest
  • Fixed the following issues with DB Snaps:
    • The connection thread waits indefinitely, causing subsequent connection requests to become unresponsive.
    • Connection leaks occur during Pipeline execution.
  • Fixed the exception RefCnt has gone negative across the Snaps in the Snowflake Snap Pack.

4.21 Patch | db/snowflake8860 | Latest | Added a new field, Handle Timestamp and Date Time Data, to the Snowflake Lookup Snap. This field enables you to decide whether the Snap should translate UTC time to your local time and the format of the Date Time data.

4.21 Patch | MULTIPLE8841 | Latest | Fixed the connection issue in Database Snaps by detecting and closing open connections after the Snap execution ends.

4.21 | snapsmrc542 | Stable | Upgraded with the latest SnapLogic Platform release.
4.20 Patch | db/snowflake8800 | Latest | Certified the Snowflake Snap Pack against JDBC Driver version 3.12.3. With this driver, the Snowflake Execute and Multi-Execute Snaps return a Status of "-1" instead of "0" for successfully executed DDL statements; see Support and limitations above for details.

4.20 Patch | db/snowflake8758 | Latest | Re-release of fixes from db/snowflake8687 for 4.20: Fixed the Snowflake Bulk Load Snap where the Snap fails to load documents containing single quotes when the Load empty strings checkbox is not selected.

4.20 | snapsmrc535 | Stable | Upgraded with the latest SnapLogic Platform release.
4.19 Patch | db/snowflake8687 | Latest | Fixed the Snowflake Bulk Load Snap where the Snap fails to load documents containing single quotes when the Load empty strings checkbox is not selected.

4.19 Patch | db/snowflake8499 | Latest | Added the Handle Timestamp and Date Time Data property to the Snowflake - Execute and Snowflake - Select Snaps. This property enables you to decide whether the Snap should translate UTC time to your local time.

4.19 Patch | db/snowflake8412 | Latest | Fixed an issue with the Snowflake - Update Snap wherein the Snap is unable to perform operations when:
  • An expression is used in the Update condition property.
  • Input data contain the character '?'.

4.19 | snapsmrc528 | Stable
  • Added a new field set, Auto Historization Query, in the Snowflake SCD2 Snap to support auto-historization of column data. With this enhancement, you can detect whether the incoming record is a historical event or a current event.
  • Raised the minimum buffer size in the Snowflake - Bulk Upsert and Snowflake - Bulk Load Snaps to 6 MB.
4.18 Patch | db/snowflake8044 | Latest | Fixed an issue with the Snowflake - Select Snap wherein the Snap converts the Snowflake-provided timestamp value to the local timezone of the account.

4.18 Patch | db/snowflake8044 | Latest | Enhanced the Snap Pack to support AWS SDK 1.11.634 to fix the NullPointerException issue in the AWS SDK. This issue occurred in AWS-related Snaps that had an HTTP or HTTPS proxy configured without a username and/or password.

4.18 Patch | MULTIPLE7884 | Latest | Fixed an issue with the PostgreSQL grammar to better handle single quote characters.

4.18 Patch | db/snowflake7821 | Latest | Fixed an issue with the Snowflake - Execute Snap wherein the Snap is unable to support the '$' character in query syntax.

4.18 Patch | MULTIPLE7778 | Latest | Updated the AWS SDK library version to default to the Signature Version 4 signing process for API requests across all regions.

4.18 Patch | db/snowflake7739 | Latest
  • Fixed an issue with the Snowflake - Bulk Upsert Snap wherein the Snap fails when using a pipeline parameter in Key columns.
  • Fixed an issue with the Snowflake - Unload Snap wherein the Snap does not abort the query when you stop the Pipeline execution.

4.18 | snapsmrc523 | Stable | Added the Use Result Query property to the Multi Execute Snap, which enables you to write results to an output view.

4.17 | ALL7402 | Latest | Pushed automatic rebuild of the latest version of each Snap Pack to SnapLogic UAT and Elastic servers.

4.17 Patch | db/snowflake7396 | Latest | Fixed an issue wherein bit data types in the Snowflake - Select table convert to true or false instead of 0 or 1.

4.17 Patch | db/snowflake7334 | Latest | Added AWS Server-Side Encryption support for AWS S3 and AWS KMS (Key Management Service) to the Snowflake Bulk Load, Snowflake Bulk Upsert, and Snowflake Unload Snaps.

4.17 | snapsmrc515 | Latest
  • Fixed an issue with the Snowflake Execute Snap wherein the Snap would send the input document to the output view even if the Pass through field is not selected in the Snap configuration. With this fix, the Snap sends the input document to the output view, under the key original, only if you select the Pass through field.
  • Added the Snap Execution field to all Standard-mode Snaps. In some Snaps, this field replaces the existing Execute during preview checkbox.
4.16 Patch | db/snowflake6945 | Latest | Fixed an issue with the Snowflake Lookup Snap failing when the Date data type is used in JavaScript functions.

4.16 Patch | db/snowflake6928 | Latest | Added support for file format options for input data from upstream Snaps to the Snowflake Bulk Load Snap.

4.16 Patch | db/snowflake6819 | Latest
  • Snowflake Bulk Load: Added a new property, Buffer size (MB). Configure this to specify the size limit of each buffer when writing to external staging systems such as S3.
  • Fixed an issue with the Lookup Snap passing data simultaneously to the output and error views when some values contained spaces at the end.

4.16 | snapsmrc508 | Stable
  • Snowflake Account: Added the ability to use SnapLogic to securely connect to and query a Snowflake instance using Azure Blob as its storage layer.
  • Snowflake Account: Added support for Snowflake JDBC JAR version 3.6.17.
  • Snowflake Unload, Bulk Load, and Bulk Upsert: Updated the Snaps to enable SnapLogic users to successfully connect to a Snowflake instance to query, bulk load, and unload data from Azure Blob storage.

4.15 | snapsmrc500 | Stable
  • Added two new Snaps, Snowflake - Multi Execute and Snowflake SCD2. Snowflake - Multi Execute is used for executing multiple DDL and DML queries on the Snowflake DB. Snowflake SCD2 is used for Type 2 field historization.
  • Enhanced the Snowflake Bulk Upsert Snap to improve the Snap's performance.
  • Enhanced the Snowflake Snap Pack to reflect Azure certification.
4.14 | snapsmrc490 | Stable | Upgraded with the latest SnapLogic Platform release.

4.13 | snapsmrc486 | Stable | Upgraded with the latest SnapLogic Platform release.

4.12 | snapsmrc480 | Stable | Upgraded with the latest SnapLogic Platform release.

4.11 Patch | MULTIPLE4377 | Latest | Fixed a document call issue that was slowing down the Snowflake Bulk Load Snap.

4.11 Patch | db/snowflake4283 | Latest | Snowflake Bulk Load: Fixed an issue by adding the PUT command to the list of DDL commands for Snowflake.

4.11 Patch | db/snowflake4273 | Latest | Snowflake Bulk Load: Resolved an issue with Snowflake Bulk Load delimiter consistency (comma and newline).

4.11 | snapsmrc465 | Stable | Upgraded with the latest SnapLogic Platform release.

4.10 Patch | snowflake4133 | Latest | Updated the Snowflake Bulk Load Snap with the Preserve case sensitivity property to preserve the case sensitivity of column names.

4.10 | snapsmrc414 | Stable
  • Updated the Snowflake Bulk Load Snap with the Load empty strings property so that empty string values in the input documents are loaded as empty strings into string-type fields.
  • Updated the Snowflake Bulk Load Snap with Table Columns to support the order of the entries on the staged files that contain a subset of the columns in the Snowflake table.
  • Added the Use Result Query property to view the output preview field with a result statement.
  • Tested with JDBC JAR version 3.1.1 on the Database and the Dynamic accounts.

4.9.0 Patch | snowflake3234 | Latest | Enhanced Snowflake - Execute Snap results to include additional details.

4.9.0 Patch | snowflake3125 | Latest | Addressed an issue in Snowflake Bulk Load where the comma character in a value is not escaped.

4.9 | snapsmrc405 | Stable | Added the JDBC Driver Class property to enable the user to custom-configure the JDBC driver in the Database and the Dynamic accounts.

4.8.0 Patch | snowflake2760 | Latest | Potential fix for a JDBC deadlock issue.

4.8.0 Patch | snowflake2739 | Latest | Addressed an issue with the Snowflake schema not being correctly represented in the Mapper Snap.

4.8 | snapsmrc398 | Stable
  • Added the Info tab to accounts.
  • Database accounts now invalidate connection pools if account properties are modified and login attempts fail.
  • Enhanced the default count of input and output views (UI behavior) of the Snaps for better user experience.

    Snowflake Snap Pack views

    Snap          Initial (4.7) Input-Output views    Current (4.8) Input-Output views
    Bulk Load     1-0                                 1-1
    Bulk Upsert   1-0                                 1-1
    Insert        1-0                                 1-1
    Update        1-0                                 1-1
    Delete        0-0                                 0-1
    Execute       0-0                                 0-1
    Unload        0-0                                 0-1
    Select        0-1                                 0-1
    Table List    0-1                                 0-1
    Lookup        0-1                                 0-1

    Note that the Snowflake Select, Table List, and Lookup Snap views remain unchanged.