In this article

Overview

You can use this Snap to execute arbitrary Snowflake SQL. This Snap works only with single queries.

note

The Snap substitutes the valid JSON paths defined in the WHERE clause of queries/statements with the values present in the incoming document. If the incoming document does not carry a substituting value, the document is written to the error view. If the Snap executes a SELECT query, it merges the query results into the incoming document, overwriting the values of any existing keys; if the query returns no results, the Snap writes the original document unchanged. If an output view is available and an UPDATE/INSERT/MERGE/DELETE statement is executed, the original document that was used to create the statement is written to the output view along with the status of the executed statement.
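For example (a hypothetical sketch; the table and column names are illustrative), a SQL statement such as the following substitutes $empno from each incoming document:

select * from employees where empno = $empno

Because the Snap uses prepared statements, an incoming document such as { "empno": 7839 } results in the query select * from employees where empno = ? with 7839 bound as the parameter value.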

Snap Type

The Snowflake - Execute Snap is a Write-type Snap that executes arbitrary Snowflake SQL.

Prerequisites

Security Prerequisites

You should have, at a minimum, the following permissions in your Snowflake account to execute this Snap:

The following commands enable minimum privileges in the Snowflake Console:

grant usage on database <database_name> to role <role_name>;
grant usage on schema <database_name>.<schema_name> to role <role_name>;
 
grant create table on all schemas in database <database_name> to role <role_name>;
grant create table on schema <database_name>.<schema_name> to role <role_name>;

For more information on Snowflake privileges, refer to Access Control Privileges.

Internal SQL Commands

The privileges you need to grant for database usage and table creation depend on the queries you provide in this Snap.
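
For example (an illustrative sketch, not an exhaustive list), if the SQL statement you provide only reads from existing tables, usage and SELECT privileges on the relevant schema are typically sufficient:

grant usage on database <database_name> to role <role_name>;
grant usage on schema <database_name>.<schema_name> to role <role_name>;
grant select on all tables in schema <database_name>.<schema_name> to role <role_name>;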

Support for Ultra Pipelines

Works in Ultra Pipelines. However, we recommend that you not use this Snap in an Ultra Pipeline.

Limitations

note

Snowflake Execute and Multi-Execute Snaps may break existing Pipelines if the JDBC Driver is updated to a newer version.

With the updated JDBC driver (version 3.12.3), the output of the Snowflake Execute and Multi-Execute Snaps displays a Status of "-1" instead of "0", without the Message field, upon successfully executing DDL statements. If your Pipelines use these Snaps and their downstream Snaps rely on the Status field's value, you must modify the downstream Snaps to proceed on a status value of -1 instead of 0.

This change in the Snap behavior follows from the change introduced in the Snowflake JDBC driver in version 3.8.1:
"Statement.getUpdateCount() and PreparedStatement.getUpdateCount() return the number of rows updated by DML statements. For all other types of statements, including queries, they return -1."

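For example, a downstream Filter or Router Snap condition that previously checked for a status of 0 could be updated to accept both values (a hypothetical sketch, assuming the status is exposed as $status in the output document):

$status == 0 || $status == -1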

Known Issues

Because of performance issues, all Snowflake Snaps now ignore the Cancel queued queries when pipeline is stopped or if it fails option for Manage Queued Queries, even when selected. Snaps behave as though the default Continue to execute queued queries when the Pipeline is stopped or if it fails option were selected.

Behavior Change

note

If you have existing Pipelines that are mapped to the status key or to the previous description, those Pipelines will fail. You might need to revisit your Pipeline design.

Snap Views

Input

  • Format: Document

  • Number of Views: Min: 0, Max: 1

  • Examples of Upstream Snaps: JSON Generator, Binary to Document

  • Description: Incoming documents are first written to a staging file on Snowflake's internal staging area. A temporary table is created on Snowflake with the contents of the staging file. An update operation is then run to update existing records in the target table and/or an insert operation is run to insert new records into the target table.

Output

  • Format: Document

  • Number of Views: Min: 0, Max: 1

  • Examples of Downstream Snaps: Mapper, Snowflake Execute

  • Description: If an output view is available, the output document displays the number of input records and the status of the bulk upload.

Error

Error handling is a generic way to handle errors without losing data or failing the Snap execution. You can handle the errors that the Snap might encounter when running the Pipeline by choosing one of the following options from the When errors occur list under the Views tab:

  • Stop Pipeline Execution: Stops the current pipeline execution if the Snap encounters an error.

  • Discard Error Data and Continue: Ignores the error, discards that record, and continues with the remaining records.

  • Route Error Data to Error View: Routes the error data to an error view without stopping the Snap execution.

Learn more about Error handling in Pipelines.

Snap Settings

  • Asterisk (*): Indicates a mandatory field.

  • Suggestion icon: Indicates a list that is dynamically populated based on the configuration.

  • Expression icon: Indicates whether the value is an expression (if enabled) or a static value (if disabled). Learn more about Using Expressions in SnapLogic.

  • Add icon: Indicates that you can add fields in the field set.

  • Remove icon: Indicates that you can remove fields from the field set.

Field Name

Field Type

Description

Label*

Default Value: Snowflake - Execute
Example: Load Employee Tables

String

Specify the name for the Snap. You can make the name more specific, especially if your Pipeline has more than one of the same Snap.

SQL Statement*

Default Value: N/A

Example: INSERT into SnapLogic.book (id, book) VALUES ($id,$book)

String/Expression

Specify the Snowflake SQL statement to execute on the server.

note

We recommend that you add a single query in the SQL Statement field.

Document value substitution is performed on literals starting with '$', for example, $people.name is substituted with its value available in the incoming document.

In DB Execute Snaps, if the Snowflake SQL statement is not an expression, JSON paths (such as $para) are allowed only in the WHERE clause.

If the query statement starts with SELECT (case-insensitive), the Snap regards it as a select-type query and executes once per input document. If not, the Snap regards it as a write-type query and executes in batch mode.

This Snap does not allow you to inject Snowflake SQL, for example, select * from people where $columnName = abc.
Only values can be substituted, because the Snap uses prepared statements for execution, which results in, for example, select * from people where address = ?.

Without using expressions

  • EmpId = 12 

  • email = 'you@example.com'

Using expressions

  • "EMPNO=$EMPNO and ENAME=$EMPNAME"

  • email = $email 

  • emp=$emp

  • "emp='" + $emp + "'"

  • "EMPNO=" + $EMPNO + " and ENAME='" + $EMPNAME+ "'"

Using expressions that join strings together to create SQL queries or conditions poses a potential SQL injection risk and is therefore unsafe. Ensure that you understand all implications and risks involved before using concatenation of strings with the '=' Expression toggle enabled.
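
For instance (an illustrative sketch using the field names from the examples above), the first form below substitutes only the value through the prepared statement, whereas the second builds the SQL text by string concatenation and is therefore exposed to injection:

  • Safer: "select * from people where email = $email"

  • Unsafe: "select * from people where email = '" + $email + "'"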

note
  • If '$' is not part of the JSON path, escape it as "\$" so that it can be executed as it is, such as SELECT \$2, \$3 FROM mytable. If the character before '$' is alphanumeric, there is no need to escape '$'. For example, SELECT metadata$filename ...

  • When an escape character is an integral part of an expression-enabled query statement, precede it with another escape character. For example, if you enable expressions for the SQL statement field, precede the backslash with another backslash when entering such a statement.

  • The '$' sign and identifier characters, such as double quotes ("), single quotes ('), or back quotes (`), are reserved characters and should not be used in comments or for purposes other than their originally intended purpose.

Single quotes in values must be escaped

Any relational database (RDBMS) treats single quotes (') as special symbols, so single quotes in the data or values passed through a DML query can cause the Snap to fail when the query is executed. To escape a single quote within such values, replace it with two consecutive single quotes.

For example:

If the string             To pass this value          Use
Has no single quotes      Schaum Series               'Schaum Series'
Contains single quotes    O'Reilly's Publication      'O''Reilly''s Publication'
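
In a SQL statement, the escaped value would be passed as in the following sketch (the table and column names are hypothetical):

INSERT into books (title) VALUES ('O''Reilly''s Publication')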

Query type

 

Default Value: Auto
Example: Read

Dropdown list/Expression

Select the type of query for your SQL statement (Read or Write).

When Auto is selected, the Snap tries to determine the query type automatically.
If the execution result of the query is not as expected, you can change the query type to Read or Write.

Pass through

Default Value: Selected

Checkbox

Select this checkbox to enable the Snap to pass the input document to the output view under the key named original. This option applies only to Execute Snaps with a SELECT statement.

Ignore empty result

Default Value: Deselected

Checkbox

Select this checkbox to prevent the Snap from writing any document to the output view when a SELECT operation produces no results. If this checkbox is deselected and the Pass through checkbox is selected, the input document is passed through to the output view.

Number of Retries

Default Value: 0
Example: 3

Integer

Specify the maximum number of attempts to be made to receive a response. The request is terminated if the attempts do not result in a response.

If the value is larger than 0, the Snap first downloads the target file into a temporary local file. If any error occurs during the download, the Snap waits for the time specified in the Retry interval and attempts to download the file again from the beginning. When the download is successful, the Snap streams the data from the temporary file to the downstream Pipeline. All temporary local files are deleted when they are no longer needed.

note

Ensure that the local drive has sufficient free disk space to store the temporary local file.

Minimum value: 0

Retry Interval (seconds)

Default Value: 1
Example: 10

Integer

Specify the time interval between two successive retry requests. A retry happens only when the previous attempt resulted in an exception. 

Use Result Query

Default Value: Deselected

Checkbox

Select this checkbox to write the query execution result to the Snap's output view after successful execution. The output of the Snap is enclosed within the key Result Query, and the value is the actual output produced by the SQL query. See the example Snowflake Execute with Use Result Query enabled to learn more about this option.

This option helps you track the query's execution by clearly indicating a successful execution and the number of records affected, if any.

Handle Timestamp and Date Time Data

Default value: Default Date Time format in UTC Time Zone

Example: SnapLogic Date Time format in Regional Time Zone

Dropdown list

Specify how the Snap must handle timestamp and date time data. The available options are:

  • Default Date Time format in UTC Time Zone: The Snowflake date time data are represented in UTC Time Zone.

  • SnapLogic Date Time format in Regional Time Zone: The Snowflake date time data are represented in the same regional time zone value, as provided in the Snowflake account.


If you use the Timestamp TZ and Timestamp LTZ data types in this Snap, we recommend that you use SnapLogic Date Time format in Regional Time Zone to ensure that the Timestamp data output of the target table is in the same format as in the source table.
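
For reference, a minimal sketch of a table that uses these time zone-aware data types (the table and column names are hypothetical):

create table ts_demo (
  created_tz timestamp_tz,
  created_ltz timestamp_ltz
);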

Source Table

Target Table


Manage Queued Queries

Default value: Continue to execute queued queries when pipeline is stopped or if it fails

Example: Cancel queued queries when the Pipeline is stopped or if it fails

Dropdown list

Select an option from the list to determine whether the Snap should continue or cancel the execution of the queued Snowflake Execute SQL queries when you stop the Pipeline. The available options are:

  • Continue to execute queued queries when pipeline is stopped or if it fails

  • Cancel queued queries when pipeline is stopped or if it fails

note

If you select Cancel queued queries when pipeline is stopped or if it fails, the read queries under execution are canceled, whereas the write queries under execution are not. Snowflake internally determines which queries are safe to cancel and cancels those queries.

Snap Execution

Default Value: Execute only
Example: Validate & Execute

Dropdown list

Select one of the following modes in which the Snap executes:

  • Validate & Execute: Performs limited execution of the Snap and generates a data preview during Pipeline validation, then performs full execution of the Snap (unlimited records) during Pipeline runtime.

  • Execute only: Performs full execution of the Snap during Pipeline execution without generating preview data.

  • Disabled: Disables the Snap and all Snaps that are downstream from it.

Examples 

Snowflake Execute with Use Result Query enabled

This example Pipeline demonstrates how to insert data into a table using the Snowflake Execute Snap.

First, we configure the Snowflake Execute Snap as follows. Note that we select the Use Result Query checkbox to view the statement result output.
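
For instance, an INSERT statement similar to the one shown in the SQL Statement example above could be used in this configuration (a hypothetical sketch; the actual statement appears in the Snap settings screenshot):

INSERT into SnapLogic.book (id, book) VALUES ($id,$book)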


Upon execution, we see the following output enclosed within the key Result Query.

The following screenshot displays the output preview when we disable the Use Result Query checkbox.



Executing the Snowflake SQL query using the Execute Snap

The following example demonstrates the execution of Snowflake SQL query using the Snowflake Execute Snap.

First, we configure the Execute Snap with the query select * from "PRASANNA"."ADOBEDATA", which returns the data from ADOBEDATA.

Upon successful execution, the Snap displays the following output in its data preview.

Snowflake Execute Snap supports UDFs

User-defined functions (UDFs) created in the Snowflake console can be executed using the Snowflake - Execute Snap. In the following example, the SQL statement is defined and the Snap is then executed with it.
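
For reference, a scalar SQL UDF similar to the one used here could be defined in the Snowflake console as follows (a minimal sketch; the exact definition used in this example may differ):

create or replace function area_of_circle(radius float)
  returns float
  as 'pi() * radius * radius';

The SQL Statement in the Snap can then call it, for example: select area_of_circle(3.0)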

First, the Snowflake Execute Snap is used to provide the user-defined SQL statement; here, area_of_circle(3.0) is the UDF call. The Snap settings and the output view are as follows:

Then, a Mapper Snap is used to define the columns that need to be picked up from the output of the Snowflake Execute Snap.

See Also

https://docs.snowflake.com/en/developer-guide/udf/sql/udf-sql-scalar-functions.html

https://docs.snowflake.com/en/sql-reference/udf-overview.html

https://docs.snowflake.com/en/user-guide-getting-started.html

Snap Pack History