ELT Execute (Archived)


Overview

You can use this Snap to execute SQL queries in the target database—Snowflake, Redshift, Azure Synapse, Databricks Lakehouse Platform, or BigQuery. You can run the following types of queries using this Snap:

  • Data Definition Language (DDL) queries

  • Data Manipulation Language (DML) queries

  • Data Control Language (DCL) queries
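For reference, each of these statement types might look like the following on a Snowflake target (the table, column, and role names here are purely illustrative):

```sql
-- DDL: define or alter schema objects
CREATE TABLE sales_backup (id INT, amount NUMBER(10,2));

-- DML: change the data stored in a table
INSERT INTO sales_backup (id, amount) VALUES (1, 250.00);

-- DCL: manage access permissions
GRANT SELECT ON TABLE sales_backup TO ROLE analyst_role;
```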

Prerequisites

Valid accounts and access permissions to connect to the following:

  • Source: AWS S3, Azure Cloud Storage, or Google Cloud Storage

  • Target: Snowflake, Redshift, Azure Synapse, Databricks Lakehouse Platform, or BigQuery

If you want to use the COPY INTO command to load data into the target database, you must pass (expose) these account credentials inside the SQL statement. Hence, we recommend that you consider using the ELT Load Snap as an alternative.

Limitations

  • This Snap does not support multi-statement transaction rollback for any of the DDL, DCL, or DML statements specified.

  • Each statement is auto-committed upon successful execution. In the event of a failure, the Snap can roll back only the updates corresponding to the failed statement. All previous statements (within that Pipeline execution) that ran successfully are not rolled back.

  • You cannot run Data Query Language (DQL) queries using this Snap, such as SELECT and WITH query constructs.

  • Use this Snap either at the beginning or at the end of the Pipeline.

  • This Snap executes the SQL query only during Pipeline Execution. It does NOT perform any action (including showing a preview) during Pipeline validation.

  • The ELT Snap Pack does not support the Legacy SQL dialect of Google BigQuery. We recommend that you use only BigQuery's Standard SQL dialect in this Snap.
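To illustrate the dialect difference: Legacy SQL references tables with square brackets and a colon-separated project qualifier, while Standard SQL uses backtick-quoted, dot-separated names (the project, dataset, and table names below are hypothetical):

```sql
-- Legacy SQL (not supported by the ELT Snap Pack):
-- SELECT name FROM [myproject:mydataset.mytable];

-- Standard SQL equivalent (use this form in Snaps that accept queries):
SELECT name FROM `myproject.mydataset.mytable`;
```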

Known Issues

  • ELT Pipelines created before the 4.24 GA release using one or more of the ELT Insert Select, ELT Merge Into, ELT Load, and ELT Execute Snaps may fail to show the expected preview data due to a common change made across the Snap Pack for the current release (4.26 GA). In this scenario, replace the Snap in your Pipeline with the same Snap from the Asset Palette and configure the Snap's Settings again.

  • The Snap's preview data (during validation) contains a value with higher precision than the actual floating-point value (float data type) stored in the Delta table, for example, 24.123404659344 instead of 24.1234. However, the Snap reflects the exact values during Pipeline executions.

  • In any of the supported target databases, this Snap neither correctly identifies nor renders column references beginning with an _ (underscore) inside SQL queries that use the following constructs and contexts (the Snap works as expected in all other scenarios):

    • WHERE clause

    • WHEN clause

    • ON condition (ELT Join, ELT Merge Into Snaps)

    • HAVING clause

    • QUALIFY clause

    • Insert expressions (column names and values in ELT Insert Select, ELT Load, and ELT Merge Into Snaps)

    • Update expressions list (column names and values in ELT Merge Into Snap)

    • Secondary AND condition

    • Inside SQL query editor (ELT Select and ELT Execute Snaps)

  • As a workaround when using these SQL query constructs, you can:

    • Precede this Snap with an ELT Transform Snap to re-map the '_' column references to suitable column names (that do not begin with an _ ) and reference the new column names in the next Snap, as needed.
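Conceptually, the re-mapping that the ELT Transform Snap performs is equivalent to aliasing the underscore-prefixed columns, as in this sketch (the column and table names are hypothetical):

```sql
SELECT "_order_id" AS order_id,
       "_amount"   AS amount
FROM   sales_raw;
```

Downstream Snaps can then reference order_id and amount without hitting this issue.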

  • In the case of the Databricks Lakehouse Platform, where CSV files do not have a header (column names), a simple query such as SELECT * FROM CSV.`/mnt/csv1.csv` returns default column names such as _c0, _c1, and _c2, which this Snap cannot interpret.

    • To avoid this scenario, you can:

      • Write the data in the CSV file to a DLP table beforehand, as in: CREATE TABLE csvdatatable (a1 int, b1 int,…) USING CSV `/mnt/csv1.csv`, where a1, b1, and so on are the new column names.

      • Then, read the data from this new table (with column names a1, b1, and so on) using a simple SELECT statement.
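Putting the two steps together, and mirroring the CREATE TABLE form shown above (the column names and types are illustrative):

```sql
-- Step 1: load the headerless CSV data into a DLP table with explicit column names
CREATE TABLE csvdatatable (a1 INT, b1 INT) USING CSV `/mnt/csv1.csv`;

-- Step 2: read the data back using the new column names
SELECT a1, b1 FROM csvdatatable;
```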

Snap Input and Output

View

Type of View

Number of Views

Examples of Upstream and Downstream Snaps

Description

Input 

Document

  • Min: 0

  • Max: 1

  • ELT Select

  • ELT Insert Select

  • ELT Filter

An upstream Snap is not mandatory. Use the input view to connect the Snap as the terminating Snap in the Pipeline.

Output

Document

  • Min: 0

  • Max: 1

  • ELT Select

  • ELT Transform

A downstream Snap is not mandatory. Use the output view to connect the Snap as the first Snap in the Pipeline.

Snap Settings

Parameter

Data Type

Description

Default Value

Example 

Label

String

Required. The name for the Snap. You can modify this to be more specific, especially if you have more than one of the same Snap in your Pipeline.

N/A

ELT Execute for SF

SQL Statements

Required. Use this field set to define your SQL statements, one in each row. Click to add a new row. You can add as many SQL statements as you need.

SQL Statement Editor

String/Expression

Required. Enter the SQL statement to run. The SQL statement must follow the syntax stipulated by the target database—Snowflake, Redshift, Azure Synapse, Databricks Lakehouse Platform, or BigQuery.

N/A

drop table base_01_oldcodes;

Troubleshooting

Error

Reason

Resolution

Failure: DQL statements are not allowed.

The ELT Execute Snap does not support Data Query Language (DQL); hence, statements containing SELECT and WITH are not allowed.

Remove any DQL statements (containing SELECT, WITH) and enter one of the following statement types:

  • Data Definition Language (DDL): CREATE, ALTER, DROP, TRUNCATE, RENAME, and so on.

  • Data Control Language (DCL): GRANT, REVOKE

  • Data Manipulation Language (DML): INSERT, UPDATE, DELETE, MERGE, CALL, and so on.
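For example, if you need the result set of a SELECT, you can wrap it in a DML statement that this Snap does allow (the table names here are hypothetical):

```sql
-- Not allowed in ELT Execute (DQL):
-- SELECT * FROM source_table;

-- Allowed (DML): materialize the result into a table instead
INSERT INTO result_table SELECT * FROM source_table;
```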

Examples

Sample Queries for the ELT Execute Snap

```sql
CREATE OR REPLACE WAREHOUSE me_wh WITH warehouse_size='X-LARGE';
CREATE OR REPLACE TABLE "TEST_DATA".NEW_DATA(VARCHAR_DATA VARCHAR(10));
CREATE OR REPLACE TABLE "TEST_DATA".DT_EXECUTE_01(VARCHAR_DATA VARCHAR(100),TIME_DATA TIME,FLOAT_DATA FLOAT,BOOLEAN_DATA BOOLEAN,NUMBER_DATA NUMBER(38,0),DATE_DATA DATE);
TRUNCATE TABLE IF EXISTS "public".simple_data_02;
INSERT OVERWRITE INTO "BIGDATAQA"."TEST_DATA"."OUT_ELT_EXECUTE_SF_003" SELECT * FROM ( SELECT * FROM "BIGDATAQA"."TEST_DATA"."DT_EXECUTE_03" )
```

Updating a Target Table Based on Updates to Another Table

The following Pipeline periodically updates a target (backup) table, OUT_ELT_EXECUTE_SF_003, based on updates to another (source) table, DT_EXECUTE_03. These tables reside in a Snowflake database, and we use data views from this database to present the updates that the Pipeline makes in the target table.

There are two steps to achieve this functionality using the ELT Execute Snap:

  1. Create a Pipeline with only the ELT Execute Snap for performing the periodic update.

  2. Create a Scheduled Task from this Pipeline to trigger a job at specific times of the day, as needed. See Tasks Page for information on creating Scheduled Tasks from Pipelines.

    • This task regularly looks into the DT_EXECUTE_03 table for updates and inserts the latest data from this table into the target (backup) table.

Before we create the Pipeline:

Source Table: DT_EXECUTE_03 

Target Table: OUT_ELT_EXECUTE_SF_003

We configure the ELT Execute Snap to run a DML query, as follows. 
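The DML statement configured here follows the INSERT OVERWRITE pattern from the sample queries above, copying all rows from the source table into the backup table:

```sql
INSERT OVERWRITE INTO "BIGDATAQA"."TEST_DATA"."OUT_ELT_EXECUTE_SF_003"
SELECT * FROM (
    SELECT * FROM "BIGDATAQA"."TEST_DATA"."DT_EXECUTE_03"
);
```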

Once we create the Scheduled Task (after saving this Pipeline), the task runs as scheduled. Then, the ELT Execute Snap copies the data from the source table and inserts it into the target (backup) table.

After the Scheduled Task/Pipeline is run:

Target Table: OUT_ELT_EXECUTE_SF_003

Download this Pipeline.

Example 2: Using one ELT Execute Snap to Create and Fill a Table

In this example Pipeline, we create a new table in the Redshift database and fill data into this table using an ELT Execute Snap. We later read the data from this table using an ELT Select Snap.

Let us observe the configuration of the ELT Execute Snap (first Snap in the above Pipeline).

ELT Execute Snap

 

Snap Output

 

We have added two SQL statements to the SQL Statements field set—one to create/overwrite a table and another to insert a row into the same table. The ELT Execute Snap does not display a data preview, except for a placeholder SQL statement that indicates the Snap validated successfully. The Snap executes the SQL queries in real time when we run the Pipeline.
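The two statements in the SQL Statements field set could look like the following sketch (the column definitions and values are illustrative; the table name matches the one the downstream ELT Select Snap reads):

```sql
CREATE OR REPLACE TABLE dev.public.new_table_trg (id INT, name VARCHAR(50));
INSERT INTO dev.public.new_table_trg (id, name) VALUES (1, 'first_row');
```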

We connect an ELT Select Snap to the ELT Execute Snap to read the data from the newly created table in the Redshift database. In this Snap:

  • We use the same ELT Database account that we use for the previous Snap.

  • Define/select the values for the database, schema, and table name to identify the table that the previous Snap is configured to create.

    • Alternatively, we can enable the SQL query editor and include the select * from dev.public.new_table_trg; statement. 

    • It is also important here to note that we cannot run this DQL query using the ELT Execute Snap.

ELT Select Snap

Snap Output

Download this Pipeline.

Downloads

  1. Download and import the Pipeline into SnapLogic.

  2. Configure Snap accounts as applicable.

  3. Provide Pipeline parameters as applicable.

Snap Pack History

Release

Snap Pack Version 

Date

Type

Updates

4.26-Patch

426patches12021

Sep 30, 2021

Latest

  • Fixed an issue where the ELT Load Snap connecting to a Databricks Lakehouse Platform (DLP) instance failed to perform the load operation. Ensure that you provide a valid DBFS Folder path in the Snap's account settings as the Snap requires this folder path.

4.26-Patch

426patches11646

Sep 22, 2021

Latest

  • Enhanced the ELT Database Account to support token-based authentication (Source Location Session Credentials) to S3 locations for Snowflake and Redshift target databases.

  • Enhanced the ELT Aggregate Snap with the following changes:

    • Revised the field labels from:

      • GROUP BY Fields List field set > Output Field to GROUP BY Field.

      • ORDER-By Fields to ORDER-BY Fields (Aggregate Concatenation Functions Only).

    • Removed the Suggestion option for Field Name field under General Aggregate Functions List field.

    • Made the Alias Name fields in the Aggregate Concatenation Functions List and the Percentile Distribution Functions List field sets mandatory.

  • If your target database is a Databricks Lakehouse Platform (DLP) instance, then the ELT Load Snap supports loading data from source CSV files that contain only comma as the separator between values.

4.26-Patch

426patches11323

Aug 17, 2021

Latest

  • Enhanced the ELT Database Account to allow parameterization of field values using Pipeline Parameters. You can define and use these parameters in expression-enabled fields to pass values during runtime.

4.26-Patch

426patches11262

Aug 16, 2021

Latest

  • Fixed the following Known Issues recorded in the 4.26 GA version:

    • For a Snowflake target instance, the ELT Insert Select Snap does not suggest column names to select for the Insert Column field in the Insert Expression List.

    • The Snaps—ELT Merge Into, ELT Select, ELT Join, and ELT Filter—do not prevent the risk of SQL injection when your target database is Databricks Lakehouse Platform (DLP).

    • Intermittent null-pointer exceptions in the ELT Load Snap on Databricks Lakehouse Platform (DLP).

    • The ELT Insert Select Snap attempts to create the target table even when it exists in the Snowflake database.

    • When loading data from a JSON file into a target Databricks Lakehouse Platform (DLP) instance using an ELT Load Snap, if you choose the Drop and Create Table option as the Load Action and specify an additional column (that is not available in the JSON file) for the new table, it results in an additional column named null being added to the new target table.

    • When you use the SQL editor in the ELT Select Snap configuration to define your SQL query, the Pipeline validation fails due to a syntax error in the following scenarios. However, the Pipeline execution works as expected. The only workaround is to drop the LIMIT clause and the optional OFFSET clause from the SQL query during Pipeline validation.

      • The query contains a LIMIT clause on a Snowflake, Redshift or Databricks Lakehouse Platform target instance: The SQL query created during Pipeline validation includes an additional LIMIT clause, for example: SELECT * FROM "STORE_DATA"."ORDERS" LIMIT 10 LIMIT 990

      • The query contains an OFFSET clause (supported in case of Snowflake and Redshift): The SQL query created during Pipeline validation looks like SELECT * FROM "STORE_DATA"."ORDERS" LIMIT 10 OFFSET 4 LIMIT 990

4.26

main11181

 

Stable

  • Enhanced the ELT Snap preview to support the following Snowflake data types: array, object, variant, and timestamp.

    • The Snaps convert the values to hexadecimal (HEX) equivalents—the default setting for the session parameter BINARY_OUTPUT_FORMAT in Snowflake. See Session Parameters for Binary Values for more information.

    • If this setting is different from hexadecimal (such as base64) in the Snowflake table, the Snaps still convert the values to hexadecimal equivalents for rendering them in the Snap preview.

  • Enhanced all ELT Snaps to display the Get preview data checkbox below the Snap's Label field.

  • The ELT Database account is now mandatory for all Snaps in the ELT Snap Pack.
    Breaking Change: Starting with the 4.26 release, all Snaps in the ELT Snap Pack (except the ELT Copy Snap) require an account to connect to the respective target database. Your existing Pipelines that do not use an account may fail. We recommend you to associate an ELT Database Account to each of the ELT Snaps (except ELT Copy Snap) for your Pipelines.

  • Enhanced the ELT Aggregate Snap to support Linear Regression functions on Redshift and Azure Synapse. The Snap also supports these functions on Databricks Lakehouse Platform.

  • Enhanced the ELT Execute Snap to enable running multiple DML, DDL, and DCL SQL statements from the same Snap instance.

  • Enhanced the ELT Join Snap to:

    • Support LEFT ANTI JOIN and LEFT SEMI JOIN types on all supported databases.

    • Display or hide the Resultant Column Names Prefix Type field based on the target database selected in the Snap's account.

  • Enhanced the ELT Load and ELT SCD2 Snaps to provide a list of suggested data types, while adding columns to or creating a table.

4.25-Patch

425patches10017

 

Latest

  • Updated the ELT SCD2 Snap to replace End date of historical row option in the Meaning field of Target Table SCD2 Fields field set with End Date of Current Row. See Note 1: Breaking change below for a breaking change caused by this update.
    Breaking Change: This may cause the existing Pipelines to fail as the end date of historical row option no longer exists. You need to make the following update in the ELT SCD2 Snap's settings across your Pipelines after upgrading your Snap Pack to this patch version:

    • Select End Date of Current Row from the Meaning drop-down list in the corresponding entry.

  • Fixed the issue with the ELT Insert Select Snap containing an open output preview that fails to retrieve output preview data in case of Redshift and Azure Synapse databases, though the Pipeline runs work as expected.

  • Fixed an issue where the ELT Execute Snap does not error out (Snap turns Green) even when running an SQL query to drop a non-existent table from a Snowflake or Azure Synapse database.

  • [Update on Jul 22, 2021]: Enhanced the ELT Snap previews to support the following data types: array, object, variant, and timestamp.

    • The Snaps convert the values to hexadecimal (HEX) equivalents—the default setting for the session parameter BINARY_OUTPUT_FORMAT in Snowflake. See Session Parameters for Binary Values for more information.

    • If this setting is different from hexadecimal (such as base64) in the Snowflake table, the Snaps still convert the values to hexadecimal equivalents for rendering them in the Snap previews.

4.25-Patch

425patches9725

 

Latest

  • Enhanced the ELT Snap preview to display the exact binary and varbinary values from Snowflake database during Pipeline validation, by converting the values to hexadecimal equivalents—the default setting in Snowflake. If the setting is different from hexadecimal in the Snowflake table, then the Snaps still convert the values to hexadecimal for rendering the Snap preview.

  • Enhanced the ELT Transform Snap to display the appropriate data type (binary or varbinary) for the column names populated in the output schema.

  • Enhanced the ELT Window Functions Snap to address potential issues due to an incorrect definition for MINUS function in case of Redshift and Azure Synapse databases.

4.25

main9554

 

Stable

  • Starting with the 4.25 release, SnapLogic has now certified the ELT Snap Pack to work with Snowflake hosted on Google Cloud Platform (GCP) as the target database, in addition to the other flavors of Snowflake hosted on AWS and Microsoft Azure.

  • Introduced the ELT Execute Snap to enable you to run DML, DDL, and DCL SQL queries in Snowflake, Redshift, and Azure Synapse.

  • Introduced the ELT SCD2 Snap to support Type 2 Slowly Changing Dimensions (SCD2) updates to the target databases—Snowflake, Redshift, and Azure Synapse.

  • Enhanced the ELT Database Account to introduce:

    • Support for Google Cloud Storage as a storage location (source) in addition to AWS S3 and Azure Data Lake Storage (ADLS) when your target database is Snowflake.

    • Automatic download of the JDBC driver required for the selected Database Type using the new Download JDBC Driver Automatically check box.

  • Enhanced the ELT Load Snap to prevent changes to existing tables during Pipeline validation. If you set the Load Action as Drop and Create table, and the target table does not exist, the Snap creates a new (empty) target table based on the schema specified in its settings.

  • Enhanced the ELT Window Functions Snap to support Covariance, Correlation, and Linear Regression Functions on Snowflake, Redshift, and Azure Synapse databases. The Snap uses function-specific query re-writes to support these functions on Redshift and Azure Synapse databases.

  • Enhanced the ELT Merge Into and ELT Insert Select Snaps to support up to one output view, and added the Get Preview Data check box to these Snaps. You can now connect downstream ELT Snaps to these Snaps.

4.24

424patches8793

 

Latest

  • Fixes the issue of production job failures due to the ELT Insert Select Snap after upgrading to 4.24 GA, by updating the ELT Transform Snap to continue allowing duplication of fields in the Expression list so that the Pipeline completes successfully.

No changes are needed to your existing Pipelines.

  • Fixes the column name collision issue in the Snap's output when the two tables being joined have columns with the same/identical names. You can specify the extent of prefix (that is, to prefix all columns, only duplicate columns, or no prefix) using the Resultant Column Names Prefix Type drop-down list. Based on the prefix you choose, a table alias name is prefixed to the identical columns in the output.

The behavior of ELT Load Snap for Load Action during Pipeline validation across the supported databases is as follows:

  • Append rows to existing table: Does not append the data from the source files into the target table.

  • Overwrite existing table: Does not overwrite the data.

  • Drop and Create table: Does not drop the target table even if it exists, but the Snap creates a new target table if a table does not exist.

  • Alter table: Does not modify the schema of the target table.

4.24

main8556

 

Stable

Adds support for the Azure Synapse database. You can now use the ELT Snap Pack to transform tables in the Snowflake, Redshift, and Azure Synapse databases.

Updates the Snap Pack with the following features:

  • ELT Database Account: Enhances the ELT Database Account to support the Azure Synapse database.

  • ELT Aggregate: Enhances the Snap to:

    • Support Azure Synapse's T-SQL aggregate functions and the aggregate functions in Snowflake and Redshift databases.

      • General Aggregate Function COUNT_IF in Snowflake.

      • General Aggregate Functions in Snowflake.

      • Linear Regression Aggregate Functions in Snowflake.

      • Aggregate Concatenation Functions in Snowflake, Redshift, and Azure Synapse.

      • Percentile Distribution Functions in Snowflake and Redshift.

    • Suggest appropriate column names to select from, in the Snap fields. This applies to Snowflake, Redshift, and Azure Synapse databases.

  • ELT Insert Select: Enhances the Snap to:

    • Suggest appropriate column names to select from, in the Snap fields.

    • Create Hash-distributed tables using the Target Table Hash Distribution Column (Azure Synapse Only) field when the Load Action is Drop and Create table and a condition such as WHEN NOT MATCHED BY TARGET is used.

  • ELT Join: Enhances the Snap to support Natural JOINS (NATURAL INNER JOIN, NATURAL LEFT OUTER JOIN, NATURAL RIGHT OUTER JOIN, and NATURAL FULL OUTER JOIN) in addition to the INNER, LEFT OUTER, RIGHT OUTER, FULL OUTER, and CROSS Joins in Azure Synapse Database. This enhancement also makes account configuration mandatory when using this Snap.

    • Fixes the column name collision issue in the Snap's result set when the two tables being joined have columns with identical names. You can specify the extent of the prefix using the Resultant Column Names Prefix Type drop-down list. Based on the prefix type you choose, a table alias name is prefixed to identical columns in the output.

  • ELT Load: Enhances the Snap to:

    • Support the File Name Pattern option using Key Based Mechanism for Redshift database. 

    • Suggest appropriate column names to select from, in the Snap fields. This applies to Snowflake, Redshift, and Azure Synapse databases.

    • Create Hash-distributed tables using the Target Table Hash Distribution Column (Azure Synapse Only) field when the Load Action is selected as Drop and Create table.

  • ELT Merge Into: Enhances the Snap to:

    • Suggest appropriate column names to select from, in the Snap fields. This applies to Snowflake, Redshift, and Azure Synapse databases.

    • Include the Target Table Hash Distribution Column (Azure Synapse Only) field for the Snap to create hash-distributed tables always.

    • Include the Update Expression List - When Not Matched By Source field set to allow defining one or more Update Expressions for the WHEN clause - WHEN NOT MATCHED BY SOURCE. This applies to Azure Synapse database.

    • Include the Target Table Alias field to specify the alias name required for the target table. The Snap is also equipped with the ability to auto-replace the actual table names (with the alias name), if any, used in the ON clause condition, secondary AND conditions, Update Expression list, or Insert Expression list. This applies to Snowflake, Redshift, and Azure Synapse databases.

  • ELT Transform: Enhances the Snap to:

    • Display input schema and output schema based on the upstream and downstream Snaps connected to this Snap.

    • Delete fields mentioned in the Expression field from the Snap's output when the mappings have an empty Target Path.

  • ELT Window Functions: Enhances the Snap to support the following Window Functions in addition to the existing ones:

    • Value Based Analytic Functions

    • LEAD and LAG Analytic Functions

  • Fixes the issue of displaying generic error messages for Triggered Task failures with ELT Pipelines by displaying detailed error messages for ease in debugging.

4.23

main7430

 

Stable

Introduces the following Snaps:

  • ELT Load: Loads data from AWS S3 buckets and Azure clusters into the Snowflake and Redshift tables.

  • ELT Sample: Generates a data subset from the source table. 

  • ELT Pivot: Converts row data into column data.

  • ELT Unpivot: Converts column data into row data.

  • ELT Window Functions: Provides support for SQL Window Functions in ELT Pipelines.

4.22

main6403

 

Stable

Introduces the ELT Snap Pack that provides you with the Extract, Load, and Transform (ELT) capabilities. Use the following Snaps to build SQL queries that are executed in the Snowflake database:

  • ELT Aggregate: Builds SQL query to perform aggregate functions such as SUM, COUNT, MIN, and MAX. Also offers the GROUP BY functionality.

  • ELT Copy: Creates copies of the input SQL query. 

  • ELT Filter: Adds a WHERE clause in the input SQL query. Use this capability to create filters/conditions for your data set. 

  • ELT Insert Select: Performs the INSERT INTO SELECT operation on the specified table. 

  • ELT Intersect: Adds an INTERSECT SQL operator in the input queries.

  • ELT Join: Builds SQL query with a JOIN clause.

  • ELT Limit: Adds a LIMIT clause in the incoming SQL query.

  • ELT Merge Into: Performs the MERGE INTO operation on the specified table.

  • ELT Minus: Adds a MINUS SQL operator in the input queries.

  • ELT Select: Builds an SQL SELECT query and provides a built-in SQL query editor that enables you to construct complex queries.

  • ELT Sort: Adds the ORDER BY keyword in the input query. 

  • ELT Transform: Builds transformation-based SQL queries for the specified table.

  • ELT Union: Adds a UNION ALL or UNION DISTINCT operator in the input queries.

  • ELT Unique: Builds a SELECT DISTINCT SQL query. 

Note: Breaking change

This may cause the existing Pipelines to fail as the end date of historical row option no longer exists. You need to make the following update in the ELT SCD2 Snap's settings across your Pipelines after upgrading your Snap Pack to 425patches10017:

  • Select End Date of Current Row from the Meaning drop-down list in the corresponding entry.


See Also