Configuring ELT Database Accounts

Overview

You must create an ELT Database account to connect the ELT Snaps in your Pipelines with the source and target CDWs (databases). This account enables ELT Snaps to write to and transform data in target databases hosted in the following cloud locations. The JDBC URL you define for your target database indicates the cloud location where the database is hosted.

Target Database | Supported Cloud Location | Cloud Location in JDBC URL
Snowflake | AWS | jdbc:snowflake://<account_name>.snowflakecomputing.com
Snowflake | Microsoft Azure | jdbc:snowflake://<account_name>.<region>.azure.snowflakecomputing.com
Snowflake | Google Cloud Platform (GCP) | jdbc:snowflake://<account_name>.<region>.gcp.snowflakecomputing.com
Redshift | AWS | jdbc:redshift://<redshift-cluster-name>.<region>.redshift.amazonaws.com
Azure Synapse | Microsoft Azure | jdbc:sqlserver://<yourserver>.database.windows.net
Databricks Lakehouse Platform (DLP) | AWS | jdbc:spark://<your_instance_code>.cloud.databricks.com
Databricks Lakehouse Platform (DLP) | Microsoft Azure | jdbc:spark://<your_instance_code>.azuredatabricks.net
Google BigQuery | Google Cloud Platform (GCP) | jdbc:bigquery://<host_URL>/bigquery
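The URL patterns above can be expressed as simple string templates. The following sketch composes a few of them in Python; all account names, regions, and cluster names are hypothetical placeholders, and this is illustrative only, not part of the SnapLogic product:

```python
# Sketch: compose JDBC URLs following the patterns in the table above.
# All identifiers passed in are hypothetical placeholders.

def snowflake_url(account, region=None, cloud=None):
    """AWS-hosted Snowflake omits the region/cloud segments;
    Azure- and GCP-hosted Snowflake include them."""
    if cloud in ("azure", "gcp"):
        return f"jdbc:snowflake://{account}.{region}.{cloud}.snowflakecomputing.com"
    return f"jdbc:snowflake://{account}.snowflakecomputing.com"

def redshift_url(cluster, region):
    return f"jdbc:redshift://{cluster}.{region}.redshift.amazonaws.com"

def synapse_url(server):
    return f"jdbc:sqlserver://{server}.database.windows.net"
```

For example, `snowflake_url("acme", "east-us-2", "azure")` yields `jdbc:snowflake://acme.east-us-2.azure.snowflakecomputing.com`.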

The different settings needed to access your target database depend on the type of your target database. Choose your database type from the following list for more information on configuring your Snap account.

The ELT Snap Pack does not support mixed accounts from different types of databases in the same Pipeline. For example, a Pipeline in which some Snaps are connecting to the Snowflake target database cannot have other Snaps connecting to the Redshift target database.
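This single-database-type constraint can be checked mechanically before running a Pipeline. A minimal sketch, assuming a Pipeline is represented as a list of (snap name, database type) pairs; this representation is illustrative, not the SnapLogic API:

```python
# Sketch: verify that all Snap accounts in one pipeline target the same
# database type. The (snap_name, database_type) pair representation is
# an illustrative assumption, not a SnapLogic data structure.

def validate_pipeline_accounts(snap_accounts):
    """Raise ValueError if the Snaps reference more than one database type."""
    types = {db_type for _, db_type in snap_accounts}
    if len(types) > 1:
        raise ValueError(f"Mixed account types in one pipeline: {sorted(types)}")
```

For example, a Pipeline mixing a Snowflake-connected Snap with a Redshift-connected Snap would fail this check.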

Configuring ELT Accounts

Every Snap in the ELT Snap Pack, except the ELT Copy Snap, requires a configured ELT Database account. You can configure your ELT accounts in SnapLogic using either the Designer or Manager.

Using SnapLogic Designer

Drag any ELT Snap to the Canvas and click the Snap to open its settings. Click the Account tab. You can now either use an existing account or create a new one. Write-type Snaps, such as ELT Insert-Select and ELT Merge Into, require accounts to function, so the Account tab is displayed automatically when you place these Snaps on the Canvas.

Selecting an existing account

SnapLogic organizes and displays all accounts to which you have access, sorting them by account type and location. To select an existing account:

  1. Click the icon to view the accounts to which you have access, and select the account that you want to use.

  2. Save the account selection.

Creating an account

  1. Click Add Account in the Account Reference dialog.

  2. Select the Location in which you want to create the account, select the account type, and click Continue. The Add Account dialog associated with the account type appears.


  3. Enter the required account details. For detailed guidance on how to provide information associated with each account type, use the following links:

    Enter additional information on this account in the Notes field of the Info tab. Doing so helps you and other users understand the purpose of the account, especially if there are multiple accounts of the same type.

  4. Click Validate to verify the account, if the account type supports validation.

  5. Click Apply to complete configuring the ELT account.

Using SnapLogic Manager

Use Manager to create accounts without associating them immediately with Pipelines.

Accounts in SnapLogic are associated with projects. You can use accounts created in other projects only if you have at least Read access to them.

  1. In the left pane, browse to the project in which you want to create the account and click Create > Account > ELT, followed by the appropriate account type.

    The Create Account dialog associated with the selected account type appears. 

  2. Repeat the steps numbered 3 through 5 in the previous section.


Avoid updating account credentials while Pipelines using that account are executing. Doing so may lead to unexpected results, including locking your account.

Account Settings

Supported JDBC JAR Versions

You can configure your ELT Database account to automatically use the appropriate JDBC JAR file for connecting to your target database and performing the load and transform operations.

Supported CDW | Certified JDBC JAR File
Azure Synapse | mssql-jdbc-11.2.1.jre11.jar
BigQuery | SimbaJDBCDriverforGoogleBigQuery42_1.3.0.1001.zip
Databricks Lakehouse Platform (DLP) | databricks-jdbc-2.6.29.jar
Redshift | redshift-jdbc42-2.1.0.9.jar
Snowflake | snowflake-jdbc-3.13.33.jar
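For deployment tooling, the certified versions above can be captured as a simple lookup. A minimal sketch; the mapping just mirrors the table and is not a SnapLogic API:

```python
# Sketch: look up the certified JDBC JAR for a given CDW.
# The mapping mirrors the documentation table above.

CERTIFIED_JDBC_JARS = {
    "Azure Synapse": "mssql-jdbc-11.2.1.jre11.jar",
    "BigQuery": "SimbaJDBCDriverforGoogleBigQuery42_1.3.0.1001.zip",
    "Databricks Lakehouse Platform (DLP)": "databricks-jdbc-2.6.29.jar",
    "Redshift": "redshift-jdbc42-2.1.0.9.jar",
    "Snowflake": "snowflake-jdbc-3.13.33.jar",
}

def certified_jar(cdw):
    """Return the certified JAR file name, or raise for an unsupported CDW."""
    try:
        return CERTIFIED_JDBC_JARS[cdw]
    except KeyError:
        raise ValueError(f"Unsupported CDW: {cdw}") from None
```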

Using Alternate JDBC JAR File Versions

We recommend that you let the ELT Snaps use the listed JAR file versions. However, you may use a different JAR file version of your choice.

Snap Pack History


Release (Snap Pack Version, Type): Updates

November 2024 (439patches29443, Latest)
  Fixed an issue with the ELT Merge Into Snap where the Snap failed to perform the MERGE INTO operation on Redshift target tables with the error "There were no target table columns found." when the specified target database schema was not one of the default schemas ("$user", public). This issue did not exist before the May 2024 release.

November 2024 (main29029, Stable)
  Updated and certified against the current SnapLogic Platform release.

August 2024 (438patches28010, Latest)
  The ELT Insert-Select Snap no longer fails to execute SQL statements that contain multiple multiline comment character pairs (/* and */) and/or multiple quoted substrings. Quoted substrings refer to schema, database, table, or column identifiers, which are delimited to allow special characters.
  • We recommend that you upgrade your main27765 (August 2024 GA release) ELT Snap Pack to this latest version.

August 2024 (main27765, Stable)
  Upgraded the jOOQ library for the ELT Snap Pack from v3.9.1 to v3.17.x.

May 2024 (437patches27372, Latest)
  Enhanced the pipeline execution statistics of the ELT Insert-Select Snap to display in its output view and to allow downloading detailed stats as a JSON file that includes additional statistics (extraStats) on DML statement executions on the target Databricks Lakehouse Platform (DLP) table.