ELT Snowflake Account

Overview

You can use the ELT Database Account to connect to a target Snowflake database. The JDBC URL for your target Snowflake database indicates the cloud location where the database is hosted, as shown in the following table:

Target Database | Supported Cloud Location | Cloud Location in JDBC URL
Snowflake | AWS | jdbc:snowflake://<account_name>.snowflakecomputing.com
Snowflake | Microsoft Azure | jdbc:snowflake://<account_name>.<region>.azure.snowflakecomputing.com
Snowflake | Google Cloud Platform (GCP) | jdbc:snowflake://<account_name>.<region>.gcp.snowflakecomputing.com
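
If you assemble the JDBC URL programmatically, for example to pass it to the account as a Pipeline parameter, the three URL shapes in the table reduce to a simple string template. The following Java sketch is illustrative only; the helper name and the placeholder account and region values are assumptions, not part of the Snap Pack.

// Illustrative helper only: builds the Snowflake JDBC URL for each supported cloud location.
// The account and region values you pass in must come from your own Snowflake account details.
public class SnowflakeUrlBuilder {
    static String jdbcUrl(String accountName, String region, String cloud) {
        switch (cloud) {
            case "AWS":
                return "jdbc:snowflake://" + accountName + ".snowflakecomputing.com";
            case "Azure":
                return "jdbc:snowflake://" + accountName + "." + region + ".azure.snowflakecomputing.com";
            case "GCP":
                return "jdbc:snowflake://" + accountName + "." + region + ".gcp.snowflakecomputing.com";
            default:
                throw new IllegalArgumentException("Unsupported cloud location: " + cloud);
        }
    }

    public static void main(String[] args) {
        System.out.println(jdbcUrl("xy12345", null, "AWS"));
        System.out.println(jdbcUrl("xy12345", "east-us-2", "Azure"));
        System.out.println(jdbcUrl("xy12345", "us-central1", "GCP"));
    }
}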

The ELT Snap Pack does not support mixed accounts from different types of databases in the same Pipeline. For example, a Pipeline in which some Snaps are connecting to the Snowflake database cannot have other Snaps connecting to a Redshift database.

JDBC Driver Version Alert

Snowflake has announced end of support for its JDBC Driver versions lower than 3.12.17 as of January 31, 2023.

  • As SnapLogic currently uses version 3.13.33, no action is needed from your end if you allow the accounts, and hence the Snaps, to download this JDBC Driver automatically.

  • However, if you are manually uploading your JDBC Driver file for your Snap account, ensure that you upload JDBC Driver version 3.12.17 or above; otherwise, the Snap account may fail. A quick way to confirm the driver version is shown in the sketch below.
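
If you want to verify which driver version your uploaded JAR actually provides, a minimal check with the standard JDBC API could look like the following Java sketch. Note that java.sql.Driver exposes only the major and minor version numbers (for example, 3.13); the patch level is visible in the JAR file name. The account host below is a placeholder.

import java.sql.Driver;
import java.sql.DriverManager;

public class DriverVersionCheck {
    public static void main(String[] args) throws Exception {
        // The Snowflake driver self-registers when its JAR is on the classpath.
        // "myaccount" is a placeholder; only the URL prefix matters for driver lookup.
        Driver driver = DriverManager.getDriver("jdbc:snowflake://myaccount.snowflakecomputing.com");
        System.out.println("Snowflake JDBC driver version: "
                + driver.getMajorVersion() + "." + driver.getMinorVersion());
    }
}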

Prerequisites

  • A valid Snowflake account.
  • Certified JDBC JAR File: snowflake-jdbc-3.13.33.jar

    Using Alternate JDBC JAR File Versions

    We recommend that you let the ELT Snaps use this JAR file version. However, you may use a different JAR file version of your choice.

  • If you are using the OAuth2 authentication type for accessing your target Snowflake instance, the redirect URL for your Storage Integration (OAUTH_REDIRECT_URI) must be …/api/1/rest/admin/oauth2callback/elt.

Limitations

None.

Known Issue

  • When you use the auto-fill feature in the Google Chrome browser to fill ELT Snap account credentials—such as user names, passwords, client secrets, auth codes and tokens, secret keys, and keystores—the accounts, and hence the Pipelines, fail. This is because the browser overwrites the field values with its own encrypted values that the SnapLogic Platform cannot read. SnapLogic recommends that you do not auto-save your Snap account credentials in the Chrome browser, delete any credentials that the browser has already saved for elastic.snaplogic.com, and then perform ONE of the following actions:
    • Option 1: Click the key icon that appears in the address bar after you submit your login credentials at elastic.snaplogic.com, and then click Never.
    • Option 2: Disable the Offer to save passwords option at chrome://settings/passwords while working with your SnapLogic Pipelines. If you disable this option, your Chrome browser will not remember your passwords on any other website either.

Account Settings

When you configure your Snap account, note the following:
  • Asterisk ( * ) indicates a mandatory field.

  • The Suggestion icon indicates a list that is dynamically populated based on the configuration.

  • The Expression icon indicates whether the value is an expression (if enabled) or a static value (if disabled). Learn more about Using Expressions in SnapLogic.

  • The Add icon indicates that you can add fields in the fieldset.

  • The Remove icon indicates that you can remove fields from the fieldset.

The following entries list each parameter, the field dependency that determines when it applies, and its description.

Label*

Field Dependency: None.

Required. Unique user-provided label for the account.

Default Value: N/A

Example: ELT SF AWS Account AD OFF

Account Properties*

Use this fieldset to configure the information required to establish a JDBC connection with the account.

This fieldset consists of the following fields:

  • Database Type
  • Download JDBC Driver Automatically
  • JDBC JAR(s) and/or ZIP(s)
  • JDBC Driver Class
  • JDBC URL
  • Username
  • Password
  • Default Database Name
Database Type*

Field Dependency: None.

Select the target data warehouse into which the queries must be loaded, that is, Snowflake. This activates the following fields:

  • Warehouse
  • Source Location

Default Value: N/A

Example: Snowflake

Download JDBC Driver Automatically

Field Dependency: None.

Select this checkbox to allow the Snap account to download the certified JDBC Driver for Snowflake. The following fields are disabled when you select this checkbox.

  • JDBC JAR(s) and/or ZIP(s)
  • JDBC driver class

To use a JDBC Driver of your choice, clear this checkbox, upload the required JAR files to SLDB, and choose them in the JDBC JAR(s) and/or ZIP(s): JDBC Driver field.

Use of Custom JDBC JAR version

You can use a JAR file version other than the recommended JAR file versions listed here.

Default Value: Not Selected

Example: Selected

JDBC JAR(s) and/or ZIP(s): JDBC Driver

Field Dependency: Required when the Download JDBC Driver Automatically checkbox is not selected.

Upload the JDBC driver and other JAR files that you want to use into SLDB. Click + to add a new row. Add each JDBC JAR file in a separate row. See Downloading the JDBC Driver for more information about JDBC drivers and to download the appropriate driver for your account.

Default Value: N/A

Example: snowflake-jdbc-3.13.33.jar

JDBC driver class*

Field Dependency: Required when the Download JDBC Driver Automatically checkbox is not selected.

Specify the driver class to use for your application. We recommend that you use net.snowflake.client.jdbc.SnowflakeDriver to suit your Snowflake database, as other classes and methods may change due to future enhancements.

Default Value: N/A

Example: net.snowflake.client.jdbc.SnowflakeDriver

JDBC URL*

Field Dependency: None.

Enter the JDBC driver connection string that you want to use, in the syntax provided below. See Snowflake's JDBC Connection Parameters for more information.

  • jdbc:snowflake://<full_account_name>.snowflakecomputing.com

Alternatively, you can use the Username, Password, and Default Database Name fields, along with the Advanced Properties > URL Properties fieldset, to provide the parameters required for building your JDBC URL. See Passing your JDBC URL for more information.

Avoid passing Password inside the JDBC URL

If you specify the password inside the JDBC URL, it is saved as-is and not encrypted. We recommend passing your password using the Password field provided, instead, to ensure that your password is encrypted.

Default Value: N/A

Example: jdbc:snowflake://xy12345.us-east-2.aws.snowflakecomputing.com, jdbc:snowflake://xy12345.east-us-2.azure.snowflakecomputing.com
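
To see how the JDBC URL works together with the Username, Password, Warehouse, and Default Database Name fields, here is a minimal Java sketch that opens a connection with the same information passed as Snowflake JDBC connection properties. All account, credential, and object names are placeholders; the Snap account handles this internally, so the sketch is only an illustration.

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class SnowflakeConnect {
    public static void main(String[] args) throws Exception {
        // Placeholder URL -- use the value from your own JDBC URL field.
        String url = "jdbc:snowflake://xy12345.us-east-2.aws.snowflakecomputing.com";

        Properties props = new Properties();
        props.put("user", "SFBigdata");          // corresponds to the Username field
        props.put("password", "<password>");     // corresponds to the Password field (kept out of the URL)
        props.put("warehouse", "MY_WAREHOUSE");  // corresponds to the Warehouse field
        props.put("db", "EMPLOYEEDB");           // corresponds to the Default Database Name field

        try (Connection connection = DriverManager.getConnection(url, props)) {
            System.out.println("Connected to Snowflake: " + !connection.isClosed());
        }
    }
}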

Authentication Types

Field Dependency: None.

Select one of the following modes of authentication that you want to use to access your Snowflake instance.

  • Basic: Activates the Username and Password fields.
  • OAuth2: Activates a series of fields needed for the OAuth2 type of authentication:
    • Client ID, Client secret, Access token, Refresh token, Access token expiration, OAuth2 Endpoint, OAuth2 Token, Auto-refresh token, Authorize, and Refresh.
    • See Snowflake OAuth User Guide to learn how to obtain values for the above fields.

      If you use the OAuth2 authentication type to access your target Snowflake instance, configure the OAUTH_REDIRECT_URI in the definition of your Storage Integration to a value that ends with /elt, as in …/api/1/rest/admin/oauth2callback/elt.

  • Key Pair: Activates Private Key and Encrypted Private Key Passphrase fields. For more information, see Key Pair Authentication & Key Pair Rotation — Snowflake Documentation.

Username*

Field Dependency: Authentication Types is Basic.

Enter the username provided for your database account.

Default value: N/A

Example: SFBigdata

Password*

Field Dependency: Authentication Types is Basic.

Enter the password corresponding to the username specified above.

Default value: N/A

Example: <Encrypted>

Private Key*

Field Dependency: Authentication Type is Key Pair.

Specify the key that you have generated for authentication. The key must include the generated header and footer.

The private key is read, decrypted if necessary, and included with properties to connect to Snowflake, where it is validated against its paired public key to authenticate the connection.

Learn more about Key Pair Authentication & Key Pair Rotation — Snowflake Documentation.

Default Value: N/A
Example:
-----BEGIN PRIVATE KEY-----
MIIE6TAbBgkqhkiG9w0BBQM
wDgQILYPyennlX2bd8yX8x
OxGSGfvB+99+PmSlex0FmY9ov1J8H1H9Y3lJXXb
-----END PRIVATE KEY-----

Encrypted Private Key Passphrase

Field Dependency: Authentication Type is Key Pair.

Specify the passphrase for the encrypted private key.

Default Value: N/A
Example: Nki62lmWfFO75Xg04W+Fnlt9ejiMn4zPbkf/me Aq1R3X7CM/ORLuLms4vU70NjzNYo09SOUXhQmoqpEkQBvrrYuzdaQt5hvwh7vZQh WZ5L8+lTRoAneFtmd6LZ94AssA==
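
For reference, the Snowflake JDBC driver accepts key pair credentials through connection parameters such as private_key_file and private_key_file_pwd (check your driver version's documentation). The Java sketch below, with placeholder paths and names, shows how the Private Key and Encrypted Private Key Passphrase values map onto a direct JDBC connection; it is an illustration, not the Snap's internal implementation.

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class KeyPairConnect {
    public static void main(String[] args) throws Exception {
        // Placeholder values -- replace with your own account, user, and key file path.
        String url = "jdbc:snowflake://xy12345.us-east-2.aws.snowflakecomputing.com";

        Properties props = new Properties();
        props.put("user", "SFBigdata");
        // PKCS#8 private key file, including its BEGIN/END header and footer.
        props.put("private_key_file", "/path/to/rsa_key.p8");
        // Passphrase for the encrypted key; omit this property if the key is not encrypted.
        props.put("private_key_file_pwd", "<passphrase>");

        try (Connection connection = DriverManager.getConnection(url, props)) {
            System.out.println("Authenticated with key pair: " + !connection.isClosed());
        }
    }
}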

Warehouse*

Field Dependency: Database Type is Snowflake.

Enter the name of the Snowflake warehouse that you want to use.

Default value: N/A

Example: 12345abcdefghikj1km2345no

Default Database Name*

Field Dependency: None.

Enter the name of the database to use by default. This database is used if you do not specify one in the ELT Select, ELT Insert-Select, or ELT Merge Into Snaps. 

Default value: N/A

Example: EMPLOYEEDB

Client ID

Field Dependency: Authentication Types is OAuth2.

Enter the client ID associated with your OAuth integration. You can create the client ID by creating an OAuth integration between your client and Snowflake.

Default value: N/A

Example: YmkCxrn03kty78shYstvFgObIrw=

Client secret

Field Dependency: Authentication Types is OAuth2.

Enter the client secret associated with the above client ID.

Default value: N/A

Example: <An encrypted value>

Access token

Field Dependency: Authentication Types is OAuth2.

Auto-generated upon authorization—when you click Authorize or Refresh or during an Auto-refresh action.

The access token associated with your OAuth integration is used to make API requests on behalf of the user associated with the client ID. This token is valid for 10 minutes. Select the Auto-refresh token checkbox to refresh the token automatically.

Default value: N/A

Example: <An encrypted value>

Refresh token

Field Dependency: Authentication Types is OAuth2.

Auto-generated upon authorization—when you click Authorize or Refresh or during an Auto-refresh action.

You need to specify the refresh token retrieval for the application when setting up the OAuth integration. When the OAuth2 flow is executed, the resulting refresh token is stored in this field.

If the refresh token is stored and is valid (within 90 days of its generation), then the access token can be refreshed automatically before it expires.

Default value: N/A

Example: <An encrypted value>

Access token expiration

Field Dependency: Authentication Types is OAuth2.

Auto-generated and displayed upon authorization—when you click Authorize or Refresh or during an Auto-refresh action.

The number of seconds remaining until the Access token expires.

OAuth2 Endpoint

Field Dependency: Authentication Types is OAuth2.

Enter the URL of your OAuth2 endpoint that the client must call to request authorization codes needed for generating Access and Refresh tokens.

Default value: N/A

Example: https://<account_full_name_from_SF>.snowflakecomputing.com/oauth/authorize

OAuth2 Token

Field Dependency: Authentication Types is OAuth2.

Enter the URL of your OAuth2 endpoint that the client must call to request Access and Refresh tokens.

Default value: N/A

Example: https://<account_full_name_from_SF>.snowflakecomputing.com/oauth/token-request
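
The OAuth2 Endpoint and OAuth2 Token URLs follow the standard OAuth2 authorization-code flow: the endpoint URL issues authorization codes, and the token URL exchanges codes or refresh tokens for access tokens. The Java sketch below shows a refresh_token request against the token URL; the account host, client credentials, and token values are placeholders, and the exact request format is governed by your Snowflake security integration, so treat this as an illustration only.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class RefreshAccessToken {
    public static void main(String[] args) throws Exception {
        // Placeholder values -- substitute the details from your Snowflake OAuth integration.
        String tokenUrl = "https://myaccount.snowflakecomputing.com/oauth/token-request";
        String clientId = "my-client-id";
        String clientSecret = "my-client-secret";
        String refreshToken = "my-refresh-token";

        // Client credentials go in an HTTP Basic authorization header;
        // the refresh_token grant goes in a form-encoded body.
        String basicAuth = Base64.getEncoder().encodeToString(
                (clientId + ":" + clientSecret).getBytes(StandardCharsets.UTF_8));
        String body = "grant_type=refresh_token&refresh_token="
                + java.net.URLEncoder.encode(refreshToken, StandardCharsets.UTF_8);

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(tokenUrl))
                .header("Authorization", "Basic " + basicAuth)
                .header("Content-Type", "application/x-www-form-urlencoded")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        // The JSON response carries the new access_token and its expires_in (seconds) value.
        System.out.println(response.body());
    }
}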

Auto-refresh token

Field Dependency: Authentication Types is OAuth2.

Select this checkbox to enable the Snap account to refresh the Access token before it expires, using the Refresh token, if available.


Default value: Not selected

Example: Selected

Authorize

Field Dependency: Authentication Types is OAuth2.

Click this button after providing your Client ID, Client secret, and the endpoint URLs above, to generate an Access token and a Refresh Token. These tokens are displayed in encrypted mode along with the actual value of Access token expiration.

Default value: N/A

Example: N/A

Refresh

Field Dependency: Authentication Types is OAuth2.

Click this button after providing your Client ID, Client secret, and the endpoint URLs above, to manually refresh the Access token when it is about to expire. The updated tokens are displayed in encrypted mode. The new Access token expiration value indicates that the Access token is updated.

Default value: N/A

Example: N/A

Source Location*

Field Dependency: None.

Select the source data warehouse from which to load data into the Snowflake target database. You must configure this field if you want to use the ELT Load Snap. 

Available options are:

  • None
  • S3. Activates the Authentication, S3 Bucket, and S3 Folder fields.
  • Azure. Activates the Authentication, Storage Account, Azure Container, and Azure Folder fields.
  • Google Cloud Storage. Activates the Authentication, Google Cloud Storage (GCS) Bucket, and Google Cloud Storage (GCS) Folder fields.

Default value: None

Example: S3, Azure, Google Cloud Storage

Authentication

Field Dependency: Source Location is S3, Azure, or Google Cloud Storage.

Select the authentication method to use for accessing the source data.

Available options are:

  • Source Location Credentials. Select this option when you do not have a storage integration set up in your S3 or Azure data storage. Activates the Access Key and Secret Key fields for S3, or the SAS Token field for Azure.
  • Source Location Session Credentials. Select this option if you have session credentials to access the source location in S3. Activates the Session Access Key, Session Secret Key, and Session Token fields.
  • Storage Integration. Select this option when you want to use a storage integration to access the selected source location. Activates the Storage Integration Name field.

Default value: Source Location Credentials for S3 and Azure, Storage Integration for Google Cloud Storage.

Example: Storage Integration

Storage Integration Name

Field Dependency: Source Location is S3, Azure, or Google Cloud Storage and Authentication is Storage Integration.

Enter the name of the storage integration in your Snowflake database. See the Snowflake documentation on storage integrations for more information on setting up your Storage Integration.

Default value: N/A

Example: my_s3_integration, my_azure_integration, my_sf_gcs_integration

Access Key

Field Dependency: Source Location is S3 and Authentication is Source Location Credentials.

Enter the access key ID associated with your AWS S3 account.

Default value: N/A

Example: ABCDEFGHIJKL1MNOPQRS

Secret Key

Field Dependency: Source Location is S3 and Authentication is Source Location Credentials.

Enter the secret access key associated with your AWS S3 account.

Default value: N/A

Example: aBcdeFGhiJKLM/N1OPQRS/tUvWxyZABCDEFGHIJKLMN

SAS Token

Field Dependency: Source Location is Azure and Authentication is Source Location Credentials.

Enter the SAS token part of the SAS URI associated with your Azure storage account. See Getting Started with SAS for details.

Default value: N/A

Example: ?sv=2020-08-05&st=2020-08-29T22%3A18%3A26Z&se=2020-08-30T02%3A23%3A26Z&sr=b&sp=rw&sip=198.1.2.60-198.1.2.70&spr=https&sig=A%1DEFGH1Ijk2Lm3noI3OlWTjEg2tYkboXr1P9ZUXDtkk%3D

Session Access Key

Field Dependency: Source Location is S3 and Authentication is Source Location Session Credentials.

Enter the access key ID associated with your AWS S3 account.

Default value: N/A

Example: ABCDEFGHIJKL1MNOPQRS

Session Secret Key

Field Dependency: Source Location is S3 and Authentication is Source Location Session Credentials.

Enter the secret access key associated with your AWS S3 account.

Default value: N/A

Example: aBcdeFGhiJKLM/N1OPQRS/tUvWxyZABCDEFGHIJKLMN

Session Token

Field Dependency: Source Location is S3 and Authentication is Source Location Session Credentials.

Enter the session token value when authorization to access the S3 folder is based on Security Token Service (STS) temporary credentials. Note that only global STS regions are supported.

Default value: N/A

Example: A%1DEFGH1Ijk2Lm3noI3OlWTjEg2tYkboXr1P9ZUXDtkk%3D

Azure External Location

Field Dependency: Source Location is Azure and Authentication is Source Location Credentials.

Specify the type of Azure external storage location (source)—an Azure Data Lake Gen2 or a Blob Storage—where your source data is available.

Default Value: Azure Data Lake Gen2

Example: Blob Storage

Storage Account

Field Dependency: Source Location is Azure and Authentication is Source Location Credentials.

Enter the name of your Azure storage account.

Default value: N/A

Example: employeedata

S3 Bucket

Field Dependency: Source Location is S3.

Enter the name of the bucket from which to load the data.

Default value: N/A

Example: employeedata

S3 Folder

Field Dependency: Source Location is S3.

Enter the name of the folder in the S3 bucket specified above where the source files are located.

Default value: N/A

Example: hrdepartment

Azure Container

Field Dependency: Source Location is Azure.

Enter the name of the container in the Azure storage account.

Default value: N/A

Example: hrdepartment

Azure Folder

Field Dependency: Source Location is Azure.

Enter the relative path for the folder within the container specified above. Leave this field blank to use all the folders in the specified container. 

Default value: N/A

Example: salaries

Google Cloud Storage (GCS) Bucket

Field Dependency: Source Location is Google Cloud Storage and Authentication is Storage Integration.

Enter the name of the GCS bucket from which to load the data to your Snowflake database.

Default value: N/A

Example: elt_gcs_bucket_1

Google Cloud Storage (GCS) Folder

Field Dependency: Source Location is Google Cloud Storage and Authentication is Storage Integration.

Enter the name of the folder in the GCS bucket where the source files are located.

Default value: N/A

Example: elt_gcs_bucket_1_CSV_Files

Advanced Properties

Other parameters that you want to specify as URL properties. See Snowflake's JDBC Connection Parameters for a list of parameters that can be specified in this fieldset.

For Snowflake, DO NOT configure parameters that are already present in the Account Properties fieldset. For example:

  • db=<name>
  • warehouse=<name>

This fieldset consists of the following fields:

  • URL Properties
    • URL Property Name
    • URL Property Value
URL Properties

Field Dependency: None.

The account parameter's name and its corresponding value. Click + to add more rows. Add each URL property-value pair in a separate row. Specify the name of the parameter in the URL Property Name field and its value in the URL Property Value field.

Default Value: N/A

Example: MAX_CONCURRENCY_LEVEL: 3
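
Snowflake JDBC connection parameters can also be expressed as name=value pairs appended to the JDBC URL as a query string, which is essentially what each URL Property Name and URL Property Value row represents. The Java sketch below is a hypothetical illustration of that pairing; the second property and the assembly logic are assumptions, not the Snap Pack's internal behavior.

import java.util.LinkedHashMap;
import java.util.Map;
import java.util.StringJoiner;

public class UrlProperties {
    public static void main(String[] args) {
        // Placeholder base URL from the JDBC URL field.
        String baseUrl = "jdbc:snowflake://xy12345.us-east-2.aws.snowflakecomputing.com";

        // Each entry mirrors one URL Property Name / URL Property Value row.
        Map<String, String> urlProperties = new LinkedHashMap<>();
        urlProperties.put("MAX_CONCURRENCY_LEVEL", "3");
        urlProperties.put("queryTimeout", "300"); // illustrative second property

        StringJoiner query = new StringJoiner("&", "/?", "");
        urlProperties.forEach((name, value) -> query.add(name + "=" + value));

        // Prints the base URL with the properties appended as query parameters.
        System.out.println(baseUrl + query);
    }
}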

Click Validate after entering the required details to ensure that all fields have been filled accurately. Click Apply to save the settings.

Account Validation when using Pipeline parameters

If you have used Pipeline parameters or expressions to define values for the account fields above, the account validation (done by clicking the Validate button) is not supported. However, the Snaps that use this account may connect to the endpoint successfully depending on the accuracy and validity of the expressions and parameters used.

Passing your JDBC URL

Order of Precedence

The parameter values for the URL used to connect to your target database are governed by the following order of precedence:

  • JDBC URL field
  • Snap Properties (including Advanced Properties)
  • Default values

Default Properties set internally

No properties are internally passed in the JDBC URL, by default.

Specific Scenarios

When setting default values for the Database Name, Username, and Password fields:

  • As a best practice, ensure that the Default Database Name provided in the Snap's properties and the database name in the JDBC URL field match.

  • Otherwise, make sure that both database names exist.

However, the username and password from the JDBC URL take precedence over the values provided in the respective fields.
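
The precedence rules above can be summarized as: a value embedded in the JDBC URL wins, then a value from the account fields, then any default. The resolve helper in the Java sketch below is purely hypothetical and only restates that ordering; it is not how the platform is implemented.

import java.util.HashMap;
import java.util.Map;

public class ParameterPrecedence {
    // Hypothetical helper: returns the first value found in JDBC URL parameters,
    // then Snap/account properties, then defaults -- mirroring the order above.
    static String resolve(String name,
                          Map<String, String> jdbcUrlParams,
                          Map<String, String> snapProperties,
                          Map<String, String> defaults) {
        if (jdbcUrlParams.containsKey(name)) {
            return jdbcUrlParams.get(name);
        }
        if (snapProperties.containsKey(name)) {
            return snapProperties.get(name);
        }
        return defaults.get(name);
    }

    public static void main(String[] args) {
        Map<String, String> urlParams = new HashMap<>();
        urlParams.put("user", "url_user");      // user=... embedded in the JDBC URL

        Map<String, String> snapProps = new HashMap<>();
        snapProps.put("user", "field_user");    // Username field
        snapProps.put("db", "EMPLOYEEDB");      // Default Database Name field

        Map<String, String> defaults = new HashMap<>(); // no internal defaults, per the note above

        System.out.println(resolve("user", urlParams, snapProps, defaults)); // prints url_user
        System.out.println(resolve("db", urlParams, snapProps, defaults));   // prints EMPLOYEEDB
    }
}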

Troubleshooting

None.

Example

Configuring the ELT Database Account for connecting to a Snowflake Database

The following Pipeline is designed to extract data from a Parquet file residing in the ADLS Gen2 storage and load it into a new table in the Snowflake database.

Snap Settings

The Pipeline uses an ELT Load Snap to define the source Parquet file in an S3 folder location and is configured to perform the following functions:

  • Create a new table out_sy_elt_load_02 in the specified target database (Snowflake).
  • Load the data from the Parquet file to this newly created target table based on the table columns specified.

Account Settings

The Pipeline accesses the Parquet file using the Source Location Credentials for authentication.


See Also