ELT DLP Account
Overview
You can use the ELT Database Account to connect ELT Snaps with a Databricks Lakehouse Platform (DLP) target instance. This account enables you to write transformed data to a target DLP instance hosted in an AWS or Microsoft Azure cloud location. The cloud location where the database is hosted is indicated in the JDBC URL for DLP: jdbc:spark://<your_instance_code>.cloud.databricks.com for AWS or jdbc:spark://<your_instance_code>.azuredatabricks.net for Microsoft Azure.
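For illustration, here is a minimal sketch (a hypothetical helper, not part of the Snap Pack) that maps the host suffix in a DLP JDBC URL to the cloud location described above:

```java
// Hypothetical helper: infer the hosting cloud from a DLP JDBC URL
// using the host suffixes noted above.
public final class DlpCloudFromUrl {
    static String cloudOf(String jdbcUrl) {
        if (jdbcUrl.contains(".cloud.databricks.com")) return "AWS";
        if (jdbcUrl.contains(".azuredatabricks.net")) return "Microsoft Azure";
        return "unknown";
    }

    public static void main(String[] args) {
        System.out.println(cloudOf("jdbc:spark://abc-123.cloud.databricks.com")); // AWS
        System.out.println(cloudOf("jdbc:spark://abc-123.azuredatabricks.net"));  // Microsoft Azure
    }
}
```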
The ELT Snap Pack does not support mixing accounts for different database types in the same Pipeline. For example, a Pipeline in which some Snaps connect to a DLP instance cannot have other Snaps that connect to a Snowflake database.
Prerequisites
- A valid DLP account.
- Certified JDBC JAR file: databricks-jdbc-2.6.29.jar
Using Alternate JDBC JAR File Versions
We recommend that you use this certified JAR file version with the ELT Snaps. However, you can use a different JAR file version of your choice.
Limitations
- With the basic authentication type for Databricks Lakehouse Platform (DLP) reaching its end of life on July 10, 2024, SnapLogic ELT Pipelines that use this authentication type to connect to DLP instances will fail. We recommend that you reconfigure the corresponding Snap accounts to use the Personal Access Token (PAT) authentication type.
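For context, the sketch below shows what a PAT-based JDBC connection to a DLP instance can look like outside SnapLogic. It assumes the legacy Simba Spark driver's token-authentication convention (AuthMech=3 with UID set to the literal string token); the host, HTTP path, and token are placeholders, and you should verify the exact parameters against your driver's documentation.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class DlpPatConnectionSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder values: substitute your workspace's host and HTTP path.
        String url = "jdbc:spark://<your_instance_code>.azuredatabricks.net:443/default"
                + ";transportMode=http;ssl=1"
                + ";httpPath=<your_http_path>"
                + ";AuthMech=3";                          // token-based authentication

        Properties props = new Properties();
        props.put("UID", "token");                        // literal string "token" for PAT auth
        props.put("PWD", "<your_personal_access_token>"); // your PAT, not a basic-auth password

        try (Connection conn = DriverManager.getConnection(url, props)) {
            System.out.println("Connected: " + !conn.isClosed());
        }
    }
}
```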
Known Issues
- When you use the auto-fill feature in the Google Chrome browser to fill in ELT Snap account credentials (such as user names, passwords, client secrets, auth codes and tokens, secret keys, and keystores), the browser overwrites the field values with its own encrypted values that the SnapLogic Platform cannot read, and the accounts, and hence the Pipelines, fail. SnapLogic recommends that you do not auto-save your Snap account credentials in the Chrome browser, that you delete any credentials the browser has already saved for elastic.snaplogic.com, and that you then perform ONE of the following actions:
- Option 1: Click the key icon that appears in the address bar after you submit your login credentials at elastic.snaplogic.com, and then click Never.
- Option 2: Disable the Offer to save passwords option at chrome://settings/passwords while working with your SnapLogic Pipelines. If you disable this option, your Chrome browser will not remember your passwords on any other website either.
- Due to an issue with DLP, aborting an ELT Pipeline validation (with preview data enabled) aborts only those SQL statements that retrieve data using bind parameters; all other static statements (that use literal values instead of bind parameters) continue to run. For example, `select * from a_table where id = 10` is not aborted, while `select * from a_table where id = ?` is aborted. To avoid this issue, ensure that you always configure your Snap settings to use bind parameters in their SQL queries (see the sketch below).
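To make the distinction concrete, here is a minimal Java sketch (illustrative only, not SnapLogic code) contrasting a static statement with one that uses a bind parameter; the table and connection are assumed to exist.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class BindParameterSketch {
    // Static SQL: the literal value 10 is baked into the statement text,
    // so DLP treats it as a static statement that persists when aborted.
    static void staticQuery(Connection conn) throws SQLException {
        try (Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("select * from a_table where id = 10")) {
            while (rs.next()) { /* process rows */ }
        }
    }

    // Bind parameter: the ? placeholder is bound at execution time,
    // so the statement can be aborted during Pipeline validation.
    static void boundQuery(Connection conn, int id) throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement("select * from a_table where id = ?")) {
            ps.setInt(1, id);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) { /* process rows */ }
            }
        }
    }
}
```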
Account Settings
Asterisk ( * ) indicates a mandatory field.
Suggestion icon indicates a list that is dynamically populated based on the configuration.
Expression icon indicates whether the value is an expression (if enabled) or a static value (if disabled). Learn more about Using Expressions in SnapLogic.
Add icon indicates that you can add fields in the field set.
Remove icon indicates that you can remove fields from the field set.
| Parameter | Field Dependency | Description |
| --- | --- | --- |
| Label* | None. | Required. Unique, user-provided label for the account.<br>Default Value: N/A<br>Example: ELT DLP Azure Account |
| Account Properties* | None. | Use this field set to configure the information required to establish a JDBC connection with the account. This field set consists of the fields described in the rows below. |
| Database Type* | None. | Select the target data warehouse into which the queries must be loaded, that is, Databricks Lakehouse Platform. Selecting this value activates the remaining fields in this field set.<br>Default Value: N/A<br>Example: Databricks Lakehouse Platform |
| Download JDBC Driver Automatically | None. | Select this checkbox to allow the Snap account to download the certified JDBC driver for DLP. The JDBC JAR(s) and/or ZIP(s): JDBC Driver and JDBC driver class fields are disabled when this checkbox is selected.<br>To use a JDBC driver of your choice, clear this checkbox, upload the required JAR files to SLDB, and choose them in the JDBC JAR(s) and/or ZIP(s): JDBC Driver field.<br>Use of custom JDBC JAR version: You can use a different JAR file version outside of the recommended JAR file version listed above.<br>Spark JDBC and Databricks JDBC: If you do not select this checkbox and use an older JDBC JAR file (older than version 2.6.25), ensure that you use the JDBC driver class com.simba.spark.jdbc.Driver and a JDBC URL that begins with jdbc:spark://.<br>Default Value: Not Selected<br>Example: Selected |
| JDBC JAR(s) and/or ZIP(s): JDBC Driver | Required when the Download JDBC Driver Automatically checkbox is not selected. | Upload the JDBC driver and other JAR files that you want to use into SLDB. Click + to add a new row, and add each JDBC JAR file in a separate row. See JDBC driver for more information about JDBC drivers and downloading the appropriate driver for your account. Use the latest version of the Databricks JDBC driver to avoid errors while adding properties.<br>Default Value: N/A<br>Example: SimbaSparkJDBC42-2.6.21.1021.jar |
| JDBC driver class* | Required when the Download JDBC Driver Automatically checkbox is not selected. | Specify the driver class to use for your application. We recommend that you use com.simba.spark.jdbc.Driver for DLP, as other classes and methods may change due to future enhancements.<br>Default Value: N/A<br>Example: com.simba.spark.jdbc.Driver |
| JDBC URL* | None. | Enter the JDBC driver connection string for your DLP instance: jdbc:spark://<your_instance_code>.cloud.databricks.com for AWS or jdbc:spark://<your_instance_code>.azuredatabricks.net for Microsoft Azure. See Microsoft's JDBC and ODBC drivers and configuration parameters for more information.<br>Spark JDBC and Databricks JDBC: If you do not select the Download JDBC Driver Automatically checkbox and use an older JDBC JAR file (older than version 2.6.25), ensure that you use the JDBC driver class com.simba.spark.jdbc.Driver and a JDBC URL that begins with jdbc:spark://.<br>Default Value: N/A<br>Example: jdbc:spark://<your_instance_code>.azuredatabricks.net |
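To see how the JDBC driver class and JDBC URL fields pair up across driver generations, here is a small illustrative sketch. The pairing for JAR files older than 2.6.25 comes from the notes above; the 2.6.25-and-later pairing (com.databricks.client.jdbc.Driver with a jdbc:databricks:// URL) is an assumption based on Databricks' newer driver naming, so verify it against your driver's documentation.

```java
// Illustrative helper (not part of the Snap Pack): choose the JDBC driver class
// and URL scheme based on the JDBC JAR file version in use.
public final class DlpDriverSelection {
    static String driverClassFor(String jarVersion) {
        return isAtLeast(jarVersion, 2, 6, 25)
                ? "com.databricks.client.jdbc.Driver" // assumed pairing for newer Databricks JARs
                : "com.simba.spark.jdbc.Driver";      // pairing stated above for older Simba JARs
    }

    static String urlSchemeFor(String jarVersion) {
        return isAtLeast(jarVersion, 2, 6, 25) ? "jdbc:databricks://" : "jdbc:spark://";
    }

    // Compare a dotted version string against a minimum version, component by component.
    private static boolean isAtLeast(String version, int... min) {
        String[] parts = version.split("\\.");
        for (int i = 0; i < min.length; i++) {
            int v = i < parts.length ? Integer.parseInt(parts[i]) : 0;
            if (v != min[i]) return v > min[i];
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(driverClassFor("2.6.29"));      // com.databricks.client.jdbc.Driver
        System.out.println(driverClassFor("2.6.21.1021")); // com.simba.spark.jdbc.Driver
        System.out.println(urlSchemeFor("2.6.21.1021"));   // jdbc:spark://
    }
}
```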