Databricks Account (Source: ADLS Gen2)


Overview

You can use this account type to connect Databricks Snaps with data sources that use a Databricks Account with Azure Data Lake Storage (ADLS) Gen2 as the source.

Prerequisites

  • A valid Databricks account.

  • Certified JDBC JAR File: databricks-jdbc-2.6.25-1.jar

Limitations and Known Issues

None.

Account Settings

  • Asterisk (*): Indicates a mandatory field.

  • Suggestion icon: Indicates a list that is dynamically populated based on the configuration.

  • Expression icon: Indicates whether the value is an expression (if enabled) or a static value (if disabled). Learn more about Using Expressions in SnapLogic.

  • Add icon: Indicates that you can add fields in the field set.

  • Remove icon: Indicates that you can remove fields from the field set.

Each field below is listed with its default value and an example, followed by the field type, the field dependency, and the description.

Label*

Default Value: N/A
Example: STD DB Acc DeltaLake AWS ALD

String

None.

Specify a unique label for the account.

Download JDBC Driver Automatically

Default Value: Not Selected

Example: Selected

Checkbox

None.

Select this checkbox to allow the Snap account to download the certified JDBC Driver for DLP. The following fields are disabled when this checkbox is selected:

  • JDBC JAR(s) and/or ZIP(s): JDBC Driver

  • JDBC driver class

To use a JDBC Driver of your choice, clear this checkbox, upload the required JAR files to SLDB, and select them in the JDBC JAR(s) and/or ZIP(s): JDBC Driver field.

Use of custom JDBC JAR version

You can use a JAR file version other than those in the recommended list.

Spark JDBC and Databricks JDBC

If you do not select this checkbox and use a JDBC JAR file older than version 2.6.25, ensure that you use the following (see the sketch after this list):

  • The old-format JDBC URL (jdbc:spark://) instead of the new one (jdbc:databricks://)

    • For a JDBC driver prior to version 2.6.25, the JDBC URL starts with jdbc:spark://

    • For a JDBC driver version 2.6.25 or later, the JDBC URL starts with jdbc:databricks://

  • The older JDBC Driver Class com.simba.spark.jdbc.Driver instead of the new com.databricks.client.jdbc.Driver.
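
For orientation, here is a minimal Java sketch of the two driver class and URL prefix pairings. The host and httpPath values are placeholders, and the sketch assumes the matching driver JAR is on the classpath:

public class DriverPairings {
    public static void main(String[] args) throws ClassNotFoundException {
        boolean usingOldDriver = true; // illustrative toggle; pick the pairing that matches your JAR version

        if (usingOldDriver) {
            // Driver JAR older than 2.6.25: Simba driver class with the jdbc:spark:// prefix.
            Class.forName("com.simba.spark.jdbc.Driver");
            String url = "jdbc:spark://<host>:443/default;transportMode=http;ssl=1;httpPath=<http-path>;AuthMech=3;";
            System.out.println(url);
        } else {
            // Driver JAR 2.6.25 or later: Databricks driver class with the jdbc:databricks:// prefix.
            Class.forName("com.databricks.client.jdbc.Driver");
            String url = "jdbc:databricks://<host>:443/default;transportMode=http;ssl=1;httpPath=<http-path>;AuthMech=3;";
            System.out.println(url);
        }
    }
}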

JDBC URL*

Default Value: N/A

Example: jdbc:spark://adb-2409532680880038.18.azuredatabricks.net:443/default;transportMode=http;ssl=1;httpPath=sql/protocolv1/o/2409532680880038/0326-212833-drier754;AuthMech=3;

String

None.

Enter the JDBC driver connection string for connecting to your DLP instance, using the syntax shown below.

jdbc:spark://dbc-ede87531-a2ce.cloud.databricks.com:443/default;transportMode=http;ssl=1;httpPath=sql/protocolv1/o/6968995337014351/0521-394181-guess934;AuthMech=3;UID=token;PWD=<personal-access-token>

Learn more about Microsoft's JDBC and ODBC drivers and configuration parameters.

Avoid passing a Password inside the JDBC URL

If you specify the password inside the JDBC URL, it is saved as plain text and is not encrypted. Therefore, we recommend that you use the provided Password field so that your password is encrypted.
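
As an illustrative sketch only (placeholder host and httpPath; assumes the Databricks JDBC driver is on the classpath and that a personal access token is available in a hypothetical DATABRICKS_TOKEN environment variable), credentials can travel as connection properties rather than inside the URL:

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class DlpConnect {
    public static void main(String[] args) throws Exception {
        // The URL carries no UID/PWD; credentials are passed separately as properties.
        String url = "jdbc:databricks://<workspace-host>:443/default;"
                + "transportMode=http;ssl=1;httpPath=<http-path>;AuthMech=3;";
        Properties props = new Properties();
        props.setProperty("UID", "token");                           // token-based authentication
        props.setProperty("PWD", System.getenv("DATABRICKS_TOKEN")); // personal access token, kept out of the URL
        try (Connection conn = DriverManager.getConnection(url, props)) {
            System.out.println("Connected: " + !conn.isClosed());
        }
    }
}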

Use Token Based Authentication

Default Value: Selected

Checkbox

None.

Select this checkbox to use token-based authentication for connecting to the target database (DLP) instance. Selecting it activates the Token field.

Token*

Default Value: N/A
Example: <Encrypted>

String

Use Token Based Authentication checkbox is selected.

Enter the token value for accessing the target database/folder path.

Database name*

Default Value: N/A
Example: Default

String

None.

Enter the name of the database to use by default. This database is used if you do not specify one in the Databricks Select or Databricks Insert Snaps.

Source/Target Location*

Default Value: N/A
Example: Default

Dropdown list

None.

Select the source or target data warehouse into which the data must be loaded, in this case Azure Data Lake Storage Gen2. Selecting this option activates the following fields:

  • Azure storage account name

  • Azure Container

  • Azure Folder

  • Azure Auth Type

  • SAS Token

  • Azure Storage Account Key

Azure storage account name*

Default Value: N/A
Example: testblob

String

Source is ADLS Gen2.

Enter the name of your Azure storage account (the name with which Azure Storage was created). The Bulk Load Snap automatically appends the '.blob.core.windows.net' domain to this value.

Azure Container*

Default Value: N/A
Example: sl-bigdata-qa

String

Source is ADLS Gen2.

Enter the name of an existing Azure container.

Azure folder*

Default Value: N/A
Example: test-data

String

Source is ADLS Gen2.

Enter the name of an existing Azure folder in the container to be used for hosting files.
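
Purely as a hypothetical illustration of how these three values relate (the Snap composes the actual path internally; the wasbs:// layout below is an assumption based on the '.blob.core.windows.net' domain noted above):

public class StorageUri {
    public static void main(String[] args) {
        String account = "testblob";         // Azure storage account name
        String container = "sl-bigdata-qa";  // Azure Container
        String folder = "test-data";         // Azure folder
        // Hypothetical Blob-storage-style URI, shown only to clarify how the fields combine.
        String uri = String.format("wasbs://%s@%s.blob.core.windows.net/%s", container, account, folder);
        System.out.println(uri); // wasbs://sl-bigdata-qa@testblob.blob.core.windows.net/test-data
    }
}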

Azure Auth Type

Default Value: Shared Access Signature
Example: Shared Access Signature

Dropdown list

Source is ADLS Gen2.

Select the authorization type to use while setting up the account. The available options are:

  • Storage Account Key: Select this option to enter the access key associated with the Azure storage account.

  • Shared Access Signature: Select this option to enter the SAS Token associated with the Azure storage account.

SAS Token*

Default value: N/A

Example: ?sv=2020-08-05&st=2020-08-29T22%3A18%3A26Z&se=2020-08-30T02%3A23%3A26Z&sr=b&sp=rw&sip=198.1.2.60-198.1.2.70&spr=https&sig=A%1DEFGH1Ijk2Lm3noI3OlWTjEg2tYkboXr1P9ZUXDtkk%3D

String

Azure Auth Type is Shared Access Signature.

Enter the SAS token, which is part of the SAS URI associated with your Azure storage account. Learn more in Getting Started with SAS.

Azure storage account key*

Default Value: N/A
Example: ABCDEFGHIJKL1MNOPQRS

String

Azure Auth Type is Storage Account Key.

Enter the access key ID associated with your Azure storage account.

Advanced Properties

Specify additional parameters for configuring the account. This fieldset consists of the following fields:

  • URL properties

  • Batch size

  • Fetch size

  • Min pool size

  • Max pool size

  • Max life time

  • Idle Timeout

  • Checkout timeout

URL properties

Use this fieldset to define URL property names and their corresponding values. Add each URL property-value pair in a separate row.

URL property name

Default Value: N/A
Example: queryTimeout

String

N/A

Specify the name of the parameter for the URL property.

URL property value

Default Value: N/A
Example: 0

String/Expression

N/A

Specify the value for the URL property parameter.
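
A hedged sketch of how a property pair is typically applied: JDBC drivers generally accept such settings either as semicolon-separated key=value entries in the URL or as entries in the connection Properties object (how the Snap applies them internally is not specified here):

import java.util.Properties;

public class UrlProps {
    public static void main(String[] args) {
        String baseUrl = "jdbc:databricks://<host>:443/default;transportMode=http;ssl=1;httpPath=<http-path>;AuthMech=3";
        // Option 1: appended to the JDBC URL as a key=value entry.
        String urlWithProp = baseUrl + ";queryTimeout=0";
        // Option 2: supplied through the driver's connection properties.
        Properties props = new Properties();
        props.setProperty("queryTimeout", "0");
        System.out.println(urlWithProp);
    }
}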

Batch size*

Default Value: N/A
Example: 3

Integer

N/A

Specify the number of queries that you want to execute at a time (see the sketch after this list).

  • If the Batch size is 1, the query is executed as-is; that is, the Snap skips batching (non-batch execution).

  • If the Batch size is greater than 1, the Snap performs regular batch execution.
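
A minimal plain-JDBC sketch of the batch versus non-batch distinction; the table name, columns, and helper method are hypothetical, and the Snap manages batching internally:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class BatchDemo {
    // Hypothetical helper: runs one statement per row when batchSize <= 1, otherwise batches.
    static void insertRows(Connection conn, String[][] rows, int batchSize) throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement("INSERT INTO demo_table (a, b) VALUES (?, ?)")) {
            int pending = 0;
            for (String[] row : rows) {
                ps.setString(1, row[0]);
                ps.setString(2, row[1]);
                if (batchSize <= 1) {
                    ps.executeUpdate();            // non-batch: each query executes as-is
                } else {
                    ps.addBatch();
                    if (++pending % batchSize == 0) {
                        ps.executeBatch();         // flush a full batch
                    }
                }
            }
            if (batchSize > 1 && pending % batchSize != 0) {
                ps.executeBatch();                 // flush the remainder
            }
        }
    }
}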

Fetch size*

Default Value: 100
Example: 12

Integer

N/A

Specify the number of rows a query must fetch for each execution.

Larger values could cause the server to run out of memory.
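
Fetch size corresponds to the standard JDBC fetch-size hint, which caps how many rows the driver buffers per round trip. A minimal sketch, assuming an open java.sql.Connection named conn and a hypothetical table demo_table:

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class FetchDemo {
    static void readRows(Connection conn) throws SQLException {
        try (Statement st = conn.createStatement()) {
            st.setFetchSize(100); // rows fetched per execution; the account's default value
            try (ResultSet rs = st.executeQuery("SELECT * FROM demo_table")) {
                while (rs.next()) {
                    // process each row here
                }
            }
        }
    }
}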

Min pool size*

Default Value: 3
Example: 0

Integer

N/A

Specify the minimum number of idle connections that you want the pool to maintain at a time. 

Max pool size*

Default Value: 15
Example: 0

Integer

N/A

Specify the maximum number of connections that you want the pool to maintain at a time.

Max life time*

Default Value: 60
Example: 50

Integer

N/A

Specify the maximum lifetime of a connection in the pool, in seconds:

  • Ensure that the value you enter is a few seconds shorter than any database or infrastructure-imposed connection time limit.

  • 0 (zero) indicates an infinite lifetime, subject to the Idle Timeout value.

  • An in-use connection is never retired. Connections are removed only after they are closed.

Minimum value: 0
Maximum value: No limit

Idle Timeout*

Default Value: 5
Example: 4

Integer

N/A

Specify the maximum amount of time in seconds that a connection is allowed to sit idle in the pool. 0 (zero) indicates that idle connections are never removed from the pool.

Minimum value: 0
Maximum value: No limit

Checkout timeout*

Default Value: 10000
Example: 9000

Integer

N/A

Specify the maximum time in milliseconds you want the system to wait for a connection to become available when the pool is exhausted.

If you specify 0, the Snap waits indefinitely until a connection becomes available. Therefore, we recommend that you do not specify 0 for Checkout timeout.

Minimum value: 0
Maximum value: No limit
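
To see how these pool settings relate to one another, here is a hedged sketch using the c3p0 connection pool purely for illustration; the Snap Pack's internal pooling implementation is not specified here, and the field-to-setter mapping is an assumption:

import com.mchange.v2.c3p0.ComboPooledDataSource;

public class PoolDemo {
    public static void main(String[] args) {
        ComboPooledDataSource ds = new ComboPooledDataSource();
        ds.setJdbcUrl("jdbc:databricks://<host>:443/default;transportMode=http;ssl=1;httpPath=<http-path>;AuthMech=3;");
        ds.setMinPoolSize(3);         // Min pool size (default 3)
        ds.setMaxPoolSize(15);        // Max pool size (default 15)
        ds.setMaxConnectionAge(60);   // Max life time, in seconds (default 60)
        ds.setMaxIdleTime(5);         // Idle Timeout, in seconds (default 5)
        ds.setCheckoutTimeout(10000); // Checkout timeout, in milliseconds (default 10000)
    }
}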

Troubleshooting

Error: Account validation failed.

Reason: The Pipeline ended before the batch could complete execution due to a connection error.

Resolution: Verify that the Refresh token field is configured to properly handle the inputs. If you are not sure when the input data is available, configure this field as zero to keep the connection always open.

Snap Pack History


Each entry below lists the release, the Snap Pack version, the release type (Latest or Stable), and the updates.

May 2024

437patches26400

Latest

Fixed an invalid session handle issue with the Databricks Snap Pack that intermittently triggered an error message when the Snaps failed to connect with Databricks to execute the SQL statement.

May 2024

main26341

Stable

Renamed the Delete Condition (Truncates a Table if empty) field in the Databricks - Delete Snap to Delete condition (deletes all records from a table if left blank), to indicate that when this field is left blank, all records are deleted from the table but no truncate operation is performed.

February 2024

main25112

Stable

Updated and certified against the current SnapLogic Platform release.

November 2023

main23721

Stable

Updated and certified against the current SnapLogic Platform release.

August 2023

main22460

Stable

Updated and certified against the current SnapLogic Platform release.

May 2023

433patches21630

Latest

Enhanced the performance of the Databricks - Insert Snap to reduce validation time.

May 2023

main21015

Stable

Upgraded with the latest SnapLogic Platform release.

February 2023

main19844

Stable

Upgraded with the latest SnapLogic Platform release.

November 2022

main18944

Stable

The Databricks - Insert Snap now creates the target table only from the table metadata of the second input view when the following conditions are met:

  • The Create table if not present checkbox is selected.

  • The target table does not exist.

  • The table metadata is provided in the second input view.

September 2022

430patches18305

Latest

The following fields are added to each Databricks Snap as part of this enhancement:

  • Number of Retries: The number of attempts the Snap should make to perform the selected operation when the Snap account connection fails or times out.

  • Retry Interval (seconds): The time interval in seconds between two consecutive retry attempts.

September 2022

430patches17796

Latest

The Manage Queued Queries property in the Databricks Snap Pack enables you to decide whether a given Snap should continue or cancel executing the queued Databricks SQL queries.

August 2022

main17386

Stable

Upgraded with the latest SnapLogic Platform release.

4.29.2.0

42920rc17045

Latest

A new Snap Pack for Databricks Lakehouse Platform (Databricks or DLP) introduces the Databricks Snaps.

