Databricks Account (Source: ADLS Gen2)
Overview
You can use this account type to connect Databricks Snaps with data sources that use a Databricks Account with Azure Data Lake Storage (ADLS) Gen2 as the source.
Prerequisites
A valid Databricks account.
Certified JDBC JAR File: databricks-jdbc-2.6.25-1.jar
Limitations and Known Issues
None.
Account Settings
Asterisk ( * ): Indicates a mandatory field.
Suggestion icon: Indicates a list that is dynamically populated based on the configuration.
Expression icon: Indicates whether the value is an expression (if enabled) or a static value (if disabled). Learn more about Using Expressions in SnapLogic.
Add icon: Indicates that you can add fields in the field set.
Remove icon: Indicates that you can remove fields from the field set.
Field Name | Field Type | Field Dependency | Description
---|---|---|---
Label* (Default Value: N/A) | String | None. | Specify a unique label for the account.
Download JDBC Driver Automatically (Default Value: Not Selected; Example: Selected) | Checkbox | None. | Select this checkbox to allow the Snap account to download the certified JDBC driver for DLP. The fields for uploading a custom JDBC driver are disabled when this checkbox is selected. To use a JDBC driver of your choice, clear this checkbox, upload the required JAR files to SLDB, and choose them in the JDBC JAR(s) and/or ZIP(s): JDBC Driver field. You can use a JAR file version other than the recommended one. If you do not select this checkbox and use a JDBC JAR file older than version 2.6.25, ensure that you use the older Spark JDBC URL format (jdbc:spark://) and the com.simba.spark.jdbc.Driver driver class.
JDBC URL* (Default Value: N/A; Example: jdbc:spark://adb-2409532680880038.18.azuredatabricks.net:443/default;transportMode=http;ssl=1;httpPath=sql/protocolv1/o/2409532680880038/0326-212833-drier754;AuthMech=3;) | String | None. | Enter the JDBC driver connection string for connecting to your DLP instance, using the following syntax: jdbc:spark://dbc-ede87531-a2ce.cloud.databricks.com:443/default;transportMode=http;ssl=1;httpPath= (Learn more about Microsoft's JDBC and ODBC drivers and configuration parameters.) See the connection sketch after this table for how this URL is used together with token-based authentication.
Use Token Based Authentication (Default Value: Selected) | Checkbox | None. | Select this checkbox to use token-based authentication for connecting to the target database (DLP) instance. Activates the Token field.
Token* (Default Value: N/A) | String | Use Token Based Authentication checkbox is selected. | Enter the token value for accessing the target database/folder path.
Database name* (Default Value: N/A) | String | None. | Enter the name of the database to use by default. This database is used if you do not specify one in the Databricks Select or Databricks Insert Snaps.
Source/Target Location* (Default Value: N/A) | Dropdown list | None. | Select the source or target data warehouse for loading the queries, in this case Azure Data Lake Storage Gen2. Selecting this option activates the Azure storage account name, Azure Container, Azure folder, and Azure Auth Type fields.
Azure storage account name* (Default Value: N/A) | String | Source is ADLS Gen2. | Enter the name with which Azure Storage was created. The Bulk Load Snap automatically appends the '.blob.core.windows.net' domain to the value of this property.
Azure Container* (Default Value: N/A) | String | Source is ADLS Gen2. | Enter the name of an existing Azure container.
Azure folder* (Default Value: N/A) | String | Source is ADLS Gen2. | Enter the name of an existing Azure folder in the container to be used for hosting files.
Azure Auth Type (Default Value: Shared Access Signature) | Dropdown list | Source is ADLS Gen2. | Select the authorization type to use while setting up the account. The available options are Shared Access Signature and Storage account key.
SAS Token* (Default Value: N/A; Example: ?sv=2020-08-05&st=2020-08-29T22%3A18%3A26Z&se=2020-08-30T02%3A23%3A26Z&sr=b&sp=rw&sip=198.1.2.60-198.1.2.70&spr=https&sig=A%1DEFGH1Ijk2Lm3noI3OlWTjEg2tYkboXr1P9ZUXDtkk%3D) | String | Azure Auth Type is Shared Access Signature. | Enter the SAS token, which is the part of the SAS URI associated with your Azure storage account. Learn more in Getting Started with SAS. See the staging location sketch after this table for how this token is appended to the storage URL.
Azure storage account key* (Default Value: N/A) | String | Azure Auth Type is Storage account key. | Enter the access key ID associated with your Azure storage account.
Advanced Properties | N/A | N/A | Other parameters that you want to specify to configure the account. This field set consists of the following fields: URL properties, Batch size, Fetch size, Min pool size, Max pool size, Max life time, Idle Timeout, and Checkout timeout.
URL properties | N/A | N/A | Use this field set to define the account parameter's name and its corresponding value. Add each URL property-value pair in a separate row.
URL property name (Default Value: N/A) | String | N/A | Specify the name of the parameter for the URL property.
URL property value (Default Value: N/A) | String/Expression | N/A | Specify the value for the URL property parameter.
Batch size* (Default Value: N/A) | Integer | N/A | Specify the number of queries that you want to execute at a time.
Fetch size* (Default Value: 100) | Integer | N/A | Specify the number of rows a query must fetch for each execution. Larger values could cause the server to run out of memory.
Min pool size* (Default Value: 3) | Integer | N/A | Specify the minimum number of idle connections that you want the pool to maintain at a time.
Max pool size* (Default Value: 15) | Integer | N/A | Specify the maximum number of connections that you want the pool to maintain at a time.
Max life time* (Default Value: 60) | Integer | N/A | Specify the maximum lifetime of a connection in the pool, in seconds. Minimum value: 0.
Idle Timeout* (Default Value: 5) | Integer | N/A | Specify the maximum amount of time in seconds that a connection is allowed to sit idle in the pool. 0 (zero) indicates that idle connections are never removed from the pool. Minimum value: 0.
Checkout timeout* (Default Value: 10000) | Integer | N/A | Specify the maximum time in milliseconds you want the system to wait for a connection to become available when the pool is exhausted. Minimum value: 0. See the connection pool sketch after this table for how these pool settings relate to standard JDBC pool parameters.
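The following connection sketch is a minimal illustration of how the JDBC URL, token-based authentication, and the optional URL properties described above fit together when connecting to a DLP instance directly with the certified JDBC driver. The workspace host, HTTP path, and the DATABRICKS_TOKEN environment variable are hypothetical placeholders; with AuthMech=3, the user name is the literal string "token" and the password is the Databricks personal access token (the Token field).

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Properties;

public class DatabricksConnectionSketch {
    public static void main(String[] args) throws Exception {
        // Same syntax as the JDBC URL* field; the host and httpPath are hypothetical.
        String url = "jdbc:spark://adb-1234567890123456.7.azuredatabricks.net:443/default;"
                + "transportMode=http;ssl=1;"
                + "httpPath=sql/protocolv1/o/1234567890123456/0000-000000-example123;"
                + "AuthMech=3";

        Properties props = new Properties();
        // Token-based authentication: the user is the literal "token" and the
        // password is the personal access token entered in the Token* field.
        props.setProperty("UID", "token");
        props.setProperty("PWD", System.getenv("DATABRICKS_TOKEN")); // assumed to be set
        // Extra URL property name/value pairs (the URL properties field set)
        // can also be passed here, for example:
        // props.setProperty("UseNativeQuery", "1");

        // Requires databricks-jdbc-2.6.25-1.jar (or a compatible driver) on the classpath.
        try (Connection conn = DriverManager.getConnection(url, props);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT current_database()")) {
            if (rs.next()) {
                System.out.println("Default database: " + rs.getString(1));
            }
        }
    }
}
```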
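The staging location sketch below shows how the Azure storage account name, Azure Container, Azure folder, and SAS Token fields combine into the storage URL used for staging. All values are hypothetical; per the field descriptions above, the account name is expanded with the '.blob.core.windows.net' domain, and the SAS token (which already begins with "?") is appended as the query string that authorizes the request.

```java
public class AdlsStagingUrlSketch {
    public static void main(String[] args) {
        String accountName = "examplestorageacct";        // Azure storage account name*
        String container   = "staging";                    // Azure Container*
        String folder      = "databricks-stage";           // Azure folder*
        String sasToken    = System.getenv("AZURE_SAS");   // SAS Token*, e.g. "?sv=2020-08-05&..."

        // The account name is expanded to <name>.blob.core.windows.net; the SAS
        // token is appended as the query string of the staging URL.
        String stagingUrl = "https://" + accountName + ".blob.core.windows.net/"
                + container + "/" + folder + sasToken;
        System.out.println(stagingUrl);
    }
}
```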
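The connection pool sketch below maps the pool fields onto standard JDBC pool settings. It assumes the c3p0 pooling library purely for illustration (this page does not state which pool implementation the account uses), and the DATABRICKS_JDBC_URL environment variable and driver class choice are hypothetical. The values shown are the defaults listed in the table above; note the mix of units (seconds for Max life time and Idle Timeout, milliseconds for Checkout timeout).

```java
import com.mchange.v2.c3p0.ComboPooledDataSource;

public class PoolSettingsSketch {
    public static void main(String[] args) throws Exception {
        ComboPooledDataSource pool = new ComboPooledDataSource();
        pool.setDriverClass("com.simba.spark.jdbc.Driver"); // older Spark JDBC driver class
        pool.setJdbcUrl(System.getenv("DATABRICKS_JDBC_URL"));

        pool.setMinPoolSize(3);          // Min pool size*: idle connections kept ready
        pool.setMaxPoolSize(15);         // Max pool size*: upper bound on open connections
        pool.setMaxConnectionAge(60);    // Max life time*: seconds before a connection is retired
        pool.setMaxIdleTime(5);          // Idle Timeout*: seconds an idle connection may sit (0 = never removed)
        pool.setCheckoutTimeout(10000);  // Checkout timeout*: milliseconds to wait when the pool is exhausted

        try (java.sql.Connection conn = pool.getConnection()) {
            System.out.println("Checked out: " + conn);
        } finally {
            pool.close();
        }
    }
}
```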