You can use this account type to connect Databricks Snaps with data sources that use a Databricks account with AWS S3 as a source.
Prerequisites

A valid Databricks account.
Certified JDBC JAR File: databricks-jdbc-2.6.25-1.jar
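If you stage your own driver instead of the certified JAR listed above, one way to sanity-check it is to compare the version embedded in the file name against the certified release. This is an illustrative sketch, not part of the product; the file-name pattern is taken from the certified JAR above, and the helper names are hypothetical.

```python
import re

CERTIFIED_VERSION = (2, 6, 25)  # from the certified JAR: databricks-jdbc-2.6.25-1.jar

def jar_version(filename):
    """Extract (major, minor, patch) from a driver JAR name such as
    'databricks-jdbc-2.6.25-1.jar'. Returns None if no version is found."""
    m = re.search(r"-(\d+)\.(\d+)\.(\d+)", filename)
    return tuple(map(int, m.groups())) if m else None

def older_than_certified(filename):
    """True when the JAR predates the certified 2.6.25 release."""
    v = jar_version(filename)
    return v is not None and v < CERTIFIED_VERSION
```

For example, `older_than_certified("databricks-jdbc-2.6.22-1.jar")` returns `True`, flagging a JAR that falls under the older-than-2.6.25 guidance in the account settings below.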
Limitations and Known Issues
None.
Account Settings
Field Name | Field Type | Field Dependency | Description
---|---|---|---
Label*<br>Default Value: N/A | String | None | Specify a unique label for the account.
Download JDBC Driver Automatically<br>Default Value: Not Selected<br>Example: Selected | Checkbox | None | Select this checkbox to allow the Snap account to download the certified JDBC driver for DLP. The following fields are disabled when this checkbox is selected.<br><br>To use a JDBC driver of your choice, clear this checkbox, upload the required JAR files to SLDB, and choose them in the JDBC JAR(s) and/or ZIP(s): JDBC Driver field.<br><br>**Use of Custom JDBC JAR version:** You can use a JAR file version other than the recommended versions listed.<br><br>**Spark JDBC and Databricks JDBC:** If you do not select this checkbox and use a JDBC JAR file older than version 2.6.25, ensure that you use:
JDBC URL*<br>Default Value: N/A<br>Example: jdbc:spark://adb-2409532680880038.18.azuredatabricks.net:443/default;transportMode=http;ssl=1;httpPath=sql/protocolv1/o/2409532680880038/0326-212833-drier754;AuthMech=3; | String | None | Enter the JDBC driver connection string for connecting to your DLP instance, using the syntax below. See Microsoft's JDBC and ODBC drivers and configuration parameters for more information.<br><br>jdbc:spark://dbc-ede87531-a2ce.cloud.databricks.com:443/default;transportMode=http;ssl=1;httpPath=<br><br>**Avoid passing the password inside the JDBC URL:** If you specify the password inside the JDBC URL, it is saved as-is and is not encrypted. We recommend passing your password through the Password field instead, to ensure that it is encrypted.
Use Token Based Authentication<br>Default Value: Selected<br>Example: Not selected | Checkbox | None | Select this checkbox to use token-based authentication for connecting to the target database (DLP) instance. Activates the Token field.
Token*<br>Default Value: N/A<br>Example: <Encrypted> | String | Appears when the Use Token Based Authentication checkbox is selected. | Enter the token value for accessing the target database/folder path.
Database name*<br>Default Value: N/A<br>Example: Default | String | None | Enter the name of the database to use by default. This database is used if you do not specify one in the Databricks Select or Databricks Insert Snaps.
Source/Target Location* | Dropdown | None | Select the source or target data warehouse for staging the data, that is, AWS S3. Selecting AWS S3 activates the following fields:
S3 Bucket*<br>Default Value: N/A<br>Example: sl-bucket-ca | String | None | Specify the name of the S3 bucket that you want to use for staging data to Databricks.
S3 Folder*<br>Default Value: N/A<br>Example: https://sl-bucket-ca.s3.<ca>.amazonaws.com/<sf> | String | None | Specify the relative path to a folder in the S3 bucket listed in the S3 Bucket field. This is used as the root folder for staging data to Databricks.
Aws Authorization type<br>Default Value: Source Location Credentials for S3 and Azure, Storage Integration for Google Cloud Storage<br>Example: Storage Integration | Dropdown | None | Select the authentication method to use for accessing the source data. Available options are:
S3 Access-key ID*<br>Default Value: N/A<br>Example: NAVRGGRV7EDCFVLKJH | String | None | Specify the S3 access key ID that you want to use for AWS authentication.
S3 Secret key*<br>Default Value: N/A<br>Example: 2RGiLmL/6bCujkKLaRuUJHY9uSDEjNYr+ozHRtg | String | None | Specify the S3 secret key associated with the S3 Access-key ID listed in the S3 Access-key ID field.
S3 AWS Token*<br>Default Value: None | String | Appears when Source Location Session Credentials is selected in the Aws Authorization type field. | Specify the S3 AWS token to connect to private and protected Amazon S3 buckets.
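The JDBC URL syntax described above can be sanity-checked outside the product. The sketch below assembles a URL from its parts using the parameter names shown in the JDBC URL example (transportMode, ssl, httpPath, AuthMech) and flags an embedded password, which the field description warns against. The helper names are hypothetical and not part of any SnapLogic or Databricks API.

```python
def build_jdbc_url(host, port, database, http_path):
    """Assemble a Databricks JDBC URL in the jdbc:spark:// format shown in
    the JDBC URL field. AuthMech=3 corresponds to token-based authentication."""
    params = f"transportMode=http;ssl=1;httpPath={http_path};AuthMech=3;"
    return f"jdbc:spark://{host}:{port}/{database};{params}"

def has_embedded_password(url):
    """True when a password parameter is embedded in the URL. Such a URL is
    stored unencrypted; use the account's Password field instead."""
    return any(part.lower().startswith(("pwd=", "password="))
               for part in url.split(";"))

# Host and httpPath values are taken from the examples in the table above.
url = build_jdbc_url("dbc-ede87531-a2ce.cloud.databricks.com", 443, "default",
                     "sql/protocolv1/o/2409532680880038/0326-212833-drier754")
```

A URL that fails the `has_embedded_password` check should have its password moved out of the URL and into the dedicated Password field so that it is stored encrypted.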
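The S3 Bucket and S3 Folder fields combine into the staging root under which data is written. A minimal sketch of that relationship, assuming a virtual-hosted-style S3 URL; the region default is made up for illustration (the S3 Folder example above shows a <ca> placeholder in its place):

```python
def s3_staging_root(bucket, folder, region="ca-central-1"):
    """Join the S3 Bucket and S3 Folder field values into a staging root URL.
    The region default is an assumption, standing in for the <ca> placeholder
    shown in the S3 Folder example above."""
    folder = folder.strip("/")  # S3 Folder is a path relative to the bucket
    return f"https://{bucket}.s3.{region}.amazonaws.com/{folder}"
```

For example, a bucket of `sl-bucket-ca` with a folder of `/stage/data/` yields `https://sl-bucket-ca.s3.ca-central-1.amazonaws.com/stage/data`.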