Databricks Account (Source: AWS S3)
Overview
You can use this account type to connect Databricks Snaps with data sources that use a Databricks account with AWS S3 as the source location.
Prerequisites
- A valid Databricks account.
- Certified JDBC JAR file: databricks-jdbc-2.6.25-1.jar
Limitations and Known Issues
None.
Account Settings
- Asterisk (*): Indicates a mandatory field.
- Suggestion icon: Indicates a list that is dynamically populated based on the configuration.
- Expression icon: Indicates the value is an expression (if enabled) or a static value (if disabled). Learn more about Using Expressions in SnapLogic.
- Add icon: Indicates that you can add fields in the field set.
- Remove icon: Indicates that you can remove fields from the field set.
| Field Name | Field Type | Field Dependency | Description |
|---|---|---|---|
| Label* <br>Default value: N/A | String | None | Specify a unique label for the account. |
| Download JDBC Driver Automatically <br>Default value: Not selected <br>Example: Selected | Checkbox | None | Select this checkbox to allow the Snap account to download the certified JDBC driver for DLP. The following fields are disabled when this checkbox is selected. To use a JDBC driver of your choice, clear this checkbox, upload the required JAR files to SLDB, and choose them in the JDBC JAR(s) and/or ZIP(s): JDBC Driver field. You can use a JAR file version other than the recommended versions listed. If you do not select this checkbox and use a JDBC JAR file older than version 2.6.25 (Spark JDBC and Databricks JDBC), ensure that you use: |
| JDBC URL* <br>Default value: N/A | String | None | Enter the JDBC driver connection string for connecting to your DLP instance, using the following syntax: `jdbc:spark://dbc-ede87531-a2ce.cloud.databricks.com:443/default;transportMode=http;ssl=1;httpPath=` Learn more about Microsoft's JDBC and ODBC drivers and configuration parameters. |
| Use Token Based Authentication <br>Default value: Selected | Checkbox | None | Select this checkbox to use token-based authentication for connecting to the target database (DLP) instance. Activates the Token field. |
| Token* <br>Default value: N/A | String | Appears when the Use Token Based Authentication checkbox is selected. | Enter the token value for accessing the target database/folder path. |
| Database name* <br>Default value: N/A | String | None | Enter the name of the database to use by default. This database is used if you do not specify one in the Databricks Select or Databricks Insert Snaps. |
| Source/Target Location* | Dropdown | None | Select the source or target data warehouse into which the queries must be loaded; in this case, AWS S3. Selecting AWS S3 activates the following fields: |
| S3 Bucket* <br>Default value: N/A <br>Example: sl-bucket-ca | String | None | Specify the name of the S3 bucket that you want to use for staging data to Databricks. |
| S3 Folder* <br>Default value: N/A <br>Example: `https://sl-bucket-ca.s3.<ca>.amazonaws/<sf>` | String | None | Specify the relative path to a folder in the bucket listed in the S3 Bucket field. This is used as a root folder for staging data to Databricks. |
| AWS Authorization type <br>Default value: Source Location Credentials for S3 and Azure, Storage Integration for Google Cloud Storage <br>Example: Storage Integration | Dropdown | None | Select the authentication method to use for accessing the source data. Available options are: |
| S3 Access-key ID* <br>Default value: N/A <br>Example: NAVRGGRV7EDCFVLKJH | String | None | Specify the S3 access key ID that you want to use for AWS authentication. |
| S3 Secret key* <br>Default value: N/A <br>Example: 2RGiLmL/6bCujkKLaRuUJHY9uSDEjNYr+ozHRtg | String | None | Specify the S3 secret key associated with the access key ID listed in the S3 Access-key ID field. |
| S3 AWS Token* <br>Default value: None | String | Appears when Source/Target Location Session Credentials is selected in the AWS Authorization type field. | Specify the S3 AWS token to connect to private and protected Amazon S3 buckets. |
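The JDBC URL field expects the connection string syntax shown above. As a minimal sketch of how the pieces fit together, the string can be assembled from its parts; the workspace host below is the one from the docs example, while the `httpPath` value is a hypothetical placeholder you would replace with your cluster's HTTP path:

```python
def build_databricks_jdbc_url(host, http_path, port=443, database="default"):
    """Assemble a Spark JDBC connection string in the syntax shown in the
    JDBC URL field above (transportMode=http, ssl=1, plus the HTTP path)."""
    return (
        f"jdbc:spark://{host}:{port}/{database};"
        f"transportMode=http;ssl=1;httpPath={http_path}"
    )

url = build_databricks_jdbc_url(
    "dbc-ede87531-a2ce.cloud.databricks.com",  # workspace host from the docs example
    "sql/protocolv1/o/0/0000-000000-example",  # hypothetical HTTP path placeholder
)
print(url)
```

With token-based authentication selected, the token itself is supplied in the separate Token field rather than embedded in this URL.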
Snap Pack History
Related Links
© 2017-2024 SnapLogic, Inc.