In this article
You can use this account type to connect Databricks Snaps with data sources that use a Databricks account with AWS S3 as a source.
Prerequisites
A valid client ID.
A valid tenant URL.
A valid Databricks account.
Certified JDBC JAR file: databricks-jdbc-2.6.25-1.jar
Limitations
Supports only reading JSON files.
Known Issues
None.
Account Settings
Troubleshooting
Error | Reason | Resolution |
---|
Account validation failed. | The Pipeline ended before the batch could complete execution due to a connection error. | Verify that the Refresh token field is configured to handle the inputs properly. If you are not sure when the input data is available, configure this field as zero to keep the connection always open. |
Info |
---|
Asterisk ( * ): Indicates a mandatory field.
Suggestion icon ( ): Indicates a list that is dynamically populated based on the configuration.
Expression icon ( ): Indicates whether the value is an expression (if enabled) or a static value (if disabled). Learn more about Using Expressions in SnapLogic.
Add icon ( ): Indicates that you can add fields in the field set.
Remove icon ( ): Indicates that you can remove fields from the field set.
|
Field Name | Field Type | Field Dependency | Description |
---|
Label* Default Value: N/A Example: STD DB Acc DeltaLake AWS S3 | String | None. | Specify a unique label for the account. |
Download JDBC Driver Automatically Default Value: Not Selected Example: Selected | Checkbox | None | Select this checkbox to allow the Snap account to download the certified JDBC driver for DLP. The following fields are disabled when this checkbox is selected. To use a JDBC driver of your choice, clear this checkbox, upload the required JAR files to SLDB, and select them in the JDBC JAR(s) and/or ZIP(s): JDBC Driver field. You can use a custom JAR file version outside of the recommended JAR file versions. If you do not select this checkbox and use a JDBC JAR file older than version 2.6.25, ensure that your JDBC URL uses the jdbc:spark:// prefix shown in the JDBC URL field. |
JDBC URL* Default Value: N/A Example: jdbc:spark://adb-2409532680880038.18.azuredatabricks.net:443/default;transportMode=http;ssl=1;httpPath=sql/protocolv1/o/2409532680880038/0326-212833-drier754;AuthMech=3; | String | None | Enter the JDBC driver connection string for connecting to your DLP instance, using the following syntax. Learn more about Microsoft's JDBC and ODBC drivers and configuration parameters. jdbc:spark://dbc-ede87531-a2ce.cloud.databricks.com:443/default;transportMode=http;ssl=1;httpPath=sql/protocolv1/o/6968995337014351/0521-394181-guess934;AuthMech=3;UID=token;PWD=<personal-access-token> Avoid passing the password inside the JDBC URL: if you specify the password inside the JDBC URL, it is saved as is and is not encrypted. We recommend passing your password using the Password field instead, to ensure that it is encrypted. |
Use Token Based Authentication Default value: Selected Example: Not selected | Checkbox | None | Select this checkbox to use token-based authentication for connecting to the target database (DLP) instance. Activates the Token field. |
Token* Default value: N/A Example: <Encrypted> | String | When Use Token Based Authentication checkbox is selected. | Enter the token value for accessing the target database/folder path. |
Database name* Default value: N/A Example: Default | String | None | Enter the name of the database to use by default. This database is used if you do not specify one in the Databricks Select or Databricks Insert Snaps. |
Source/Target Location* | Dropdown | None | Select the source or target data warehouse for loading the queries; for this account, select AWS S3. Selecting AWS S3 activates the following fields:
S3 Bucket
S3 Folder
AWS Authorization type
S3 Access Key ID
S3 Secret Key
|
S3 Bucket* | String | None | Specify the name of the S3 bucket that you want to use for staging data to Databricks. Default Value: N/A Example: sl-bucket-ca |
S3 Folder* | String | None | Specify the relative path to a folder in the S3 bucket listed in the S3 Bucket field. This is used as a root folder for staging data to Databricks. Default Value: N/A Example: https://sl-bucket-ca.s3.<ca>.amazonaws/<sf> |
AWS Authorization type Default value: Source Location Credentials for S3 and Azure, Storage Integration for Google Cloud Storage. Example: Storage Integration | Dropdown | None | Select the authentication method to use for accessing the source data. The available options are:
Source/Target Location Credentials. Select this option when you do not have a storage integration setup in S3. Activates the S3 Access Key ID and S3 Secret Key fields.
Source/Target Location Session Credentials. Select this option if you have session credentials to access the source location in S3. Activates the Session Access Key, Session Secret Key, and Session Token fields.
Storage Integration. Select this option when you want to use a storage integration to access the selected source location. Activates the Storage Integration Name field.
|
S3 Access-key ID* Default Value: N/A Example: NAVRGGRV7EDCFVLKJH | String | None | Specify the S3 access key ID that you want to use for AWS authentication. |
S3 Secret key* Default Value: N/A Example: 2RGiLmL/6bCujkKLaRuUJHY9uSDEjNYr+ozHRtg | String | None | Specify the S3 secret key associated with the S3 Access-ID key listed in the S3 Access-key ID field. |
S3 AWS Token* Default Value: None Example: AQoDYXdzEJr | String | Appears when Source/Target Location Session Credentials is selected in the AWS Authorization type field. | Specify the S3 AWS token to connect to private and protected Amazon S3 buckets. Info |
---|
The temporary AWS Token is used when: |
|
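To illustrate how the JDBC URL pieces described above fit together, the sketch below assembles a connection string in the same form as the Example shown in the JDBC URL field. This is a hypothetical helper for illustration only, not part of the Snap Pack; the host and httpPath values are placeholders. Note that the personal access token is intentionally left out of the URL, since the Token field keeps it encrypted.

```python
def build_jdbc_url(host, http_path, port=443, database="default"):
    """Assemble a Spark JDBC connection string for a DLP instance.

    Hypothetical helper for illustration only. Pass the personal access
    token through the account's Token field rather than embedding
    UID/PWD in the URL, so that the token stays encrypted.
    """
    return (
        f"jdbc:spark://{host}:{port}/{database}"
        f";transportMode=http;ssl=1;httpPath={http_path};AuthMech=3;"
    )

# Placeholder workspace values, mirroring the Example in the JDBC URL row.
url = build_jdbc_url(
    "dbc-ede87531-a2ce.cloud.databricks.com",
    "sql/protocolv1/o/6968995337014351/0521-394181-guess934",
)
print(url)
```

The resulting string can be pasted into the JDBC URL field; the database segment (default here) is the database used when none is specified in the Databricks Select or Databricks Insert Snaps.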
Snap Pack History
Expand |
---|
Insert excerpt |
---|
| Databricks Snap Pack |
---|
| Databricks Snap Pack |
---|
name | Databricks Snap Pack History |
---|
nopanel | true |
---|
|
|
Related Links