In this article


...

You can use this account type to connect Databricks Snaps with data sources that use a Databricks account with Google Cloud Storage as the source.

Prerequisites

  • A valid Databricks account.

  • Certified JDBC JAR file: databricks-jdbc-2.6.25-1.jar

Limitations and Known Issues

None.

Account Settings





Info
  • Asterisk ( * ): Indicates a mandatory field.

  • Suggestion icon ( (blue star) ): Indicates a list that is dynamically populated based on the configuration.

  • Expression icon ( (blue star) ): Indicates the value is an expression (if enabled) or a static value (if disabled). Learn more about Using Expressions in SnapLogic.

  • Add icon ( (blue star) ): Indicates that you can add fields in the fieldset.

  • Remove icon ( (blue star) ): Indicates that you can remove fields from the fieldset.

Field Name

Field Type

Field Dependency

Description

Label*

Default Value: N/A
Example: STD DB Acc DeltaLake GCS

String

None.

Specify a unique label for the account.

Account Properties*

Use this field set to configure the information required to establish a JDBC connection with the account.

This field set consists of the following fields:

  • Download JDBC Driver Automatically

  • JDBC Driver Class

  • JDBC JARs

  • JDBC URL

  • Use Token Based Authentication

  • Token

  • Database name

  • Source/Target Location

  • GCS Bucket

  • GCS Folder

  • GCS Authorization type

Download JDBC Driver Automatically

Default Value: Not Selected
Example: Selected

Checkbox

None.

Select this checkbox to allow the Snap account to download the certified JDBC driver for DLP. The following fields are disabled when this checkbox is selected.

  • JDBC JAR(s) and/or ZIP(s): JDBC Driver

  • JDBC driver class

To use a JDBC driver of your choice, clear this checkbox, upload the required JAR files to SLDB, and choose them in the JDBC JAR(s) and/or ZIP(s): JDBC Driver field.

Use of Custom JDBC JAR version

You can use a JAR file version other than the recommended, listed JAR file versions.

Spark JDBC and Databricks JDBC

If you do not select this checkbox and use an older JDBC JAR file (older than version 2.6.25), ensure that you use: 

  • The old format JDBC URL ( jdbc:spark:// ) instead of the new one ( jdbc:databricks:// )

    • For JDBC driver versions earlier than 2.6.25, the JDBC URL starts with jdbc:spark://

    • For JDBC driver version 2.6.25 or later, the JDBC URL starts with jdbc:databricks://

  • The older JDBC Driver Class com.simba.spark.jdbc.Driver instead of the new com.databricks.client.jdbc.Driver.

JDBC Driver Class

Default Value: com.databricks.client.jdbc.Driver
Example: com.databricks.client.jdbc.Driver

String

None.

Specify the JDBC driver class to use.

JDBC JARs

Use this field set to define the list of JDBC JAR files to be loaded.

JDBC Driver

String

None.

Specify or upload the JDBC driver to use.

Info

The driver name must be unique. If you leave this field blank, the default JDBC driver is loaded.

JDBC URL*

Default Value: N/A
Example: jdbc:spark://adb-2409532680880038.18.azuredatabricks.net:443/default;transportMode=http;ssl=1;httpPath=sql/protocolv1/o/2409532680880038/0326-212833-drier754;AuthMech=3;

String

None.

Enter the JDBC driver connection string for connecting to your DLP instance, using the syntax provided below. See Microsoft's JDBC and ODBC drivers and configuration parameters for more information.

jdbc:spark://dbc-ede87531-a2ce.cloud.databricks.com:443/default;transportMode=http;ssl=1;httpPath=
sql/protocolv1/o/6968995337014351/0521-394181-guess934;AuthMech=3;UID=token;PWD=<personal-access-token> 

Avoid passing Password inside the JDBC URL

If you specify the password inside the JDBC URL, it is saved as plain text and is not encrypted. We recommend that you pass your password using the Password field instead, to ensure that it is encrypted.

Use Token Based Authentication

Default value: Selected
Example: Not selected

Checkbox

None.

Select this checkbox to use token-based authentication for connecting to the target database (DLP) instance. Activates the Token field.

Token*

Default value: N/A
Example: <Encrypted>

String

When the Use Token Based Authentication checkbox is selected.

Enter the token value for accessing the target database/folder path.

Database name*

Default Value: N/A
Example: Default

String

None.

Enter the name of the database to use by default. This database is used if you do not specify one in the Databricks Select or Databricks Insert Snaps.

Source/Target Location*

Default Value: None
Example: Google Cloud Storage

Dropdown

None.

Select the target data warehouse. If you load data from Google Cloud Storage as the source, then the selected data warehouse serves as the target, and vice versa. The following options are available:

  • None: Select this option when you use read-only Snaps and do not need to write anything to the target data warehouse.

  • Amazon S3

  • Azure Blob Storage

  • Azure Data Lake Storage Gen 2

  • DBFS

  • Google Cloud Storage

  • JDBC

Selecting Google Cloud Storage activates the following fields:

  • GCS Bucket

  • GCS Folder

  • GCS Authorization type

GCS Bucket

Default Value: N/A
Example: my-gcs-bucket

String

None.

Specify the GCS bucket to use for staging the data to be loaded into the target table.

GCS Folder

Default Value: N/A
Example: db_admin

String

None.

Specify the folder in the GCS bucket to use for staging the data.

GCS Authorization type

Default Value: Service Account
Example:

String

None.

Specify the authorization type to use when connecting to Google Cloud Storage.

Advanced Properties

Use this field set to specify other parameters for configuring the account. This field set consists of the following fields:

  • URL Properties

    • URL Property Name

    • URL Property Value

  • Batch Size

  • Fetch Size

  • Min Pool Size

  • Max Pool Size

  • Max Life Time

  • Idle Timeout

  • Checkout Timeout

URL properties

Use this field set to define the account parameter's name and its corresponding value. Click + to add the parameters and the corresponding values. Add each URL property-value pair in a separate row. It consists of the following fields:

  • URL property name

  • URL property value

URL property name

Default Value: N/A
Example: queryTimeout

N/A

None.

Specify the name of the parameter for the URL property.

URL property value

Default Value: N/A
Example: 0

N/A

None.

Specify the value for the URL property parameter.

Batch size*

Default Value: N/A
Example: 3

Integer

None.

Specify the number of queries that you want to execute at a time.

  • If the Batch Size is one, the query is executed as is; that is, the Snap skips batch (non-batch) execution.

  • If the Batch Size is greater than one, the Snap performs the regular batch execution.

Fetch size*

Default Value: 100
Example: 12

Integer

None.

Specify the number of rows a query must fetch for each execution.

Large values could cause the server to run out of memory.

Min pool size*

Default Value: 3
Example: 0

Integer

None.

Specify the minimum number of idle connections that you want the pool to maintain at a time. 

Max pool size*

Default Value: 15
Example: 0

Integer

None.

Specify the maximum number of connections that you want the pool to maintain at a time.

Max life time*

Default Value: 60
Example: 50

Integer

None.

Specify the maximum lifetime of a connection in the pool, in seconds.

  • Ensure that the value you enter is a few seconds shorter than any database or infrastructure-imposed connection time limit.

  • 0 indicates an infinite lifetime, subject to the Idle Timeout value.

  • An in-use connection is never retired. Connections are removed only after they are closed.

Minimum value: 0
Maximum value: No limit

Idle Timeout*

Default Value: 5
Example: 4

Integer

None.

Specify the maximum amount of time in seconds that a connection is allowed to sit idle in the pool. 

0 indicates that idle connections are never removed from the pool.

Minimum value: 0
Maximum value: No limit

Checkout timeout*

Default Value: 10000
Example: 9000

Integer

None.

Specify the maximum time in milliseconds you want the system to wait for a connection to become available when the pool is exhausted.

Minimum value: 0
Maximum value: No limit

Troubleshooting

Error: Account validation failed.

Reason: The Pipeline ended before the batch could complete execution due to a connection error.

Resolution: Verify that the Refresh token field is configured to handle the inputs properly. If you are not sure when the input data is available, configure this field as zero to keep the connection always open.


...