...

  • JDBC URL

  • Use Token Based Authentication

  • Token

  • Database Name

  • Source/Target Location

  • Source JDBC URL

  • Source username

  • Source password

  • Properties

  • Batch Size

  • Fetch Size

  • Min Pool Size

  • Max Pool Size

  • Max Life Time

  • URL properties

    • URL Property Name

    • URL Property Value

    Field Name

    Field Type

    Field Dependency

    Description

    Label*

    Default Value: N/A
    Example: STD DB Acc DeltaLake GCS

    String

    None.

    Specify a unique label for the account.

    Account Properties*

    Use this field set to configure the information required to establish a JDBC connection with the account.

    This field set consists of the following fields:

    Download JDBC Driver Automatically

    Default Value: Not Selected
    Example: Selected

    Checkbox

    None.

    Select this checkbox to allow the Snap account to download the certified JDBC Driver for DLP. The following fields are disabled when this checkbox is selected.

    • JDBC JAR(s) and/or ZIP(s): JDBC Driver

    • JDBC driver class

    To use a JDBC driver of your choice, clear this checkbox, upload the required JAR files to SLDB, and then select them in the JDBC JAR(s) and/or ZIP(s): JDBC Driver field.

    Use of Custom JDBC JAR version

    You can use a JAR file version other than the recommended versions listed.

    Spark JDBC and Databricks JDBC

    If you do not select this checkbox and use an older JDBC JAR file (older than version 2.6.25), ensure that you use the following (see the sketch after this list):

    • The old-format JDBC URL (jdbc:spark://) instead of the new one (jdbc:databricks://)

      • For JDBC drivers prior to version 2.6.25, the JDBC URL starts with jdbc:spark://

      • For JDBC driver version 2.6.25 or later, the JDBC URL starts with jdbc:databricks://

    • The older JDBC driver class com.simba.spark.jdbc.Driver instead of the new com.databricks.client.jdbc.Driver.
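
    The following is a minimal Java sketch of how the URL prefix and driver class pair up by driver version. The workspace host and httpPath are placeholders, and the class only loads the driver without opening a connection; it is an illustration, not part of the Snap account itself.

    // Sketch only: which JDBC URL prefix goes with which driver class.
    // <workspace-host> and <http-path> are placeholders for your own values.
    public class DriverFormatSketch {
        public static void main(String[] args) throws ClassNotFoundException {
            // Driver version 2.6.25 or later: new URL prefix and driver class.
            String newUrl = "jdbc:databricks://<workspace-host>:443/default;"
                    + "transportMode=http;ssl=1;httpPath=<http-path>;AuthMech=3;";
            Class.forName("com.databricks.client.jdbc.Driver");

            // Driver versions older than 2.6.25: legacy URL prefix and driver class.
            String oldUrl = "jdbc:spark://<workspace-host>:443/default;"
                    + "transportMode=http;ssl=1;httpPath=<http-path>;AuthMech=3;";
            // Class.forName("com.simba.spark.jdbc.Driver");
        }
    }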

    JDBC Driver Class

    Default Value: com.databricks.client.jdbc.Driver
    Example: com.databricks.client.jdbc.Driver

    String

    None.

    Specify the JDBC driver class to use.

    JDBC JARs

    Use this field set to define the list of JDBC JAR files to be loaded.

    JDBC Driver

    String

    None.

    Specify or upload the JDBC driver to use.

    Info

    The driver name must be unique. If you leave this field blank, the default JDBC driver is loaded.

    JDBC URL*

    Default Value: N/A
    Example: jdbc:spark://adb-2409532680880038.18.azuredatabricks.net:443/default;transportMode=http;ssl=1;httpPath=sql/protocolv1/o/2409532680880038/0326-212833-drier754;AuthMech=3;

    String

    None.

    Enter the JDBC driver connection string for connecting to your DLP instance, using the syntax shown below. See Microsoft's JDBC and ODBC drivers and configuration parameters for more information.

    jdbc:spark://dbc-ede87531-a2ce.cloud.databricks.com:443/default;transportMode=http;ssl=1;httpPath=
    sql/protocolv1/o/6968995337014351/0521-394181-guess934;AuthMech=3;UID=token;PWD=<personal-access-token> 

    Avoid passing the password inside the JDBC URL

    If you specify the password inside the JDBC URL, it is saved as is and is not encrypted. We recommend that you pass your password using the Password field instead, so that it is encrypted.
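
    For illustration, here is a minimal Java sketch of opening a JDBC connection to a DLP instance with the token supplied through connection properties rather than embedded in the URL. The workspace host, httpPath, and token values are placeholders; the Snap account performs this connection setup for you.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.util.Properties;

    // Sketch only: token-based connection to a DLP instance, keeping the
    // personal access token out of the JDBC URL itself.
    public class DatabricksJdbcSketch {
        public static void main(String[] args) throws Exception {
            String url = "jdbc:databricks://<workspace-host>:443/default;"
                    + "transportMode=http;ssl=1;httpPath=<http-path>;AuthMech=3;";

            Properties props = new Properties();
            props.put("UID", "token");                    // literal user name for token auth
            props.put("PWD", "<personal-access-token>");  // token passed separately, not in the URL

            try (Connection conn = DriverManager.getConnection(url, props)) {
                System.out.println("Connected: " + !conn.isClosed());
            }
        }
    }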

    Use Token Based Authentication

    Default value: Selected
    Example: Not selected

    Checkbox

    None.

    Select this checkbox to use token-based authentication when connecting to the target database (DLP) instance. Selecting it activates the Token field.

    Token*

    Default value: N/A
    Example: <Encrypted>

    String

    Appears when Use Token Based Authentication checkbox is selected.

    Enter the token value for accessing the target database/folder path.

    Database name*

    Default Value: N/A
    Example: Default

    String

    None.

    Enter the name of the database to use by default. This database is used if you do not specify one in the Databricks Select or Databricks Insert Snaps.

    Source/Target Location*

    Default Value: None
    Example: Google Cloud Storage

    Dropdown

    None.

    Select the target data warehouse. If you load data from ADLS Gen2 as the source, the selected data warehouse serves as the target, and vice versa. The following options are available:

    • None: Select this option when you use read-only Snaps and do not need to write anything to the target data warehouse.

    • Amazon S3

    • Azure Blob Storage

    • Azure Data Lake Storage Gen 2

    • DBFS

    • Google Cloud Storage

    • JDBC

    Selecting Google Cloud Storage activates the following fields:

    • GCS Bucket

    • GCS Folder

    • GCS Authorization type

    GCS Bucket

    Default Value: N/A
    Example: sl-test-bucket

    String

    Appears when Google Cloud Storage is selected for Source/Target Location.

    Specify the GCS bucket to use for staging the data to be loaded into the target table.

    GCS Folder

    Default Value: N/A
    Example: test_data

    String

    Appears when Google Cloud Storage is selected for Source/Target Location.

    Specify the relative path to a folder in the GCS Bucket. This is used as a root folder for staging data.

    GCS Authorization type

    Default Value: Service Account

    String

    Appears when Google Cloud Storage is selected for Source/Target Location.

    Select the authentication type to use for loading data. By default, the authentication type is Service Account.

    Service Account Email*

    Default Value: N/A
    Example: bigdata@iam.gserviceaccount.com

    String/Expression

    Appears when Google Cloud Storage is selected for Source/Target Location.

    Specify the service account email allowed to connect to the BigQuery database. This is used as the default username when retrieving connections. The email must be valid to set up the data source.

    Service Account Key File Path*

    Default Value: N/A
    Example: 7f7c54a1c19b.json

    String/Expression

    Appears when Google Cloud Storage is selected for Source/Target Location.

    Specify the path to the key file used to authenticate the service account email address with the BigQuery database.
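
    To show what these two fields provide, here is a hedged Java sketch of authenticating to a GCS bucket with a service account key file using the google-cloud-storage client library, then listing the staging folder. The bucket, folder, and key file names reuse the example values above; SnapLogic performs the equivalent authentication internally.

    import com.google.auth.oauth2.GoogleCredentials;
    import com.google.cloud.storage.Storage;
    import com.google.cloud.storage.StorageOptions;
    import java.io.FileInputStream;

    // Sketch only: service-account authentication to GCS for staging data.
    public class GcsServiceAccountSketch {
        public static void main(String[] args) throws Exception {
            GoogleCredentials credentials =
                    GoogleCredentials.fromStream(new FileInputStream("7f7c54a1c19b.json"));

            Storage storage = StorageOptions.newBuilder()
                    .setCredentials(credentials)
                    .build()
                    .getService();

            // List the objects staged under the root folder of the bucket.
            storage.list("sl-test-bucket", Storage.BlobListOption.prefix("test_data/"))
                   .iterateAll()
                   .forEach(blob -> System.out.println(blob.getName()));
        }
    }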

    Advanced Properties

    Other parameters that you want to specify to configure the account. This field set consists of the following fields:

    URL properties

    Use this field set to define the account parameter names and their corresponding values. Click + to add the parameters and their corresponding values, with each URL property-value pair in a separate row. It consists of the following fields:

    • URL property name

    • URL property value

    URL property name

    Default Value: N/A
    Example: queryTimeout

    N/A

    None.

    Specify the name of the parameter for the URL property.

    URL property value

    Default Value: N/A
    Example: 30000

    N/A

    None.

    Specify the value for the URL property parameter.
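
    As a rough illustration (an assumption about the mechanics, not a statement of how the Snap account builds its URL internally), URL property pairs such as queryTimeout=30000 typically end up appended to the JDBC URL as ;name=value segments:

    import java.util.LinkedHashMap;
    import java.util.Map;

    // Sketch only: URL property name/value pairs appended to a placeholder JDBC URL.
    public class UrlPropertiesSketch {
        public static void main(String[] args) {
            String baseUrl = "jdbc:databricks://<workspace-host>:443/default;"
                    + "transportMode=http;ssl=1;httpPath=<http-path>;AuthMech=3;";

            Map<String, String> urlProperties = new LinkedHashMap<>();
            urlProperties.put("queryTimeout", "30000");  // example pair from the fields above

            StringBuilder url = new StringBuilder(baseUrl);
            urlProperties.forEach((name, value) ->
                    url.append(name).append('=').append(value).append(';'));

            System.out.println(url);
            // jdbc:databricks://...;AuthMech=3;queryTimeout=30000;
        }
    }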

    Batch size*

    Default Value: N/A
    Example: 50

    Integer

    None.

    Specify the number of queries that you want to execute at a time (see the sketch after this list).

    • If the Batch size is one, the query is executed as-is; that is, the Snap skips the batch (non-batch execution).

    • If the Batch Size is greater than one, the Snap performs the regular batch execution.
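
    For intuition, here is a short Java sketch of standard JDBC batch execution, which is what a batch size greater than one corresponds to. The URL, token, and table name are placeholders, and the Snap handles batching for you.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    // Sketch only: plain JDBC batch execution with a batch size of 50.
    public class BatchSizeSketch {
        public static void main(String[] args) throws Exception {
            int batchSize = 50;  // mirrors the Batch size account setting
            String url = "jdbc:databricks://<workspace-host>:443/default;...";

            try (Connection conn = DriverManager.getConnection(url, "token", "<personal-access-token>");
                 PreparedStatement ps = conn.prepareStatement("INSERT INTO demo_table VALUES (?)")) {
                int queued = 0;
                for (int i = 0; i < 500; i++) {
                    ps.setInt(1, i);
                    ps.addBatch();
                    if (++queued % batchSize == 0) {
                        ps.executeBatch();  // send the accumulated batch
                    }
                }
                ps.executeBatch();          // flush any remaining statements
            }
        }
    }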

    Fetch size*

    Default Value: 100
    Example: 10

    Integer

    None.

    Specify the number of rows a query must fetch for each execution.

    Large values could cause the server to run out of memory.
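
    The Fetch size field corresponds to the standard JDBC fetch size, which controls how many rows are retrieved per round trip. A minimal Java sketch, with placeholder URL, token, and table name:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    // Sketch only: limiting the number of rows buffered per fetch.
    public class FetchSizeSketch {
        public static void main(String[] args) throws Exception {
            String url = "jdbc:databricks://<workspace-host>:443/default;...";

            try (Connection conn = DriverManager.getConnection(url, "token", "<personal-access-token>");
                 Statement stmt = conn.createStatement()) {
                stmt.setFetchSize(100);  // the documented default value
                try (ResultSet rs = stmt.executeQuery("SELECT * FROM demo_table")) {
                    while (rs.next()) {
                        // process a row; only about 100 rows are buffered per fetch
                    }
                }
            }
        }
    }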

    Min pool size*

    Default Value: 3
    Example: 15

    Integer

    None.

    Specify the minimum number of idle connections that you want the pool to maintain at a time. 

    Max pool size*

    Default Value: 15
    Example: 0

    Integer

    None.

    Specify the maximum number of connections that you want the pool to maintain at a time.

    Max life time*

    Default Value: 60
    Example: 50

    Integer

    None.

    Specify the maximum lifetime of a connection in the pool, in seconds.

    • Ensure that the value you enter is a few seconds shorter than any database or infrastructure-imposed connection time limit.

    • 0 indicates an infinite lifetime, subject to the Idle Timeout value.

    • An in-use connection is never retired. Connections are removed only after they are closed.

    Minimum value: 0
    Maximum value: No limit

    Idle Timeout*

    Default Value: 5
    Example: 4

    Integer

    None.

    Specify the maximum amount of time in seconds that a connection is allowed to sit idle in the pool. 

    0 indicates that idle connections are never removed from the pool.

    Minimum value: 0
    Maximum value: No limit

    Checkout timeout*

    Default Value: 10000
    Example: 9000

    Integer

    None.

    Specify the maximum time in milliseconds you want the system to wait for a connection to become available when the pool is exhausted.

    If you provide 0, the Snap waits indefinitely until a connection becomes available. Therefore, we recommend that you do not specify 0 for Checkout timeout.

    Minimum value: 0
    Maximum value: No limit
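
    To clarify what the pool-related fields control, here is a hedged sketch that maps them onto a HikariCP configuration using the documented default values. This is only an analogy, not how SnapLogic implements its connection pool, and the URL and token are placeholders.

    import com.zaxxer.hikari.HikariConfig;
    import com.zaxxer.hikari.HikariDataSource;

    // Sketch only: an analogy between the pool-related account fields and HikariCP settings.
    public class PoolSettingsSketch {
        public static void main(String[] args) {
            HikariConfig config = new HikariConfig();
            config.setJdbcUrl("jdbc:databricks://<workspace-host>:443/default;...");
            config.setUsername("token");
            config.setPassword("<personal-access-token>");

            config.setMinimumIdle(3);             // Min pool size (default 3)
            config.setMaximumPoolSize(15);        // Max pool size (default 15)
            config.setMaxLifetime(60_000);        // Max life time: 60 seconds, in milliseconds here
            config.setIdleTimeout(5_000);         // Idle Timeout: 5 seconds, in milliseconds here
            config.setConnectionTimeout(10_000);  // Checkout timeout: 10000 milliseconds

            try (HikariDataSource pool = new HikariDataSource(config)) {
                // pool.getConnection() would now hand out pooled connections
            }
        }
    }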

    Snap Pack History

    Refer to the Databricks Snap Pack page for the Snap Pack history.

    ...