

...

Snap Pack: Databricks
Date of Update:
Snap Pack Version: 42920rc17045
Updates: A new Snap Pack for the Databricks Lakehouse Platform (Databricks or DLP) introduces the following Snaps:

...

  • When you add an input view to the Databricks - Delete Snap, set the Batch size to 1 in the Snap’s account configuration. With any other batch size, the Snap fails with the exception: Multi-batch parameter values are not supported for this query type.

  • [This issue is fixed in the September 2022 Release] The Databricks - Unload Snap fails with the error External source/target type is invalid when you attempt to unload data from your DLP instance to a DBFS location.

...

  • With Azure Synapse and Redshift targets, if the source value is NULL, the ELT Pivot Snap fails to return valid results to either the target table or the downstream Snaps, as in the sketch below.
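
    For example, a minimal sketch of the failing scenario with hypothetical table and column names (Redshift PIVOT syntax shown):

      -- Hypothetical source table "sales" with columns (id, category, amount).
      -- Rows where category IS NULL yield invalid pivoted results on
      -- Azure Synapse and Redshift targets.
      SELECT *
      FROM (SELECT id, category, amount FROM sales)
      PIVOT (SUM(amount) FOR category IN ('books', 'games'));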

  • When loading data from a CSV file to a target DLP table, the header names in the file must exactly match the column names in the target table. Otherwise, the ELT Load Snap aborts the load operation with the error: Column names should be the same in the target table and CSV file.
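
    For example, a minimal sketch with hypothetical table and column names:

      -- Target DLP table
      CREATE TABLE orders (order_id INT, amount DOUBLE);
      -- The CSV header row must read exactly: order_id,amount
      -- A header row such as id,amount makes the ELT Load Snap abort with:
      --   Column names should be the same in the target table and CSV file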

  • You cannot use the ELT Load Snap to add a column that reuses a deleted column’s name in your BigQuery target table, because BigQuery reserves deleted column names and data until the preconfigured time travel duration (from 2 through 7 days) elapses.

  • Due to an issue with BigQuery table schema management (the time travel feature), an ALTER TABLE action (Add or Update column) that you attempt after deleting a column (DROP action) in your BigQuery target table breaks the table and causes the Snap (ELT Load, ELT SCD2, or ELT Execute) to fail.

    • As a workaround, either avoid ALTER TABLE actions on your BigQuery instance, or create (CREATE) a temporary copy of your table and delete (DROP) it after use, as sketched below.
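
      A minimal sketch of the copy-based workaround, with hypothetical dataset and table names:

        -- Work against a temporary copy instead of altering the original table
        CREATE TABLE mydataset.orders_tmp AS
        SELECT * FROM mydataset.orders;
        -- ... run the load against mydataset.orders_tmp ...
        DROP TABLE mydataset.orders_tmp;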

  • Due to an issue with the Simba Spark JDBC driver for DLP, you cannot insert a NULL value into nullable columns of the Boolean data type in your DLP instance using any of the write-type Snaps (ELT Load, ELT SCD2, ELT Insert Select, ELT Merge Into, and ELT Execute) when the target table schema is available.

    • The only workaround currently available for this issue is to upgrade your JDBC driver to databricks-jdbc-2.6.25-1.jar and to use the corresponding JDBC driver class (com.databricks.client.jdbc.Driver) and JDBC URL in your Snap account, as shown below.
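
      With the upgraded driver, the account settings take this general shape (the host, HTTP path, and token are placeholders; the URL follows the jdbc:databricks:// format documented for the databricks-jdbc driver):

        JDBC Driver Class: com.databricks.client.jdbc.Driver
        JDBC URL: jdbc:databricks://<workspace-host>:443;httpPath=<http-path>;AuthMech=3;UID=token;PWD=<personal-access-token>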

...