...

Does not support Ultra Pipelines

Limitations

  • Snaps in the Databricks Snap Pack do not support array, map, and struct data types in their input and output documents.

Known Issues

  • The Databricks - Bulk Load Snap fails to execute the DROP AND CREATE TABLE and ALTER TABLE operations on tables when using the Databricks SQL persona on the AWS Cloud. The error message Operation not allowed: ALTER TABLE RENAME TO is not allowed for managed Delta tables on S3 is displayed. However, the same actions run successfully when using the Data Science and Engineering persona on the AWS Cloud. This is a limitation of the Databricks endpoint for serverless configurations or SQL endpoint clusters.
    Workaround: If you want to use the DROP AND CREATE TABLE action in the Databricks - Bulk Load Snap, connect a Databricks - Execute Snap upstream of the Bulk Load Snap to drop the table using this syntax: DROP TABLE IF EXISTS <target table name>
    The Databricks - Bulk Load Snap then does not invoke the ALTER TABLE SQL, so the pipeline runs successfully. Alternatively, you can drop the table through the console before invoking the pipeline that contains the Databricks - Bulk Load Snap.
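    For example, the upstream Databricks - Execute Snap could run a statement along these lines (a sketch only; my_target_table is a placeholder for your actual target table name):

    ```sql
    -- Drop the target table before the Bulk Load Snap runs, so that the
    -- DROP AND CREATE TABLE action never needs to issue ALTER TABLE ... RENAME TO.
    -- "my_target_table" is a placeholder; substitute your own table name.
    DROP TABLE IF EXISTS my_target_table;
    ```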

Cause: This issue arises due to a limitation within the Databricks SQL Admin Console, which prevents you from adding the configuration parameter spark.databricks.delta.alterTable.rename.enabledOnAWS true to the SQL Warehouse Settings. As a result, the Snap encounters restrictions when attempting to perform certain operations on managed Delta tables stored on Amazon S3.

...