...

  • If you remove or rename a source table or object after using it in a data pipeline, its name will still be visible in the source configuration and you will not be able to de-select it.

  • For Amazon S3 data sources, you must add an extra forward slash to the URL that you copy from the Amazon S3 console. For example, change s3://my_bucket to s3:///my_bucket.

  • The first row of a CSV file must define the column heading names for the target table, with no empty or null values.

  • AutoSync cannot upload a CSV file to the cloud data warehouse if the file name includes single or double quotes or a special character, such as #, $, %, or &.

  • Google BigQuery does not support special characters in names. Data pipelines fail if the source file, table, or object names include special characters. Whitespace, dashes, and underscores in names are acceptable (see the validation sketch after this list).

  • Sometimes, AutoSync cannot clean up the Google Cloud Storage staging area after loading from Salesforce to Google BigQuery. If this occurs, manually remove the files to reclaim the storage space.

  • In the IIP AutoSync Manager screen, you can create Accounts to use with AutoSync. For Google BigQuery, the Account Type dropdown lets you select Google Service Account or Google Service Account JSON, but AutoSync does not support either of these account types.

  • Sometimes, when you create or edit the synchronization schedule, you can select a start time in the past. If you do this for a data pipeline that is scheduled to run once, it will not run unless you start it manually.

  • Data pipelines that load the Marketo Program Manager object can time out and fail.

  • To use the SCD2 load type for Snowflake, you must modify Snowflake configurations created before the May 2023 release. Because AutoSync automatically sets the timezone to UTC for SCD2 operations, do the following:

    • For an Account created in the IIP, add a Uri property parameter with the name TIMEZONE and a value of UTC.

    • For credentials saved in AutoSync, delete them and create a new configuration.
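For reference, the UTC behavior that AutoSync enforces for SCD2 loads can be reproduced when connecting to Snowflake directly. The following is a minimal sketch, assuming the snowflake-connector-python package and placeholder account, user, and password values; the documented fix remains the account change described above, not this code.

    # Minimal sketch: force a Snowflake session to UTC, the setting AutoSync
    # applies for SCD2 loads. Account, user, and password are placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",      # hypothetical account identifier
        user="my_user",            # hypothetical user
        password="my_password",    # hypothetical password
        warehouse="MY_WAREHOUSE",
        database="MY_DATABASE",
        schema="PUBLIC",
        session_parameters={"TIMEZONE": "UTC"},  # equivalent of the TIMEZONE=UTC property
    )

    cur = conn.cursor()
    try:
        # Confirm the session timezone; SCD2 effective begin and end timestamps
        # are written relative to this setting.
        cur.execute("SHOW PARAMETERS LIKE 'TIMEZONE' IN SESSION")
        print(cur.fetchall())
    finally:
        cur.close()
        conn.close()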
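The CSV header and naming constraints earlier in this list can also be checked before a pipeline runs. The sketch below is a minimal pre-flight validation, assuming the character rules as stated in these notes (quotes and characters such as #, $, %, and & are rejected; letters, digits, whitespace, dashes, underscores, and periods are allowed) and a local CSV file path; it is illustrative only.

    # Minimal pre-flight checks for the CSV header and naming constraints above.
    # The allowed character set and the sample paths are assumptions.
    import csv
    import re

    # Letters, digits, whitespace, dash, underscore, and period are treated as
    # safe; quotes and characters such as #, $, %, and & fall outside this set.
    SAFE_NAME = re.compile(r"^[\w\s.-]+$")

    def name_is_safe(name: str) -> bool:
        """Return True if a file, table, or object name avoids special characters."""
        return bool(SAFE_NAME.match(name))

    def csv_headers_are_valid(path: str) -> bool:
        """Return True if the first row defines non-empty column headings."""
        with open(path, newline="") as handle:
            header = next(csv.reader(handle), [])
        return bool(header) and all(cell.strip() for cell in header)

    # Hypothetical usage:
    # name_is_safe("orders_2023-05.csv")   -> True
    # name_is_safe("orders#may.csv")       -> False
    # csv_headers_are_valid("orders.csv")  -> depends on the file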

Fixed Issues

On May 30, 2023, we fixed an issue with the SCD2 load type for Salesforce to Snowflake where some of the effective begin and end timestamps were written in an inconsistent format.

Data Automation

Fixed Issues

Fixed a null pointer exception so that 5XX errors no longer occur if you download non-existent query details from the Pipeline Execution Statistics of an ELT (write-type) Snap.

...

  • When creating a new project from a Git repository, you can also create a new branch for the new project.

  • Support for HashiCorp KV Secrets Engine Version 1 is available, in addition to KV Secrets Engine Version 2.

Fixed Issues

  • Fixed an issue where Orgs could not be provisioned.

  • The subscription feature Secrets Management - CyberArk is now displayed correctly on the Manager > Subscriptions page.

...

  • The Bouncy Castle library version is upgraded from bcpg-jdk15on[1.69] to bcpg-jdk18on[1.73] in all our Snap Packs. This upgrade brings the latest security enhancements to the SnapLogic platform.

  • The Generic Database Account now supports connections through an SSH tunnel, so you can encrypt the network connection between your client and the database server (see the sketch after this list).

  • The Hive Snap Pack is Cloudera-certified for Cloudera Data Warehouse (CDW). You can use the Hive Execute Snap to work with CDW clusters through a Generic Hive Database account.

  • The Marketo Bulk Extract Snap works successfully in the non-lineage path in an Ultra task.

  • The Key passphrase field in the Private Key Account of the Binary Snap Pack now supports expressions, allowing dynamic evaluation using pipeline parameters when the expression button is enabled.

  • Snowflake

    • SnapLogic is specified as a partner tag in all requests directed to Snowflake, making it easier for Snowflake to identify requests coming from SnapLogic.

    • The default JDBC JAR for the Snowflake Snap Pack is upgraded to version 3.13.28 to support the GEOMETRY data type.
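To illustrate what the SSH tunneling option in the Generic Database Account does, the sketch below forwards a local port to a database host through an SSH bastion using the third-party sshtunnel package. Host names, the key path, and ports are placeholders; in SnapLogic the tunnel is configured in the account settings rather than in code.

    # Minimal sketch of SSH tunneling to a database server, assuming the
    # third-party "sshtunnel" package. All hosts, ports, and paths are placeholders.
    from sshtunnel import SSHTunnelForwarder

    tunnel = SSHTunnelForwarder(
        ("bastion.example.com", 22),                # hypothetical SSH host
        ssh_username="tunnel_user",                 # hypothetical SSH user
        ssh_pkey="/path/to/id_rsa",                 # hypothetical private key
        remote_bind_address=("db.internal", 5432),  # database host reachable from the bastion
        local_bind_address=("127.0.0.1", 0),        # pick any free local port
    )

    tunnel.start()
    try:
        # Point a database client at 127.0.0.1:<local_bind_port>; traffic to the
        # database then travels inside the encrypted SSH connection.
        print("Connect your client to 127.0.0.1:%d" % tunnel.local_bind_port)
    finally:
        tunnel.stop()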

Fixed Issues

  • Fixed an issue where the Encrypt Field Snap failed to support an RSA public key to encrypt a message or field. The Snap now supports RSA public-key encryption of the message or field.
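For context, encrypting a field value with an RSA public key generally looks like the sketch below. It uses the Python cryptography package with a freshly generated key pair and OAEP padding, which are assumptions for illustration; it does not describe the Encrypt Field Snap's internal implementation.

    # Minimal sketch of encrypting a field value with an RSA public key using
    # the "cryptography" package. The key pair and OAEP padding are assumptions.
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    # Generate a throwaway key pair; in practice the public key would be loaded
    # from PEM with serialization.load_pem_public_key().
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    oaep = padding.OAEP(
        mgf=padding.MGF1(algorithm=hashes.SHA256()),
        algorithm=hashes.SHA256(),
        label=None,
    )

    ciphertext = public_key.encrypt(b"sensitive field value", oaep)
    assert private_key.decrypt(ciphertext, oaep) == b"sensitive field value"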

...