...

  • If you save a data pipeline and then edit its name, you cannot delete it.

  • For Amazon S3 data sources, if you copy the URL from the Amazon S3 console, you must add an extra forward slash. For example, change s3://my_bucket to s3:///my_bucket.

  • AutoSync cannot upload a CSV file to the cloud data warehouse if the name includes a special character, such as #, $, %, or &.

  • Google BigQuery does not support special characters in names. Data pipelines fail if the source file, table, or object names include special characters; whitespace, dashes, and underscores are allowed (a name-check sketch follows this list).

  • At times, AutoSync cannot clean up the Google Cloud Storage staging area after loading from Salesforce to Google BigQuery. If this occurs, manually remove the files to reclaim the storage (a cleanup sketch follows this list).

  • In the IIP AutoSync Manager screen, you can create Accounts to use with AutoSync. For Google BigQuery, the Account Type dropdown allows you to select Google Service Account or Google Service Account JSON; however, AutoSync does not support these account types.

  • In some cases, when you create or edit the synchronization schedule, you can select a start time in the past. If you do this for a data pipeline that is scheduled to run once, it will not run unless you start it manually.

  • To use the SCD2 load type for Snowflake, you must modify Snowflake configurations created before the May 2023 release. Because AutoSync automatically sets the time zone to UTC for SCD2 operations, do the following (a session-parameter sketch follows this list):

    • For an Account created in the IIP, add a Uri property parameter with the name TIMEZONE and a value of UTC.

    • For credentials saved in AutoSync, delete them and create a new configuration.
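
The name restrictions above can be checked before creating a data pipeline. The following Python sketch is illustrative only and is not an AutoSync API; the helper name and the allowed-character set are assumptions based on the characters called out in these notes.

    import re

    # Whitespace, dash, underscore, letters, digits, and dots are treated as safe;
    # the exact allowed set is an assumption -- adjust it for your environment.
    _ALLOWED = re.compile(r"^[\w\s.-]+$")

    def is_autosync_safe_name(name: str) -> bool:
        """Return True if a source file, table, or object name avoids special characters."""
        return bool(_ALLOWED.match(name))

    for candidate in ["orders_2023.csv", "sales report.csv", "q1#results.csv"]:
        print(candidate, "->", "ok" if is_autosync_safe_name(candidate) else "rename before loading")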
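
If the staging area is not cleaned up automatically, the leftover files can be removed with the Google Cloud Storage client library (or the gsutil CLI). This is a minimal sketch under assumed values: the bucket name and prefix are placeholders, because the actual staging location depends on how the pipeline is configured.

    from google.cloud import storage  # pip install google-cloud-storage

    # Placeholders -- replace with the staging bucket and prefix your pipeline uses.
    STAGING_BUCKET = "my-autosync-staging-bucket"
    STAGING_PREFIX = "autosync/salesforce/"

    client = storage.Client()
    bucket = client.bucket(STAGING_BUCKET)

    # Delete the leftover staging objects to reclaim the storage.
    for blob in bucket.list_blobs(prefix=STAGING_PREFIX):
        print(f"Deleting gs://{STAGING_BUCKET}/{blob.name}")
        blob.delete()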
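
For reference, the effect of the TIMEZONE=UTC property can be confirmed at the session level with the Snowflake Python connector. This sketch only illustrates the setting and is not the AutoSync account configuration itself; the account and credential values are placeholders.

    import snowflake.connector  # pip install snowflake-connector-python

    # Placeholders -- substitute your own account, credentials, and warehouse.
    conn = snowflake.connector.connect(
        account="my_account",
        user="my_user",
        password="my_password",
        warehouse="my_warehouse",
        # Equivalent of adding a URI property named TIMEZONE with a value of UTC:
        session_parameters={"TIMEZONE": "UTC"},
    )

    # Verify that the session time zone is UTC before running SCD2 loads.
    cur = conn.cursor()
    cur.execute("SHOW PARAMETERS LIKE 'TIMEZONE' IN SESSION")
    print(cur.fetchall())
    conn.close()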

...

  • The UI now uses accessible colors. Some icons are larger and more readable. For example, note the status icons on the summary cards in the Execution overview:

    [Image: status icons on the summary cards in the Execution overview]

Fixed Issues

TBD

...