In this article
Platform:
API Management:
New Snaps:
ELT for Cloud Data Platforms:
New! Introducing SnapLogic Flows
On the 4.26 release cycle's auto-upgrade date of September 18, the Snaplex-based Scheduler will be enabled by default through the Control Plane. The new scheduling mechanism improves the timeliness and reliability of Scheduled Task executions. You do not need to update your Snaplex instances to enable the new scheduler unless you have set the feature flag to false for your Org, in which case you must switch it to true for the Snaplex-based Scheduler to be enabled for your Org. Contact support@snaplogic.com for more information.
Zero downtime of the SnapLogic Platform during the release window: Starting with this release, we are implementing zero downtime for the SnapLogic Platform during quarterly release updates. Thus, you will be able to access the SnapLogic UI, and all your Pipelines will run as scheduled during the release window. For details, see the SnapLogic release process article.
Email Encryption: You can now upload a public key in Manager and use it to encrypt your emails. Email Encryption is a subscription feature.
Lifecycle Management for APIs: Added the capability to manage the lifecycle of your SnapLogic APIs. You can now publish APIs created from API specifications to the new developer portal in the Portal Manager console. You can also unpublish, deprecate, and retire an API version, providing full API lifecycle management features.
Portal Manager: Added a console for managing the developer portal for your Org. You can now customize the new Developer Portal with your own branding and URL suffix. The Portal Manager also enables you to view API status on the API Catalog.
API Developer Portal: Introduced a new page where API consumers can explore and view APIs created in your Org. The API Catalog provides a space where APIs can be exposed to users outside of the SnapLogic ecosystem. You control the consumer's ability to view and call APIs by adding API policies in the API Version Details page.
Create APIs from Assets in Manager: Added options in the API Manager console to create APIs directly from Projects and their Assets. You can now build your APIs by developing Pipelines and Tasks, and add other assets such as Accounts and Files to the Project to be uploaded as an API in API Manager. You can also create an Empty API asset as a placeholder.
OAuth 2.0 Credentials API Policy: Added a new OAuth2 flow to authenticate users using client credentials in the API Policy.
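For context, the client-credentials grant that this policy type performs can be sketched as follows. This is a generic illustration of the OAuth 2.0 flow (RFC 6749, section 4.4), not SnapLogic code; the client ID and secret are hypothetical placeholders.

```python
# Generic sketch of an OAuth 2.0 client-credentials token request body:
# the policy exchanges a client ID and secret for an access token.
import urllib.parse

def build_token_request(client_id: str, client_secret: str) -> str:
    """Form-encode the token request body per RFC 6749, section 4.4."""
    return urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,          # placeholder credential values
        "client_secret": client_secret,
    })

body = build_token_request("my-client-id", "my-secret")
print(body)  # grant_type=client_credentials&client_id=my-client-id&client_secret=my-secret
```

The authorization server validates the credentials and responds with an access token that the caller presents on subsequent API requests.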
Batching Support: Added batching support to the Pipeline Execute Snap. In contrast to the Reuse mode, the batching field enables users to specify the number of documents to be processed to completion through the child Pipelines before processing the next document in the batch. Accordingly, batch mode does not support reuse mode.
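As a rough illustration of the batch semantics described above (a conceptual sketch, not SnapLogic's implementation), each batch of documents runs to completion through the child Pipeline before the next batch begins:

```python
# Hypothetical sketch of batch-mode semantics in Pipeline Execute:
# process `batch_size` documents to completion before starting the next batch.
def run_in_batches(documents, batch_size, child_pipeline):
    results = []
    for start in range(0, len(documents), batch_size):
        batch = documents[start:start + batch_size]
        # The whole batch finishes in the child pipeline before moving on.
        results.extend(child_pipeline(doc) for doc in batch)
    return results

# Example: square each document in batches of 2.
print(run_in_batches([1, 2, 3, 4, 5], 2, lambda d: d * d))  # [1, 4, 9, 16, 25]
```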
Added Windows Server 2019 support for your Groundplex instances.
SnapLogic will sunset support for Windows Server 2012. Ensure that you upgrade your Groundplex instances to Windows Server 2016 or 2019.
Snaplex tab widgets now display one-minute data intervals for time ranges up to seven days.
OAuth Accounts: Optimized the OAuth account refresh operations by moving them to the customer Groundplex instances. Additionally, the Box OAuth 2.0 account type was added to the list of accounts that do not restart Ultra Task instances, because Box can reload the new account token.
Your Groundplex thread/heap dump files are now stored in the folder configured in the java.io.tmpdir property, which is usually /tmp on Linux.
In 4.26, we fixed the Ignore empty stream checkbox functionality in the Snap to produce an empty binary stream output when there is no input document and the checkbox is NOT enabled. Previously, there was no output even when the checkbox was NOT enabled. If your existing Pipelines (created prior to 4.26) use the Ignore empty stream functionality, you might need to revisit your Pipeline design and either enable or disable the checkbox depending on the expected behavior.
Binary Copy: This Snap, part of the Flow Snap Pack, enables you to copy a binary stream to the Snap’s output views. Use this Snap if you want to send the same information to multiple endpoints.
Copybook Snap Pack: This Snap Pack enables you to work with COBOL Copybooks. This Snap Pack has the following Snaps:
Shopify: This Snap Pack enables you to work with Shopify, a platform that allows businesses to set up an online store and sell their products online. You can use this Snap Pack to create orders, products, and customers, and to run automated workflows. The Shopify Snap Pack has the following Snaps:
Added the following Tableau Snaps to support hyper extract files for Tableau 10.5 and later versions. Hyper is Tableau's in-memory data engine that is optimized for fast data ingestion and analytical query processing on large or complex data sets.
Zuora: Added the following Snaps and account types to connect these Snaps with the Zuora REST API.
Anaplan: Added the following fields to the Anaplan Write Snap:
Azure Active Directory: Enhanced the Azure Active Directory Create Entry and Azure Active Directory Update Entry Snaps to support Pipeline parameters and upstream values for the Attribute name setting under Attributes, and enhanced the Snap Pack to support proxy authentication.
Box:
Google Sheets: Enhanced the Worksheet Writer Snap to populate the target schema preview in the upstream Snap with headers and associated data types (when the data is written to an existing worksheet with a valid header).
JDBC:
Enhanced the Kafka SSL Account with new fields (Registry Username or Key and Registry Password or Secret) for Schema Registry authentication. The two existing SASL properties (SASL Username and SASL Password) are revised to SASL Username or Key and SASL Password or Secret, respectively.
Improved the handling of interrupted/aborted Kafka Snaps to ensure a proper clean-up of metrics.
Optimized the Kafka Producer Snap to initialize the Kafka API only if there is at least one input document.
Fixed an issue of account passwords being included in the log messages output of Kafka Snaps. The account passwords are now hidden in the logs for both Kafka Consumer and Kafka Producer Snaps.
Upgraded the Apache Kafka client library from version 2.6.0 to 2.8.0.
Upgraded the Confluent Kafka client libraries from version 5.2.1 to 6.2.0.
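The Kafka Producer optimization above follows a lazy-initialization pattern. A minimal sketch of the idea (hypothetical, not the Snap's actual code): the expensive client is created only when the first document arrives, so a Snap that receives no input never initializes it.

```python
# Hypothetical sketch: defer creating the (expensive) Kafka client until the
# first input document arrives, so runs with no input never initialize it.
class LazyProducer:
    def __init__(self, client_factory):
        self._client_factory = client_factory
        self._client = None

    def send(self, document):
        if self._client is None:          # first document triggers init
            self._client = self._client_factory()
        self._client.append(document)

    @property
    def initialized(self):
        return self._client is not None

producer = LazyProducer(list)   # `list` stands in for a real client factory
print(producer.initialized)     # False: no document yet, no client created
producer.send({"key": "k", "value": "v"})
print(producer.initialized)     # True: client created on first send
```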
OpenAPI: Enhanced the OpenAPI Snap with the following two fields:
PostgreSQL: Enhanced the performance of PostgreSQL - Bulk Load Snap significantly. We expect the Snap to execute up to three times faster than the previous version for enterprise workloads.
REST:
Salesforce: Enhanced the Salesforce Read Snap to enable you to add an optional second output view to display the schema of a target object as the output document.
ServiceNow: Enhanced the ServiceNow Query, ServiceNow Insert, ServiceNow Update, and ServiceNow Delete Snaps with a retry mechanism that includes the following fields:
Snowflake: Added support for all existing Snowflake Snap accounts to connect to a Snowflake instance hosted on the Google Cloud Platform.
SOAP: Enhanced the SOAP Execute Snap with a new checkbox Escape special characters to escape XML special characters in variable values when inserting values into the Velocity template.
SQL Server: Fixed an issue with the SQL Server - Bulk Load Snap where the Snap fails when the login password contains a colon or a less than (<) symbol.
Improved the error messages in the Teradata and Oracle Snap Packs, the MySQL - Select Snap (MySQL Snap Pack), the Channel Operations Snap (Slack Snap Pack), the Create Event Snap (Exchange Online Snap Pack), and the Teams - Create Team Snap (Teams Snap Pack), where the Snaps failed with a null pointer exception when the given account information was invalid.
Enhanced the JIRA, Coupa, and SOAP Snap Packs, and the Workday Prism Analytics Bulk Load Snap (Workday Prism Analytics Snap Pack), to support HTTP proxy authentication.
Updated the AWS SDK from version 1.11.688 to 1.11.1010 in the DynamoDB and Redshift Snap Packs, and added a custom SnapLogic User Agent header value.
Revised the names of the following Snap Packs. This does not affect your subscription to these Snap Packs.
Old Snap Pack Name | New Snap Pack Name |
---|---|
Confluent Kafka | Kafka |
Google Spreadsheet | Google Sheets |
Tableau 9&10 | Tableau |
The Snap Pack consisting of the Tableau 8 version Snaps (Tableau 8 Write and TDE 8 Formatter) is deprecated due to no customer usage. Contact support@snaplogic.com if your existing Pipelines use Snaps from the deprecated Snap Pack.
None.
SnapLogic's data automation solution speeds up the identification and integration of new data sources, and the migration of data from legacy systems. The solution can automatically detect duplicate, erroneous, or missing data, and identify structures and formats that do not match the data model. Data automation can accelerate the loading and transformation of your data into the data warehouse, speeding up the data-to-decisions process.
Starting with the 4.26 release, you can use the Snaps in the ELT Snap Pack to perform ELT operations on Databricks Lakehouse Platform (DLP). This is in addition to the existing list of supported target databases—Snowflake, Redshift, and Azure Synapse.
ELT Snaps automatically use a corresponding JDBC JAR file to connect to your target database and to perform the load and transform operations.
Database | Certified JDBC JAR File |
---|---|
Azure Synapse | mssql-jdbc-9.2.1.jre8.jar |
Databricks Lakehouse Platform (DLP) | SimbaSparkJDBC42-2.6.17.1021.jar |
Redshift | redshift-jdbc42-2.0.0.2.jar |
Snowflake | snowflake-jdbc-3.13.1.jar |
Though we recommend the JAR file versions above, you can choose a different version based on your environment.
Enhanced the ELT Snap preview to support the following Snowflake data types: array, object, variant, and timestamp.
The Snaps convert the values to hexadecimal (HEX) equivalents—the default setting for the session parameter BINARY_OUTPUT_FORMAT in Snowflake. See Session Parameters for Binary Values for more information.
If this setting is different from hexadecimal (such as base64) in the Snowflake table, the Snaps still convert the values to hexadecimal equivalents for rendering them in the Snap preview.
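To illustrate the difference between the two renderings (a generic encoding example, not Snowflake-specific code), the same four bytes look like this in HEX versus Base64:

```python
# Generic illustration of HEX vs. Base64 renderings of one binary value.
import base64

raw = b"snap"                                # example binary value
hex_form = raw.hex().upper()                 # HEX rendering (Snowflake default)
b64_form = base64.b64encode(raw).decode()    # BASE64 rendering
print(hex_form)   # 736E6170
print(b64_form)   # c25hcA==
```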
The ELT Database account is now mandatory for all Snaps in the ELT Snap Pack.
Starting with the 4.26 release, all Snaps in the ELT Snap Pack (except the ELT Copy Snap) require an account to connect to the respective target database. Your existing Pipelines that do not use an account may fail. We recommend that you associate an ELT Database account with each of the ELT Snaps (except the ELT Copy Snap) in your Pipelines.
ELT Pipelines created prior to the 4.24 GA release using one or more of the ELT Insert Select, ELT Merge Into, ELT Load, and ELT Execute Snaps may fail to show the expected preview data due to a common change made across the Snap Pack for the current release (4.26 GA). In this scenario, replace the Snap in your Pipeline with the same Snap from the Asset Palette and configure the Snap's Settings again.
In the case of Databricks Lakehouse Platform, the preview data of all ELT Snaps (during validation) contains a value with higher precision than the actual floating-point value (float data type) stored in the Delta table. For example, 24.123404659344 instead of 24.1234. However, the Snap reflects the exact values during Pipeline executions.
Column references that begin with an underscore (_) are not supported in the following SQL constructs:
- WHERE clause (ELT Filter Snap)
- WHEN clause
- ON condition (ELT Join and ELT Merge Into Snaps)
- HAVING clause
- QUALIFY clause
- AND condition
- Inside the SQL query editor (ELT Select and ELT Execute Snaps)

As a workaround while using these SQL query constructs, precede this Snap with an ELT Transform Snap to re-map the '_' column references to suitable column names (that do not begin with an underscore), and reference the new column names in the next Snap, as needed.
A query such as SELECT * FROM CSV.`/mnt/csv1.csv` returns default names such as _c0, _c1, and _c2 for the columns, which this Snap cannot interpret. To avoid this scenario, you can create a table with explicit column names from the CSV file, for example, CREATE TABLE csvdatatable (a1 int, b1 int, …) USING CSV `/mnt/csv1.csv`, where a1, b1, and so on are the new column names.

After the fix with 426Patches11262, the ELT Load Snap on Databricks Lakehouse Platform (DLP) may cause intermittent null-pointer exceptions only when you specify an incorrect source file path in either the DBFS Folder Path field or the File List > File field.
The query contains a LIMIT clause on a Snowflake, Redshift, or Databricks Lakehouse Platform target instance: the SQL query created during Pipeline validation includes an additional LIMIT clause, for example, SELECT * FROM "STORE_DATA"."ORDERS" LIMIT 10 LIMIT 990. A query that also contains an OFFSET clause is affected in the same way, for example, SELECT * FROM "STORE_DATA"."ORDERS" LIMIT 10 OFFSET 4 LIMIT 990.
SnapLogic Flows is a breakthrough new user interface that empowers business users to self-build new application integrations and data automations to support the specific needs of their respective functions and departments. Flows removes coding barriers and makes it easy for business users to develop and integrate applications such as Salesforce, Marketo, Google Sheets, and Slack, so they get the data and insights they need to make faster business decisions.
Flows also enables IT to step away from the core development of these solutions while giving them the ability to add requirements and guardrails for non-technical developers, so IT can centrally maintain visibility, control access, and review what is developed before it is pushed to production.
To get started, register for Flows.