In this article
...
This Snap uses the SQL UPDATE command internally to update the specified rows in the target table with new values.
Info |
---|
To update records in bulk efficiently, use the Snowflake - Bulk Upsert Snap instead of the Snowflake - Update Snap. The Snowflake Bulk Snaps use Snowflake's Bulk API, which improves performance. |
Support for Ultra Pipelines
Works in Ultra Pipelines.
...
Known Issues
Snap Views
Type | Format | Number of views | Examples of Upstream and Downstream Snaps | Description |
---|---|---|---|---|
Input | Document | | | Table Name, record ID, and record details. |
Output | Document | | | Updates a record. |
Error | Error handling is a generic way to handle errors without losing data or failing the Snap execution. You can handle the errors that the Snap might encounter while running the pipeline by choosing one of the following options from the When errors occur list under the Views tab: Stop Pipeline Execution, Discard Error Data and Continue, and Route Error Data to Error View. Learn more about Error handling in Pipelines. |
Snap Settings
Field Name | Field Type | Description | |
---|---|---|---|
Label* Default Value: Snowflake - Update | String | Specify the name for the Snap. You can modify this to be more specific, especially if you have more than one of the same Snap in your pipeline. | |
Schema Name Default Value: N/A | String/Expression/Suggestion | Specify the database schema name. If it is not defined, the Table Name suggestion retrieves the table names of all schemas. The property is suggestible and retrieves the available database schemas during suggestions. You can pass the values using pipeline parameters but not upstream parameters. | |
Table Name* Default Value: N/A | String/Expression/Suggestion | Specify the name of the table in the instance. The table name is suggestible and requires an account setting. You can pass the values using pipeline parameters but not upstream parameters. | |
Update condition Default Value: N/A | String/Expression | Specify the SQL WHERE clause of the UPDATE statement. You can define specific values or columns to update (the Set condition) in an upstream Snap, such as a Mapper Snap, and then use the WHERE clause to apply these conditions to the columns sourced from the upstream Snap. If the Update condition field is left blank, the update is applied to all records in the target table. Refer to the example to understand how to use the Update condition. | |
Number of retries Default Value: 0 | Integer/Expression | Specify the maximum number of attempts to be made to receive a response. The request is terminated if the attempts do not result in a response. If the value is larger than 0, the Snap first downloads the target file into a temporary local file; ensure that the local drive has free disk space at least as large as the expected target file size. If any error occurs during the download, the Snap waits for the time specified in the Retry interval and attempts to download the file again from the beginning. When the download is successful, the Snap streams the data from the temporary file to the downstream pipeline. All temporary local files are deleted when they are no longer needed. | |
Retry interval (seconds) Default Value: 1 | Integer/Expression | Specify the time interval between two successive retry requests. A retry happens only when the previous attempt resulted in an exception. | |
Manage Queued Queries Default Value: Continue to execute queued queries when pipeline is stopped or if it fails | Dropdown list | Choose an option to determine whether the Snap should continue or cancel the execution of queued queries when the pipeline stops or fails. If you select Cancel queued queries when pipeline is stopped or if it fails, the read queries under execution are cancelled, whereas the write queries under execution are not. Snowflake internally determines which queries are safe to cancel and cancels those queries. | |
Snap Execution Default Value: Execute only | Dropdown list | Select one of the three modes in which the Snap executes. Available options are: Validate & Execute, Execute only, and Disabled. | |
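To illustrate how the Set condition and the Update condition fit together, here is a minimal sketch of how an UPDATE statement could be assembled from an upstream document and a WHERE clause. The table and column names (`ADOBEDATA`, `FIRSTNAME`, `ID`) are illustrative assumptions, and this is not the Snap's actual implementation:

```python
# Hypothetical sketch: combining upstream SET values with the Update
# condition (WHERE clause). Names are illustrative, not the Snap's API.

def build_update(table, set_values, update_condition=None):
    """Build a parameterized UPDATE statement from a document of values."""
    assignments = ", ".join(f"{col} = ?" for col in set_values)
    sql = f"UPDATE {table} SET {assignments}"
    if update_condition:  # a blank condition updates all rows
        sql += f" WHERE {update_condition}"
    return sql, list(set_values.values())

sql, params = build_update(
    "ADOBEDATA",
    {"FIRSTNAME": "Jane", "LASTNAME": "Doe"},
    update_condition="ID = 101",
)
# sql -> "UPDATE ADOBEDATA SET FIRSTNAME = ?, LASTNAME = ? WHERE ID = 101"
```

Note that when `update_condition` is omitted, no WHERE clause is emitted, matching the documented behavior of a blank Update condition field.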
Info |
---|
Instead of building multiple Snaps with interdependent DML queries, we recommend that you use the Stored Procedure or the Multi Execute Snap. |
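The interaction of the Number of retries and Retry interval settings described above can be sketched as follows. This is a minimal illustration of the documented behavior (retry only after an exception, waiting between attempts), not the Snap's actual implementation:

```python
# Hypothetical sketch of "Number of retries" and "Retry interval (seconds)":
# retry only when the previous attempt raised an exception.
import time

def with_retries(request, number_of_retries=0, retry_interval=1):
    attempts = number_of_retries + 1  # the first attempt plus the retries
    for attempt in range(attempts):
        try:
            return request()
        except Exception:
            if attempt == attempts - 1:
                raise  # retries exhausted: the request is terminated
            time.sleep(retry_interval)

calls = []
def flaky():
    calls.append(1)
    if len(calls) < 3:
        raise ConnectionError("no response")
    return "response"

# Succeeds on the third attempt (one initial attempt plus two retries).
result = with_retries(flaky, number_of_retries=2, retry_interval=0)
```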
Examples
...
Encode Binary data type and update records in Snowflake
The following example Pipeline demonstrates how you can encode binary data (biodata of the employee) and update employee records in the Snowflake database.
...
Configure the File Reader Snap to read data (employee details) from the SnapLogic database.
...
Next, configure the Binary to Document and Snowflake - Select Snaps. Select ENCODE_BASE64 in the Encode or Decode field of the Binary to Document Snap to enable the Snap to encode the binary data.
Binary to Document Snap Configuration | Snowflake - Select Configuration |
---|
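Conceptually, the ENCODE_BASE64 option turns the raw binary payload into a Base64 string so it can travel inside a JSON document. A minimal sketch, using an illustrative payload rather than real employee data:

```python
# Hypothetical sketch of what ENCODE_BASE64 does to the binary input:
# the raw bytes are Base64-encoded so they fit in a JSON document.
import base64

binary_bio = b"\x89PNG..."  # illustrative binary payload, not real data
encoded = base64.b64encode(binary_bio).decode("ascii")

# Snowflake's to_binary(<value>, 'base64') reverses this encoding,
# which is what the update condition later in this example relies on.
decoded = base64.b64decode(encoded)
assert decoded == binary_bio
```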
...
Configure the Join Snap to join the document streams from both upstream Snaps using the Outer join type.
...
Configure the Mapper Snap to pass the incoming data to the Snowflake - Update Snap. Note that the target schema types for Bio and Text are binary and varbinary, respectively.
...
Configure the Snowflake - Update Snap to update the existing records in the Snowflake database with the inputs from the upstream Mapper Snap. Use the update condition "BIO = to_binary( '"+$BIO+"','base64')" to update the records.
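The update condition above is a SnapLogic expression that concatenates each document's Base64 BIO value into the WHERE clause. A minimal Python sketch of that string composition, using an illustrative Base64 value:

```python
# Hypothetical sketch: composing the per-document update condition
# "BIO = to_binary('" + $BIO + "','base64')" in Python terms.

document = {"BIO": "iVBORw0KGgo="}  # illustrative Base64 value, not real data

condition = "BIO = to_binary('" + document["BIO"] + "','base64')"
# condition -> "BIO = to_binary('iVBORw0KGgo=','base64')"
```

Each incoming document therefore yields its own WHERE clause, so only the row whose BIO column matches the document's (decoded) value is updated.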
...
Upon validation, the Snap updates the records in the Snowflake database.
...
Next, connect a JSON Formatter Snap to the Snowflake - Update Snap, and finally configure the File Writer Snap to write the output to a file.
...
...
The following example shows how to update a record in a Snowflake object using the Snowflake - Update Snap:
...
The Snowflake - Update Snap updates the table ADOBEDATA in the schema PRASANNA.
...
Snap Pack History