In this article
...
Field Name | Field Type | Description
---|---|---
Label*<br>Default Value: Snowflake - Update | String | Specify the name for the Snap. You can modify this to be more specific, especially if you have more than one of the same Snap in your pipeline.
Schema Name<br>Default Value: N/A | String/Expression/Suggestion | Specify the database schema name. If it is not defined, the suggestion for the Table Name retrieves the table names of all schemas. The property is suggestible and retrieves the available database schemas during suggest values. The values can be passed using pipeline parameters but not upstream parameters.
Table Name*<br>Default Value: N/A | String/Expression/Suggestion | Specify the name of the table in the instance. The table name is suggestible and requires an account setting. The values can be passed using pipeline parameters but not upstream parameters.
Update condition*<br>Default Value: N/A | String/Expression | Specify the condition on which you want to execute the update.
Number of retries<br>Default Value: 0 | Integer/Expression | Specify the maximum number of attempts to be made to receive a response. The request is terminated if the attempts do not result in a response. Ensure that the local drive has free disk space at least as large as the expected target file size. If the value is greater than 0, the Snap first downloads the target file into a temporary local file. If any error occurs during the download, the Snap waits for the time specified in Retry interval and attempts to download the file again from the beginning. When the download succeeds, the Snap streams the data from the temporary file to the downstream pipeline. All temporary local files are deleted when they are no longer needed.
Retry interval (seconds)<br>Default Value: 1 | Integer/Expression | Specify the time interval between two successive retry requests. A retry happens only when the previous attempt resulted in an exception.
Manage Queued Queries<br>Default Value: Continue to execute queued queries when pipeline is stopped or if it fails | Dropdown list | Select whether the Snap should continue or cancel the execution of queued Snowflake Execute SQL queries when you stop the pipeline. If you select Cancel queued queries when pipeline is stopped or if it fails, read queries under execution are cancelled, whereas write queries under execution are not. Snowflake internally determines which queries are safe to cancel and cancels only those.
Snap Execution<br>Default Value: Execute only | Dropdown list | Select one of the three modes in which the Snap executes: Validate & Execute, Execute only, or Disabled.
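The retry behavior described for Number of retries and Retry interval can be sketched as follows. This is an illustrative model under stated assumptions, not the Snap's actual implementation; the function and variable names are hypothetical:

```python
import time

def download_with_retries(fetch, max_retries=0, retry_interval=1):
    # Illustrative model: each attempt re-downloads the target from the
    # beginning; on failure, wait retry_interval seconds and retry.
    attempt = 0
    while True:
        try:
            return fetch()  # download the full target into a temporary file
        except Exception:
            if attempt >= max_retries:
                raise  # retries exhausted: the request is terminated
            attempt += 1
            time.sleep(retry_interval)  # "Retry interval (seconds)"

# A source that fails once, then succeeds on the second attempt.
calls = {"n": 0}
def flaky_download():
    calls["n"] += 1
    if calls["n"] == 1:
        raise IOError("connection reset")
    return b"target file contents"

result = download_with_retries(flaky_download, max_retries=3, retry_interval=0)
print(result)  # b'target file contents'
```

Note that a retry happens only when the previous attempt raised an exception, matching the Retry interval description above.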
...
The following example pipeline demonstrates how you can encode binary data (an employee's biodata) and update employee records in the Snowflake database.
...
First, we configure the File Reader Snap to read the employee details from the SnapLogic database.
...
Next, we configure the Binary to Document and Snowflake - Select Snaps. In the Binary to Document Snap, we select ENCODE_BASE64 in the Encode or Decode field so that the Snap encodes the binary data.
(Screenshots: Binary to Document Snap and Snowflake - Select Snap configurations.)
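A minimal sketch of what ENCODE_BASE64 does to the binary payload: the bytes are base64-encoded into a string so they can travel inside a JSON document. The field name BIO matches the example; the byte content is a hypothetical stand-in:

```python
import base64

# Hypothetical binary payload standing in for the employee biodata.
binary_bio = b"employee biodata bytes"

# ENCODE_BASE64 turns the bytes into a base64 string inside the document.
document = {"BIO": base64.b64encode(binary_bio).decode("ascii")}
print(document["BIO"])  # ZW1wbG95ZWUgYmlvZGF0YSBieXRlcw==

# Decoding the string restores the original binary data.
assert base64.b64decode(document["BIO"]) == binary_bio
```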
Then, we configure the Join Snap to join the document streams from both upstream Snaps using the Outer join type.
...
We configure the Mapper Snap to pass the incoming data to the Snowflake - Update Snap. Note that the target schema types for Bio and Text are binary and varbinary, respectively.
...
We configure the Snowflake - Update Snap to update the existing records in the Snowflake database with the input from the upstream Mapper Snap. We use the update condition `"BIO = to_binary( '"+$BIO+"','base64')"` to update the records.
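As a rough illustration (not SnapLogic's expression engine), the update condition concatenates the incoming document's BIO field into a Snowflake SQL condition string. The document value below is a hypothetical base64 string:

```python
# Hypothetical document arriving from the upstream Mapper Snap.
doc = {"BIO": "aGVsbG8="}

# The expression "BIO = to_binary( '" + $BIO + "','base64')" concatenates
# the document's BIO field into a SQL condition for the update.
condition = "BIO = to_binary( '" + doc["BIO"] + "','base64')"
print(condition)  # BIO = to_binary( 'aGVsbG8=','base64')
```

Snowflake's `to_binary(..., 'base64')` then decodes the string back to binary on the server side, so the condition compares against the stored binary BIO column.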
...
Upon validation, the Snap updates the records in the Snowflake database.
...
Next, we connect a JSON Formatter Snap to the Snowflake - Update Snap and configure a File Writer Snap to write the output to a file.
...
...
The following example shows how to update a record in a Snowflake object using the Snowflake - Update Snap:
...
The Snowflake - Update Snap updates the table ADOBEDATA in the schema PRASANNA.
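Conceptually, the update resembles the statement sketched below. This is a hedged illustration: the column names and condition are assumptions for the example, not the Snap's actual generated SQL:

```python
# Sketch of the UPDATE statement the Snap's operation resembles.
# Column names and the condition are illustrative assumptions.
schema, table = "PRASANNA", "ADOBEDATA"
set_clause = "NAME = %(NAME)s"      # SET columns come from the input document
where_clause = "ID = %(ID)s"        # corresponds to the Snap's Update condition
sql = f"UPDATE {schema}.{table} SET {set_clause} WHERE {where_clause}"
print(sql)  # UPDATE PRASANNA.ADOBEDATA SET NAME = %(NAME)s WHERE ID = %(ID)s
```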
...