| Snap type: | Write |
| --- | --- |
| Description: | The Snap performs a bulk load operation from the input view document stream into the target table using the SQLServerBulkCopy API. It uses an in-memory buffer to send records to the target table instead of a temporary CSV file. The Batch size and Bulk copy timeout values can be used to tune performance and memory use. ETL Transformations and Data Flow: The input document stream is converted into multiple batches, which are bulk-loaded into the target table using the SQLServerBulkCopy API. The Snap converts the input data values, according to the corresponding SQL Server column data types, into the Java class objects that SQLServerBulkCopy accepts. |
| Input & Output: | |
| Prerequisites: | The SQL Server JDBC driver is required in the Azure SQL database account. SQL Server JDBC driver versions 4.1 and older do not support the SQLServerBulkCopy API. |
| Limitations and Known Issues: | |
| Configurations: | Accounts and Access: This Snap uses account references created on the Accounts page of SnapLogic Manager to handle access to this endpoint. See Azure SQL Account for information on setting up this type of account. Views: |
| Troubleshooting: | None. |
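The batching behavior described above can be sketched as follows. This is a minimal, illustrative Java sketch of how a document stream is grouped into batches of at most Batch size rows before each bulk-copy call; it is not the Snap's actual implementation, and the generic record type is an assumption.

```java
import java.util.ArrayList;
import java.util.List;

public class BatchBuffer {
    // Splits an incoming record list into batches of at most batchSize rows,
    // mirroring how the Snap groups documents before each bulk-copy call.
    static <T> List<List<T>> toBatches(List<T> records, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < records.size(); i += batchSize) {
            batches.add(records.subList(i, Math.min(i + batchSize, records.size())));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> rows = new ArrayList<>();
        for (int i = 0; i < 25; i++) rows.add(i);
        List<List<Integer>> batches = toBatches(rows, 10);
        System.out.println(batches.size());        // 3 batches: 10 + 10 + 5 rows
        System.out.println(batches.get(2).size()); // last batch holds the remainder
    }
}
```

A smaller Batch size lowers peak memory use at the cost of more round trips; a larger one does the opposite, which is the trade-off the Batch size setting exposes.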
Settings

| Setting | Description |
| --- | --- |
| Label | Required. The name for the Snap. You can modify this to be more specific, especially if you have more than one of the same Snap in your pipeline. |
| Schema Name | The database schema name. If it is not defined, the suggestion for Table Name retrieves the table names of all schemas. The property is suggestible and retrieves the available database schemas during suggest values. Example: SYS. Default value: None |
| Table Name | Required. The target table to load the incoming data into. Example: users. Default value: None |
| Create table if not present | Select this check box to create the target table if it does not exist; otherwise, the Snap throws a "table not found" error. Default value: Deselected |
| Batch size | The number of rows in each batch. Example: 1000. Default value: 10000 |
| Bulk copy timeout (sec) | The number of seconds allowed for each batch operation to complete before it times out. Default value: 60 |
| Advanced properties | The following additional options for SQLServerBulkCopy are available (true or false; default: false). Check constraints: whether constraints are checked while data is being inserted. Fire triggers: whether the server fires insert triggers for the rows being inserted into the database. Keep identity: whether to preserve source identity values. Keep nulls: whether to preserve null values in the destination table regardless of the settings for default values, or to replace them with default values where applicable. Table lock: whether SQLServerBulkCopy obtains a bulk update lock for the duration of the bulk copy operation. Use internal transaction: whether each batch of the bulk copy operation occurs within a transaction. Refer to the Microsoft documentation for further detail. |
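The Batch size, Bulk copy timeout, and Advanced properties settings correspond to setters on SQLServerBulkCopyOptions in the Microsoft JDBC driver. The following Java sketch shows how the same configuration might be applied when using the driver API directly. The connection URL, credentials, and table names are placeholders; running it requires the mssql-jdbc driver and a reachable SQL Server database, so treat it as an illustrative configuration sketch rather than a working program.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import com.microsoft.sqlserver.jdbc.SQLServerBulkCopy;
import com.microsoft.sqlserver.jdbc.SQLServerBulkCopyOptions;

public class BulkLoadSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details -- supply your own server and credentials.
        String url = "jdbc:sqlserver://<server>.database.windows.net:1433;databaseName=<db>";
        try (Connection src = DriverManager.getConnection(url, "<user>", "<password>");
             Connection dst = DriverManager.getConnection(url, "<user>", "<password>")) {

            SQLServerBulkCopyOptions options = new SQLServerBulkCopyOptions();
            options.setBatchSize(10000);          // the Snap's Batch size default
            options.setBulkCopyTimeout(60);       // the Snap's Bulk copy timeout (sec) default
            options.setCheckConstraints(false);   // Advanced properties, all default false
            options.setFireTriggers(false);
            options.setKeepIdentity(false);
            options.setKeepNulls(false);
            options.setTableLock(false);
            options.setUseInternalTransaction(false);

            try (SQLServerBulkCopy bulkCopy = new SQLServerBulkCopy(dst)) {
                bulkCopy.setBulkCopyOptions(options);
                bulkCopy.setDestinationTableName("dbo.datatypetest");
                // Placeholder source query; the Snap instead feeds buffered input documents.
                ResultSet rs = src.createStatement()
                        .executeQuery("SELECT * FROM dbo.sourcetable");
                bulkCopy.writeToServer(rs);
            }
        }
    }
}
```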
In this pipeline, the Azure SQL Bulk Load Snap loads the data from the input stream. The data is bulk-loaded into the table "dbo"."datatypetest".
On successful execution, the pipeline displays the following output preview with the status of the records loaded:
The key configurations for the Snap are:
In the pipeline below, the values are passed from the upstream Snap for the Azure SQL Bulk Load Snap to update the table "dbo"."@prasanna1" on Azure.
The Azure SQL Bulk Load Snap loads the data into the table, and the Azure SQL Execute Snap then reads the table contents:
With Expressions
In the pipeline below:
The following describes a pipeline, with broader business logic involving multiple ETL transformations, that shows how the Azure SQL Bulk Load functionality is typically used in an enterprise environment. The pipeline download is available below.
In the pipeline below, the records from a table on the SQL Server are loaded into a table on Azure SQL. The Azure SQL Execute Snap then reads the loaded records from the Azure SQL table.
Extract: The SQL Server Select Snap reads the records from a table on the SQL Server.
Transform: The Mapper Snap maps the metadata from the input schema (SQL Server) to the output schema (Azure SQL).
Load: The Azure SQL Bulk Load Snap loads the records into the Azure SQL table.
Read: The Azure SQL Execute Snap reads the loaded records on the Azure SQL table.
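The Transform step above is essentially a field-renaming pass from the source schema to the target schema. As a rough illustration (the column names here are hypothetical, and this is not the Mapper Snap's implementation), mapping one record could look like:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class SchemaMapper {
    // Renames fields of a source record (SQL Server schema) to the
    // target schema (Azure SQL), passing values through unchanged.
    static Map<String, Object> map(Map<String, Object> source, Map<String, String> fieldMap) {
        Map<String, Object> target = new LinkedHashMap<>();
        for (Map.Entry<String, String> e : fieldMap.entrySet()) {
            target.put(e.getValue(), source.get(e.getKey()));
        }
        return target;
    }

    public static void main(String[] args) {
        // Hypothetical column names, not taken from the pipeline above.
        Map<String, String> fieldMap = new LinkedHashMap<>();
        fieldMap.put("EMP_ID", "employee_id");
        fieldMap.put("EMP_NAME", "employee_name");

        Map<String, Object> row = new LinkedHashMap<>();
        row.put("EMP_ID", 42);
        row.put("EMP_NAME", "Ada");

        System.out.println(map(row, fieldMap)); // {employee_id=42, employee_name=Ada}
    }
}
```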
In a similar enterprise scenario, the records from the Oracle server are loaded into the Azure SQL Server. The loaded records are transformed to JSON and written to a file. The Azure SQL Execute Snap reads the records from the table on Azure SQL.
The pipeline download is available below.
Extract: The Oracle Select Snap reads the records from a table on the Oracle server.
Transform: The JSON Formatter Snap transforms the output records into JSON format and writes them to a file using the File Writer Snap.
Load: The Azure SQL Bulk Load Snap loads the records into the Azure SQL table.
Read: The Azure SQL Execute Snap reads the loaded records on the Azure SQL table.