...
Problem
Querying huge amounts of data from an external location, such as Azure Blob Storage, can be a rigorous and time-consuming task if the file size is very large. The resulting queried data can be inaccurate, and there is a possibility of data loss.
Solution
Using the Azure Synapse SQL Snap Pack, you can automate the querying process for loading bulk data. This solution is efficient because it makes querying your storage data easy, and it is cost-effective because the data processing works on the pay-as-you-go model. Learn more about the Azure Synapse Analytics pricing.
...
Understanding the Solution
Prerequisites:
A valid Azure Storage Account. Learn more about creating an Azure Blob Storage Account.
A valid Azure Synapse SQL Account.
...
Step 1: Configure the Azure Synapse SQL Account as follows:
...
Step 2: Configure the Azure Synapse SQL - Bulk Load Snap:
...
a. Specify BulkLoad_TC13_VK as the target table into which the data from Blob Storage should load.
...
c. The Copy Argument MAXERRORS=1000 allows the Snap to ignore up to 1000 record errors and continue with the execution.
d. On validating the Pipeline, you can view the following query in the output:
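The exact query the Snap generates depends on your account and file settings, but with the settings above it takes the general shape of an Azure Synapse COPY INTO statement. In this sketch, the storage URL, file name, and file type are placeholder assumptions; only the table name and the MAXERRORS value come from the steps above:

```sql
-- Hedged sketch of the generated query. The storage account, container,
-- and file path are placeholders, not values from the actual Pipeline.
COPY INTO BulkLoad_TC13_VK
FROM 'https://<storage-account>.blob.core.windows.net/<container>/<file>.csv'
WITH (
    FILE_TYPE = 'CSV',
    MAXERRORS = 1000  -- ignore up to 1000 record errors and continue
);
```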
...
Step 3: Configure the JSON Formatter Snap to format the output into JSON data.
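Conceptually, the JSON Formatter Snap serializes the upstream documents into a JSON byte stream that the downstream File Writer can persist. A minimal Python sketch of that idea, using an illustrative record (the field names are assumptions, not taken from the actual output):

```python
import json

# Example document as it might arrive from the Bulk Load Snap's output view.
# These field names are hypothetical, for illustration only.
rows = [
    {"status": "success", "rows_loaded": 1000},
]

# Serialize the documents into formatted JSON text, analogous to the
# JSON Formatter Snap preparing the data for the File Writer.
formatted = json.dumps(rows, indent=2)
print(formatted)
```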
...
Step 4: Configure the File Writer Snap to write the file to SLDB. After validating the Pipeline, you can download the file from SnapLogic Manager to your local drive.
...
Step 5: Execute the Pipeline.
Downloads
...