Problem
Querying large amounts of data from an external location, such as Azure Blob Storage, can be a rigorous and time-consuming task when file sizes are large. The resulting data can be inaccurate, and there is a risk of data loss.
Solution
Using the Azure Synapse SQL Snap Pack, you can automate the querying process for loading bulk data. This solution is efficient because querying your storage data is simple, and cost-effective because data processing works on a pay-as-you-go model. Learn more about Azure Synapse Analytics pricing.
Download this solution.
Understanding the Solution
Prerequisites:
A valid Azure Storage Account. Learn more about creating an Azure Blob Storage Account.
A valid Azure Synapse SQL Account.
Steps

Step 1: Configure the Azure Synapse SQL - Bulk Load Snap.
   a. Specify the target table, BulkLoad_TC13_VK, into which the data from Blob Storage should load.
   b. Specify the File Name Pattern.
   c. Specify the Copy Arguments.
   d. Validate the Pipeline. You can view the generated query in the output.

Step 2: Configure the JSON Formatter Snap to format the output into JSON data.

Step 3: Configure the File Writer Snap to write the file to SLDB.

Step 4: Execute the Pipeline.
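Under the hood, the Bulk Load Snap issues a T-SQL COPY statement against Synapse built from the target table, file name pattern, and copy arguments you configure in Step 1. The following Python sketch assembles such a statement for illustration; the storage account, container, file pattern, and copy arguments shown are placeholders, not values from this Pipeline.

```python
# Sketch of how the Snap's settings map onto a Synapse COPY statement.
# All concrete values (account, container, pattern, arguments) are illustrative.

def build_copy_statement(table, storage_url, file_pattern, copy_args):
    """Assemble an Azure Synapse COPY INTO statement from Snap-style settings."""
    with_clause = ",\n    ".join(f"{k} = {v}" for k, v in copy_args.items())
    return (
        f"COPY INTO {table}\n"
        f"FROM '{storage_url}/{file_pattern}'\n"
        f"WITH (\n    {with_clause}\n)"
    )

stmt = build_copy_statement(
    table="dbo.BulkLoad_TC13_VK",
    storage_url="https://myaccount.blob.core.windows.net/mycontainer",  # placeholder
    file_pattern="*.csv",                                               # File Name Pattern
    copy_args={"FILE_TYPE": "'CSV'", "FIRSTROW": "2"},                  # Copy Arguments
)
print(stmt)
```

Validating the Pipeline surfaces the equivalent generated query in the Snap's output view, which is a quick way to confirm the table, pattern, and arguments before executing.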
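Steps 2 and 3 can be approximated outside SnapLogic as serializing the Snap's document output to JSON and writing it to a file. The record fields below are assumptions for illustration, and a local path stands in for SLDB (SnapLogic's internal file store).

```python
import json

# Illustrative document output from the Bulk Load Snap; field names are assumed.
records = [{"status": "COPY command executed", "table": "BulkLoad_TC13_VK"}]

# JSON Formatter equivalent: serialize the documents to a JSON string.
formatted = json.dumps(records, indent=2)

# File Writer equivalent: write the formatted JSON to a file
# (a local path stands in for an SLDB destination).
with open("bulkload_output.json", "w") as f:
    f.write(formatted)
```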