...
ASI Support for Snowflake in SnapLogic ELT
Prerequisites
To use the ELT Load Snap's ASI feature with your Snowflake instance, you need the following minimum access privileges on the corresponding objects:
| Privilege | Object |
|---|---|
| USAGE | Storage integration |
| CREATE STAGE | Schema |
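These privileges might be granted in Snowflake as follows; the role, integration, and schema names below are hypothetical and should be adjusted to your environment:

```sql
-- Hypothetical role and object names; adjust for your environment.
GRANT USAGE ON INTEGRATION my_storage_integration TO ROLE elt_load_role;
GRANT CREATE STAGE ON SCHEMA my_db.my_schema TO ROLE elt_load_role;
```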
...
**Info: ASI for Snowflake and our Offering**

Snowflake's INFER_SCHEMA feature provides the capability to automatically infer schema from PARQUET, AVRO, and ORC source file formats. SnapLogic provides you with the additional ability to infer schema from CSV and JSON source files. The ELT Load Snap can read these files from an Amazon S3, Azure Blob Storage (WASB), ADLS Gen2, or Google Cloud Storage location and use the schema information to create tables in a target Snowflake instance.
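For reference, Snowflake's INFER_SCHEMA table function can also be invoked directly in SQL. A minimal sketch, assuming a stage named `@my_stage` and a file format named `my_parquet_format` (both hypothetical names):

```sql
-- Hypothetical stage and file format names; adjust for your environment.
CREATE OR REPLACE FILE FORMAT my_parquet_format TYPE = PARQUET;

SELECT COLUMN_NAME, TYPE, NULLABLE
FROM TABLE(
  INFER_SCHEMA(
    LOCATION => '@my_stage/data/',
    FILE_FORMAT => 'my_parquet_format'
  )
);
```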
| Ability / Support | Description |
|---|---|
| Source file locations | The ASI feature for Snowflake requires that your source files are available in an Amazon S3, WASB, ADLS Gen2, or Google Cloud Storage location. |
| Source file formats | CSV, JSON, PARQUET, AVRO, and ORC. |
| Schema inference mechanism | When you set the ELT Load Snap to automatically infer the source schema, the Snap reads the first 400 records of each source file to detect column names and data types. Full file scans are not currently supported. |
| Merging inferred schema | When you have multiple source files to read and load data from, the ASI feature reads all files (only the first 400 records of each) and automatically merges the new columns and data types detected in subsequent files with the schema detected from the first file. For accurate schema inference, ensure that the column names and the number of columns in each file's header match those in the remaining specified files. |
| Date and time formats | You can specify DATE_FORMAT or TIME_FORMAT as an additional format option along with INFERSCHEMA = TRUE in the File Format Option List; only the default value (AUTO) is supported for these options. Alternatively, you can use DATE_FORMAT = 'AUTO' or TIME_FORMAT = 'AUTO'. Learn more at Using automatic recognition with DATEFORMAT and TIMEFORMAT - Snowflake. |
| Compressed file formats | The ASI feature works for compressed files stored in an Amazon S3 location, but not in WASB, ADLS Gen2, or Google Cloud Storage locations. |
| Authentication types | Source Location Credentials or Storage Integration for Amazon S3, WASB, and ADLS Gen2; Storage Integration for Google Cloud Storage. |
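The schema-merge behavior described above can be sketched in a few lines. This is illustrative logic only, not SnapLogic's implementation; the 400-record sample size and the widen-on-conflict merge rule are taken from the description, while the type-inference rules are simplified assumptions:

```python
# Illustrative sketch (not SnapLogic's code) of per-file schema inference
# from a 400-record sample, followed by a merge across files.
import csv

SAMPLE_ROWS = 400  # ASI samples only the first 400 records per file


def infer_type(value: str) -> str:
    """Crude type inference for a single CSV value (simplified assumption)."""
    try:
        int(value)
        return "INTEGER"
    except ValueError:
        pass
    try:
        float(value)
        return "DOUBLE"
    except ValueError:
        return "VARCHAR"


def widen(a: str, b: str) -> str:
    """Pick the wider of two inferred types."""
    order = {"INTEGER": 0, "DOUBLE": 1, "VARCHAR": 2}
    return a if order[a] >= order[b] else b


def infer_schema(path: str) -> dict:
    """Infer {column: type} from the first SAMPLE_ROWS records of one file."""
    schema = {}
    with open(path, newline="") as f:
        for i, row in enumerate(csv.DictReader(f)):
            if i >= SAMPLE_ROWS:
                break
            for col, val in row.items():
                schema[col] = widen(schema.get(col, "INTEGER"), infer_type(val))
    return schema


def merge_schemas(schemas):
    """Merge per-file schemas: new columns are added, conflicting types widened."""
    merged = {}
    for s in schemas:
        for col, t in s.items():
            merged[col] = widen(merged[col], t) if col in merged else t
    return merged
```

A second file that introduces a new `score` column would simply extend the merged schema, while a column whose values conflict across files is widened (for example, INTEGER in one file and VARCHAR in another merges to VARCHAR).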
Other Compatible File Format Options
...
- You cannot infer some Snowflake-specific data types, such as BINARY and GEOGRAPHY, from CSV files using this Snap. The Snap assigns the VARCHAR data type to columns that contain BINARY or GEOGRAPHY values.
- Support is only for the default format values (AUTO) for the DATE_FORMAT, TIME_FORMAT, and TIMESTAMP_FORMAT file format options. Learn more about these file format options in Snowflake Documentation | Format Type Options.
- Because the Snap reads only the first 400 rows of data in each source file for ASI, it may not detect the maximum size or value in a column. SnapLogic mitigates the risk of very large values arriving in smaller-sized data type columns by assigning the maximum-size data type to such columns. This may result in:
  - Higher memory consumption than required.
  - Loss of precision and scale when reading and assigning the DECIMAL data type.

  This limitation will be addressed when the Full file scan mode is introduced.
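The sampling trade-off is easy to see in a toy example. This is assumed, illustrative logic (not SnapLogic's code), using the 400-record sample size stated above:

```python
# Toy illustration (not SnapLogic's code): inferring a column width from a
# 400-row sample can miss a longer value that appears later in the file.
SAMPLE_ROWS = 400


def inferred_width(values, sample=SAMPLE_ROWS):
    """Maximum string length seen in the sampled prefix of the column."""
    return max(len(v) for v in values[:sample])


# 400 short values followed by one long value the sample never sees.
column = ["ok"] * 400 + ["x" * 5000]

print(inferred_width(column))        # width inferred from the sample: 2
print(max(len(v) for v in column))   # actual width needed: 5000
```

Assigning the maximum-size data type up front avoids truncating that late, oversized value, at the cost of over-allocating for columns that never need it.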
- Although the number of records read for ASI is only 400, reading individual records that are unusually large may increase I/O overhead and affect the Snap's performance.
- If the JCC and the source data files are not colocated, the additional network overhead may affect the performance of the ELT Load Snap, depending on the record size and the memory and I/O resources available to the JCC.
- Due to a limitation in Snowflake when inferring source files (and thereby the source schema) from file name patterns in a Storage Integration-based session, the Snap fails to load data from any supported source location into your Snowflake target instance based on any File Name Pattern that you specify. As a workaround, we recommend that you use the File List field set to define the source data file locations for the Snap to load the data.
...