...
SnapLogic and Snowflake simplify cloud-based data integration and warehousing through standard functionality and other Snaps. To connect multiple data sources and analytics tools to the Snowflake cloud data warehouse, configure the 12 pre-built Snowflake Snaps. This diagram represents the integration of SnapLogic and Snowflake:
...
Prerequisites
Key Steps in Configuring Snowflake S3 Dynamic Account
...
Account Properties
...
To connect to Snowflake, you can configure the Snowflake S3 Dynamic Snap Account using the following key steps.
Specify JDBC JAR details: Add details on JDBC JARs, Hostname, Port Number, Authentication Type, Password, Database Name, Warehouse Name, and JDBC Driver class.
Specify Amazon S3 Storage details: S3 Access Keys
Advanced Properties Section
Specify URL properties: (Optional) Add additional URL property details as needed.
Specify other details related to batch size and instance specifications.
Account Properties
Add details on S3 Bucket, S3 Folder, S3 Access-key ID, S3 Secret key, and S3 AWS Token.
Specify the JDBC JAR details
To establish a connection to the database, you must specify the JDBC JAR file details. By default, the Snowflake Snap Pack comes bundled with the JDBC JAR V3.13.25 file, ensuring successful authentication even if you do not provide a specific JDBC driver. If you prefer to use a custom JAR file version, you can upload it by accessing the database icon under Account Properties, as shown below:
...
Configure additional settings related to the JDBC in the Account Properties section. For more information, refer to Snowflake Dynamic Accounts.
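As a rough illustration, the hostname, port, database, and warehouse settings above combine into a single Snowflake JDBC URL. This is a minimal sketch; the account hostname and property values below are hypothetical placeholders:

```python
# Sketch: how the account settings combine into a Snowflake JDBC URL.
# The hostname, database, and warehouse values are hypothetical.

def build_jdbc_url(hostname: str, port: int = 443, **props: str) -> str:
    """Assemble a jdbc:snowflake:// URL from account settings."""
    query = "&".join(f"{key}={value}" for key, value in props.items())
    url = f"jdbc:snowflake://{hostname}:{port}/"
    return f"{url}?{query}" if query else url

url = build_jdbc_url(
    "myaccount.snowflakecomputing.com",
    db="MY_DB",
    warehouse="MY_WH",
)
print(url)
# jdbc:snowflake://myaccount.snowflakecomputing.com:443/?db=MY_DB&warehouse=MY_WH
```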
Specify Amazon Simple Storage Service (Amazon S3) Storage details
To specify the Amazon Simple Storage Service (Amazon S3) Storage account details in the Account properties, the following configuration information from Snowflake is essential:
Snowflake Configurations
...
To configure the S3 account details, you must have the following:
Snowflake Access in Amazon S3 Storage Account: Verify that the Snowflake user has appropriate access to the Amazon S3 Storage account.
Secure Access to Cloud Storage
...
...
Snowflake Access in Amazon S3 Storage Account
...
You need an AWS administrator from your organization to grant Snowflake the required access to your Amazon S3 storage account. Learn more at Virtual Private Cloud IDs for Snowflake Account.
...
Secure Access to Cloud Storage
Snowflake offers two methods for configuring Identity and Access Management (IAM) to facilitate reading and writing data from and to an S3 bucket. You must ensure that the bucket's security and access management policies permit Snowflake's access.
Configure Cloud Storage Integration in Snowflake: Set up the necessary cloud storage integration within Snowflake to establish the connection between Snowflake and the S3 bucket.
Configure an AWS IAM User: Create an AWS IAM user with the appropriate permissions to access the S3 bucket.
Info |
---|
The option to configure an AWS IAM Role to access Amazon S3 is now deprecated by Snowflake and cannot be used. |
...
This includes granting the necessary read and write permissions to facilitate data interaction between Snowflake and the S3 bucket.
Configure Cloud Storage Integration in Snowflake
To use the storage integrations for Snowflake, an administrator must provide all the necessary IAM user permissions in the AWS account. Learn more at https://docs.snowflake.com/en/user-guide/data-load-s3-config-storage-integration
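The Snowflake side of this setup is typically created with a CREATE STORAGE INTEGRATION statement, as described in the linked Snowflake guide. The sketch below shows the general shape of that DDL; the integration name, IAM role ARN, and bucket path are hypothetical placeholders:

```python
# Sketch of the Snowflake DDL an administrator runs to create the storage
# integration. The integration name, IAM role ARN, and bucket path below
# are hypothetical placeholders.
create_integration_sql = """
CREATE STORAGE INTEGRATION my_s3_integration
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/my-snowflake-role'
  STORAGE_ALLOWED_LOCATIONS = ('s3://my-bucket/my-folder/');
"""
print(create_integration_sql)
```

The integration name chosen here is what you would later enter in the account's S3 Storage Integration field.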
Create an AWS IAM User (Role)
Log in to the AWS console. Open the IAM console, navigate to Access Management > Roles, and click the Create role button.
Select AWS service as the Trusted entity type, select EC2 as the Use case, and click Next.
On the Add permissions page, select all or only the required policies that allow your instances to access the resources, and then choose Next.
(Optional) Add tags for the resources, and then choose Next. On the Name, review, and create page, specify a Role name and description.
Review the details and click Create role.
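Behind the console steps above, choosing AWS service as the trusted entity with the EC2 use case attaches a trust policy to the new role. A minimal sketch of that policy, expressed as a Python dict for illustration:

```python
import json

# Trust policy generated when "AWS service" is the trusted entity type and
# "EC2" is the use case; shown as a plain dict for illustration only.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "ec2.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }
    ],
}
print(json.dumps(trust_policy, indent=2))
```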
...
Configure access permissions for Snowflake in your AWS Management Console so that you can use an S3 bucket to load and unload data. To create an IAM policy:
Log in to the AWS Management Console and select Identity & Access Management (IAM).
In your AWS console, click Policies and select the policy attached to the role from the list in the table.
Click the JSON tab, specify the policy details in the JSON editor, and click Review Policy.
Review the policy summary, add a name and, optionally, a description for this policy, and select Create policy.
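The JSON pasted into the editor in step 3 grants Snowflake the read, write, delete, and list permissions it needs on the bucket. A sketch of such a policy, built as a Python dict for illustration; the bucket name and folder prefix are hypothetical, and the exact action list for your environment may differ:

```python
import json

# Sketch of an IAM policy that lets Snowflake load and unload data in an
# S3 bucket. The bucket name and folder prefix are hypothetical.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            # Object-level permissions on the staging prefix
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:GetObjectVersion",
                "s3:PutObject",
                "s3:DeleteObject",
            ],
            "Resource": "arn:aws:s3:::my-bucket/my-folder/*",
        },
        {
            # Bucket-level permissions needed to list and locate the bucket
            "Effect": "Allow",
            "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
            "Resource": "arn:aws:s3:::my-bucket",
        },
    ],
}
print(json.dumps(policy, indent=2))
```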
...
Based on the above configurations, configure the pipeline parameters for the Snowflake pipeline.
Configuring the account with parameterized settings: pass account details, such as the username and password, from pipeline parameters.
S3 Access Key: The Access Key ID authenticates requests made to Amazon S3. While setting up the account, add the Access-key ID details to authenticate these requests.
S3 AWS token: AWS generates a temporary security token that includes the user's permissions and the token's expiration time. These temporary security tokens have a limited lifespan, which preserves the security of S3 resources and prevents unauthorized access. You can add the lifetime details under the Advanced Properties section.
S3 Storage Integration: Specify the S3 Storage Integration for Snowflake to use for staging data instead of the AWS Access-key ID and S3 Secret key. This value is necessary for validating data after a bulk load or bulk insert operation. Learn more at https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_access-keys.html
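The limited lifespan of the temporary S3 AWS token described above can be illustrated with simple date arithmetic; the issue time and one-hour lifetime below are hypothetical values:

```python
from datetime import datetime, timedelta, timezone

# Illustration of the temporary token's limited lifespan: the expiry is
# the issue time plus the configured lifetime. All values are hypothetical.
issued_at = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
lifetime = timedelta(seconds=3600)  # hypothetical one-hour lifetime
expires_at = issued_at + lifetime

def token_is_valid(now: datetime) -> bool:
    """A request is rejected once the current time passes the expiry."""
    return now < expires_at

print(token_is_valid(issued_at + timedelta(minutes=30)))  # True
print(token_is_valid(issued_at + timedelta(hours=2)))     # False
```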
Info |
---|
The Amazon S3 bucket where Snowflake writes the output files must reside in the same region as the cluster. Learn more at https://docs.snowflake.com/en/user-guide/data-load-s3-create-stage. |
Advanced Properties
Specify URL properties: (Optional) Add additional URL property details as needed.
Specify other details related to batch size and instance specifications.
Specify URL properties
The URL properties are optional details that you can add as needed.
...
Specify other details
...
Specify values for Batch size, Fetch size, Minimum and Maximum pool size, Maximum lifetime, Idle timeout, and Checkout timeout in this section. For detailed information, refer to Snowflake S3 Dynamic Account.
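As a rough intuition for the Batch size setting, the number of batched statements sent to the database for a fixed row count shrinks as the batch size grows. A small sketch with hypothetical values:

```python
import math

# Rough intuition for the Batch size setting: larger batches mean fewer
# round trips to the database for the same number of rows. The row counts
# and batch sizes here are hypothetical.
def round_trips(total_rows: int, batch_size: int) -> int:
    """Number of batched statements needed to process total_rows."""
    return math.ceil(total_rows / batch_size)

print(round_trips(10_000, 500))   # 20
print(round_trips(10_000, 2000))  # 5
```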
...
...
Related Content