
...


Overview

SnapLogic and Snowflake streamline cloud-based data integration and warehousing with their comprehensive set of features and Snaps. By configuring the 12 pre-built Snowflake Snaps, you can easily connect to various data sources and enable analytics within the Snowflake cloud data warehouse. The following diagram illustrates the integration between SnapLogic and Snowflake:

...

The data integration with Snowflake includes Snaps for bulk load, upsert, and unload, in addition to standard CRUD (create, read, update, and delete) functionality. To connect to Snowflake, configure the Snowflake S3 Dynamic Snap Account by following the key steps below.

...

Prerequisites

Storage Integration in Snowflake

...

Key Steps in Configuring the Snowflake S3 Dynamic Account

...

Account Properties Section

Advanced Properties Section

  • Specify URL properties: Optionally, add any additional URL properties.

  • Specify other details related to batch size and instance specifications.

Account Properties

Specify the JDBC JAR details

...


To authorize the Snowflake S3 Dynamic account successfully, create and configure the account details as shown in the workflow.

...

 

  1. Configure the Snowflake database details

  2. Configure the Amazon S3 details

  3. Configure the Advanced properties

  4. Configure the pipeline parameters for the Snowflake pipeline

...

Step 1: Configure the Snowflake database details

To establish a connection to the Snowflake database, configure the following details:

  1. JDBC JAR: By default, the Snowflake Snap Pack comes bundled with the JDBC JAR v3.13.28 file, ensuring successful authentication even if you do not provide a specific JDBC driver. If you prefer to use a custom JAR file version, you can manually upload it by accessing the database icon under Account properties.

  2. Database details: Specify the Hostname, Port Number, Authentication Type, Password, Database Name, Warehouse name, and JDBC Driver class. For more information, refer to Snowflake Dynamic Accounts.
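The database details above ultimately combine into a JDBC connection URL. The following Python sketch shows how those fields fit together; every value here is a hypothetical placeholder, not a default supplied by the Snap Pack:

```python
# Sketch: how the Snowflake account fields combine into a JDBC URL.
# All values below are hypothetical placeholders.
def build_snowflake_jdbc_url(hostname, port, database, warehouse):
    """Assemble a Snowflake JDBC URL from the account properties."""
    return (
        f"jdbc:snowflake://{hostname}:{port}/"
        f"?db={database}&warehouse={warehouse}"
    )

url = build_snowflake_jdbc_url(
    hostname="myorg.snowflakecomputing.com",  # hypothetical account host
    port=443,                                 # standard Snowflake HTTPS port
    database="SALES_DB",                      # hypothetical database name
    warehouse="LOAD_WH",                      # hypothetical warehouse name
)
print(url)
```

In the Snap account itself you enter these fields individually; the Snap assembles the URL for you, so this sketch only illustrates how the pieces relate.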

...

Step 2: Configure the Amazon S3 Storage details

To specify the Amazon Simple Storage Service (Amazon S3) Storage account details in the Account properties, the following configuration information from Snowflake is essential:

  1. Snowflake Configurations

    1. Snowflake Access in AWS S3 Storage Account

    2. Secure Access to Cloud Storage

...

1a. Snowflake Access in AWS S3 Storage Account

You need an AWS administrator from your organization to configure the following:

  • S3 Bucket: The Amazon S3 bucket where Snowflake creates the required data files. This bucket must reside in the same region as the cluster. Learn more: Creating an S3 Stage | Snowflake Documentation

  • S3 Access Key ID: The Access Key ID serves as authentication for requests made to Amazon S3. Learn more: AWS Access Key ID.

  • S3 AWS token: AWS generates a temporary security token that includes the user's permissions and the token's expiration time. These temporary tokens have a limited lifespan, which preserves the security of S3 resources and prevents unauthorized access. You can specify the token lifespan under the Advanced properties section.

  • S3 Storage Integration: Specify the S3 Storage Integration for Snowflake to be used for staging data instead of using the AWS Access-key ID and the S3 Secret key. This value is necessary for validating data after a bulk load or bulk insert operation. Learn more: Managing access keys for IAM users - AWS Identity and Access Management

To configure the S3 account details, you must have the following:

  • Access to Snowflake in Amazon S3 Storage Account: Verify that the Snowflake user has appropriate access to the Amazon S3 storage account. An AWS administrator in your organization must grant Snowflake the required access to your Amazon S3 storage account. Learn more

...

...

  • Access to Cloud Storage

...

...

Info

The option to configure an AWS IAM Role to access Amazon S3 is deprecated by Snowflake and can no longer be used.

You must ensure that the S3 bucket's security and access management policies permit access to Snowflake.

Configure Cloud Storage Integration in the Snowflake platform

To use the storage integrations for Snowflake, an administrator must provide all the necessary IAM user permissions in the AWS account. Learn more: Option 1: Configuring a Snowflake Storage Integration to Access Amazon S3 | Snowflake Documentation
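As a rough illustration of what the Snowflake administrator runs for this step, the sketch below generates a storage-integration DDL statement. The integration name, role ARN, and bucket path are all hypothetical placeholders:

```python
# Sketch of the Snowflake storage-integration DDL for S3 access.
# The integration name, role ARN, and bucket path are hypothetical.
def storage_integration_ddl(name, role_arn, allowed_location):
    """Build a CREATE STORAGE INTEGRATION statement for S3."""
    return (
        f"CREATE STORAGE INTEGRATION {name}\n"
        f"  TYPE = EXTERNAL_STAGE\n"
        f"  STORAGE_PROVIDER = 'S3'\n"
        f"  ENABLED = TRUE\n"
        f"  STORAGE_AWS_ROLE_ARN = '{role_arn}'\n"
        f"  STORAGE_ALLOWED_LOCATIONS = ('{allowed_location}');"
    )

ddl = storage_integration_ddl(
    name="S3_INT",                                             # hypothetical
    role_arn="arn:aws:iam::123456789012:role/snowflake_role",  # hypothetical
    allowed_location="s3://my-snaplogic-bucket/staging/",      # hypothetical
)
print(ddl)
```

The name you give the integration here is the value you later enter in the account's S3 Storage Integration field.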

Create an AWS IAM User (Role)

  1. Log in to the AWS console. Open the IAM console, navigate to Access Management > Roles,and click the Create role button.

    Image Added


     

  2. Select the AWS service as the Trusted entity type, and EC2 as the Use case, and click Next.

    Image Added


     

  3. On the Add permissions policies page, select the policies that grant your instances access to the required resources, and then choose Next.

  4. Optionally, add tags for the resources, and then choose Next. On the Name, review, and create page, specify a Role name and description.

  5. Review the details and click Create role.

...

Assign permissions to the IAM user to access the S3 bucket

The following permissions are required to access the S3 bucket and folder.

  • s3:GetBucketLocation

  • s3:GetObject

  • s3:GetObjectVersion

  • s3:ListBucket

Learn more: Configure Secure Access to Cloud Storage.

Create an IAM Policy

Configure access permissions for Snowflake in your AWS Management Console so that you can use an S3 bucket to load and unload data.

To create an IAM policy:

  1. Log in to the AWS Management Console and select Identity & Access Management (IAM).

  2. In your AWS console, click Policies and select the policy attached to the role from the list in the table.

  3. Click the JSON tab, specify the policy details in the JSON editor, and click Review Policy.

  4. Review the policy summary, add a name and, optionally, a description for this policy, and select Create policy.
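The JSON you paste into the editor in step 3 grants Snowflake the four bucket permissions listed earlier. A minimal sketch of such a policy document, with a hypothetical bucket name, might look like this:

```python
import json

# Minimal IAM policy granting Snowflake read access to an S3 bucket.
# The bucket name is a hypothetical placeholder.
bucket = "my-snaplogic-bucket"
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            # Object-level permissions apply to keys inside the bucket.
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:GetObjectVersion"],
            "Resource": f"arn:aws:s3:::{bucket}/*",
        },
        {
            # Bucket-level permissions apply to the bucket itself.
            "Effect": "Allow",
            "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
            "Resource": f"arn:aws:s3:::{bucket}",
        },
    ],
}
print(json.dumps(policy, indent=2))
```

Note how the object-level actions target `arn:aws:s3:::<bucket>/*` while the bucket-level actions target the bucket ARN itself; mixing these up is a common reason a bulk load fails to validate.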

 

Based on the above configurations, specify the parametrized account details for pipeline parameters.

  • S3 Access Key: Specify the Access Key ID, which authenticates the requests made to Amazon S3 while setting up the account.

  • S3 AWS token: AWS generates a temporary security token that includes the user's permissions and the token's expiration time. These temporary tokens have a limited lifespan, which preserves the security of S3 resources and prevents unauthorized access. You can specify the token lifetime under the Advanced Properties section.

  • S3 Storage Integration: Specify the S3 Storage Integration for Snowflake to be used for staging data instead of using AWS Access-key ID and S3 Secret key. This value is necessary for validating data after a bulk load or bulk insert operation. Learn more at https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_access-keys.html

Info

The Amazon S3 bucket where Snowflake will write the output files must reside in the same region as the cluster. Learn more at https://docs.snowflake.com/en/user-guide/data-load-s3-create-stage.

Advanced Properties

Specify URL properties

The URL properties are optional details that can be added by the user.

Specify details for Batch size, Fetch size, Minimum and Maximum pool size, Maximum lifetime, Idle timeout, and Checkout timeout in this section. For detailed information, refer to Snowflake S3 Dynamic Account.

...

 

...

Step 3: Configure the Advanced properties

  1. Specify URL properties: Add any needed optional URL properties.

  2. Specify values for additional properties such as Batch size, Fetch size, Minimum and Maximum pool size. Refer to the Snowflake S3 Dynamic Account for additional information.

    Image Added
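The advanced properties amount to a small set of batching and connection-pool tuning values. The sketch below shows what each setting controls; the values are illustrative examples, not the Snap's documented defaults:

```python
# Illustrative batching and connection-pool settings mirroring the
# Advanced properties fields. Values are examples, not Snap defaults.
advanced_properties = {
    "batch_size": 50,        # rows sent per JDBC batch write
    "fetch_size": 100,       # rows fetched per round trip on reads
    "min_pool_size": 3,      # connections kept open even when idle
    "max_pool_size": 15,     # upper bound on open connections
    "max_lifetime_ms": 30 * 60 * 1000,  # recycle connections after 30 min
    "idle_timeout_ms": 5 * 60 * 1000,   # close idle connections after 5 min
    "checkout_timeout_ms": 10 * 1000,   # give up waiting for a connection after 10 s
}

# Sanity check: the pool bounds must be ordered.
assert advanced_properties["min_pool_size"] <= advanced_properties["max_pool_size"]
```

Larger batch and fetch sizes generally reduce round trips at the cost of memory, so tune them against your typical document sizes.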

Step 4: Configure the pipeline parameters for the Snowflake pipeline

Based on the above configurations, configure the pipeline parameters for the Snowflake pipeline.

  1. Click the Edit Pipeline Properties icon in the SnapLogic Designer toolbar.

  2. In the Edit Pipeline dialog box, define the Key-Value Parameters as shown below:

    Image Added

     

  3. Enable the expressions for all the dynamic fields and select the parameters for each dynamic field as applicable:

    Image Added

     

  4. Click Apply.
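The steps above pair each pipeline parameter with an expression-enabled account field that references it by name (in SnapLogic expressions, a parameter `foo` is referenced as `_foo`). A sketch of that mapping, with hypothetical parameter names and values:

```python
# Hypothetical pipeline parameters for the Snowflake pipeline and the
# dynamic account fields that reference them. Names are illustrative only.
pipeline_parameters = {
    "db_hostname": "myorg.snowflakecomputing.com",  # hypothetical host
    "db_name": "SALES_DB",                          # hypothetical database
    "s3_bucket": "my-snaplogic-bucket",             # hypothetical bucket
    "s3_access_key": "<supplied-at-runtime>",       # placeholder, not a real key
}

# Each expression-enabled account field holds an expression that points
# at a parameter, e.g. the Hostname field contains _db_hostname.
account_field_expressions = {
    "Hostname": "_db_hostname",
    "Database name": "_db_name",
    "S3 Bucket": "_s3_bucket",
    "S3 Access-key ID": "_s3_access_key",
}
```

Because the account reads these values at runtime, the same pipeline can target different databases or buckets simply by overriding the parameters when the pipeline is invoked.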

Demonstration

Watch the following video to understand how to configure the Snowflake S3 Dynamic Account.

...

...

Related content

...