...

A Snowflake Storage Integration object delegates authentication for external cloud storage (in this case, Google Cloud Storage) to a Cloud Storage service account. When you create the integration, Snowflake generates a Google service account that can be granted permission to access the bucket(s) that store your data files, along with an optional set of allowed or blocked storage locations. This allows you to load and unload data to and from GCS without supplying sensitive credentials. The generated Google service account must still be granted the required permissions through the Google Cloud Platform IAM service.

...

  1. Log on to Snowflake using this URL: https://snaplogic.snowflakecomputing.com/console

  2. Create a Snowflake Storage Integration object. For example:
    create storage integration JOHN_GCS_STORAGE_INTEGRATION
      type = external_stage
      storage_provider = gcs
      enabled = true
      storage_allowed_locations = ('gcs://johnsnowflake1/', 'gcs://johnsnowflake2/');

  3. Describe the Snowflake storage integration object to get the GCP service account. In the output, note the value of the STORAGE_GCP_SERVICE_ACCOUNT property.
    desc integration JOHN_GCS_STORAGE_INTEGRATION;

  4. Create a custom role:

    1. Navigate to IAM & Admin -> Roles -> CREATE ROLE.

    2. Give the role an appropriate name and add the following permissions:

      • firebase.projects.get

      • resourcemanager.projects.get

      • storage.buckets.create

      • storage.buckets.delete

      • storage.buckets.get

      • storage.buckets.getIamPolicy

      • storage.buckets.list

      • storage.buckets.setIamPolicy

      • storage.buckets.update

      • storage.objects.create

      • storage.objects.delete

      • storage.objects.get

      • storage.objects.getIamPolicy

      • storage.objects.list

      • storage.objects.setIamPolicy

      • storage.objects.update
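Instead of clicking through the console, the same custom role can be created from the command line. This is only a sketch: the project ID and the role ID `SnowflakeStorageIntegration` are placeholders, not values from the original setup.

```shell
# Sketch only: creates the custom role with the permissions listed above.
# Replace PROJECT_ID and the role ID with your own values.
gcloud iam roles create SnowflakeStorageIntegration \
  --project=PROJECT_ID \
  --title="Snowflake Storage Integration" \
  --permissions=firebase.projects.get,resourcemanager.projects.get,\
storage.buckets.create,storage.buckets.delete,storage.buckets.get,\
storage.buckets.getIamPolicy,storage.buckets.list,storage.buckets.setIamPolicy,\
storage.buckets.update,storage.objects.create,storage.objects.delete,\
storage.objects.get,storage.objects.getIamPolicy,storage.objects.list,\
storage.objects.setIamPolicy,storage.objects.update
```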

  5. Grant permissions on the Google Cloud Storage bucket to the Snowflake GCP service account.

  6. In the bucket's permissions, add the GCP service account from step 3 as a member and assign it the custom role you just created.
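The bucket-level grant can also be done from the command line. This is a sketch only: the project ID, the custom role ID, and the service account placeholder are assumptions; use the STORAGE_GCP_SERVICE_ACCOUNT value returned by desc integration and your own role.

```shell
# Sketch only: binds the custom role to the Snowflake-generated service
# account on the bucket. Replace the placeholders with your own values.
gcloud storage buckets add-iam-policy-binding gs://johnsnowflake1 \
  --member="serviceAccount:SERVICE_ACCOUNT_FROM_DESC_INTEGRATION" \
  --role="projects/PROJECT_ID/roles/SnowflakeStorageIntegration"
```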

  7. Bulk load the data into Snowflake:
    COPY INTO "PUBLIC".JOHN_EMP2
    FROM 'gcs://johnsnowflake1/data/'
    FILES = ( 'john_emp1.csv' )
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    STORAGE_INTEGRATION = JOHN_GCS_STORAGE_INTEGRATION;

  8. Unload the data from the Snowflake table to a GCS location:
    copy into 'gcs://johnsnowflake1/unload/'
    from "PUBLIC".JOHN_EMP2
    storage_integration = JOHN_GCS_STORAGE_INTEGRATION
    file_format = (format_name = snap_csv_format);
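The unload statement in step 8 references a named file format, snap_csv_format, which must already exist in the schema. A minimal sketch of such a definition is shown below; the specific options are an assumption, not taken from the original setup.

```sql
-- Sketch only: one possible definition of the snap_csv_format file format
-- referenced by the unload statement. Adjust the options to match your data.
create or replace file format snap_csv_format
  type = csv
  field_delimiter = ','
  compression = none;
```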

...

See Also