In this article
...
Use this account type to connect Redshift Snaps with data sources.
Prerequisites
- Private project folder
- Project Space’s shared folder
- Global shared folder
Limitations
None.
Known Issues
None.
Account Settings
...
Parameter | Data Type | Description | Default Value | Example
---|---|---|---|---
Label | String | Required. Unique, user-provided label for the account. | N/A |
JDBC Driver Class | String | The name of the JDBC driver class to use. | org.postgresql.Driver |
JDBC jars | String | The list of JDBC JAR files to be loaded. You can upload Redshift drivers to override the default org.postgresql.Driver driver. Set the Batch size property to 1 when using the JDBC driver version RedshiftJDBC41-1.1.10.1010.jar. | N/A |
JDBC Url | String | Enter the URL of the JDBC database. | N/A | jdbc:redshift://hostname:port/database
**Account properties** | | | |
Endpoint | String | Required. Enter the address of the server to connect to. | N/A |
Port number | Numeric | Required. Enter the port of the database server to connect to. | 5439 |
Database name | String | Required. Enter the name of the database to connect to. | N/A |
Username | String | Enter the username to connect to the database. This username is used as the default when retrieving connections and must be valid to set up the data source. | N/A |
Password | String | Enter the password to connect to the data source. This password is used as the default when retrieving connections and must be valid to set up the data source. | N/A |
S3 Bucket | String | Enter the name of the external S3 bucket, residing in an external AWS account, to use for staging data onto Redshift. | N/A |
S3 Folder | String | Enter the relative path to a folder in the S3 bucket. This is used as the root folder for staging data onto Redshift. | N/A |
S3 Access-key ID | String | Enter the S3 access key ID part of AWS authentication. | N/A |
S3 Secret key | String | Enter the S3 secret key part of AWS authentication. | N/A |
**IAM properties** | | Specify the IAM properties for Redshift to communicate with IAM. | |
AWS account ID | String | Enter the ID of the Amazon Web Services account to use for performing the bulk load operation. | N/A |
IAM role name | String | Enter the name of the IAM role assigned to the Redshift cluster to access the S3 bucket specified above. | N/A |
Region name | String | Enter the name of the region where the Redshift cluster resides. | N/A |
**Advanced properties** | | Specify advanced properties to support this account. | |
Auto commit | Check box | Select this check box to enable the database connection to commit each batch of statements automatically after it executes. | Selected |
Batch size | Numeric | Required. Enter the number of statements to execute at a time. | 50 |
Fetch size | Numeric | Required. Enter the number of rows to fetch at a time when executing a query. | 100 |
Max pool size | Numeric | Required. Enter the maximum number of connections that the pool maintains at a time. | 50 |
Max life time | Numeric | Required. Enter the maximum lifetime of a connection in the pool. Ensure that the value you enter is a few seconds shorter than any database- or infrastructure-imposed connection time limit. A value of 0 indicates an infinite lifetime, subject to the Idle Timeout value. An in-use connection is never retired; connections are removed only after they are closed. | 30 |
Idle Timeout | Numeric | Required. Enter the maximum amount of time a connection is allowed to sit idle in the pool. A value of 0 indicates that idle connections are never removed from the pool. | 5 |
Checkout timeout | Numeric | Required. Enter the number of milliseconds to wait for a connection to become available when the pool is exhausted. A value of 0 waits forever; an exception is thrown after the wait time expires. | 10000 |
**URL Properties** | | Specify the URL properties associated with this account. | |
URL property name | String | Enter the name of the URL property. | N/A |
URL property value | String | Enter the value of the URL property. | N/A |
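For reference, the connection URL that these settings resolve to follows the standard Redshift JDBC form shown in the JDBC Url example above: the Endpoint, Port number, and Database name form the base URL, and any URL property name/value pairs are appended as query parameters. The following Java sketch illustrates that assembly only; the endpoint, database, and the `ssl` property are hypothetical values, not defaults of this account type.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class RedshiftUrlBuilder {

    // Builds a URL of the form jdbc:redshift://hostname:port/database,
    // appending URL properties as ?name=value&name=value.
    static String buildUrl(String endpoint, int port, String database,
                           Map<String, String> urlProperties) {
        StringBuilder url = new StringBuilder("jdbc:redshift://")
                .append(endpoint).append(':').append(port)
                .append('/').append(database);
        String sep = "?";
        for (Map.Entry<String, String> p : urlProperties.entrySet()) {
            url.append(sep).append(p.getKey()).append('=').append(p.getValue());
            sep = "&";
        }
        return url.toString();
    }

    public static void main(String[] args) {
        Map<String, String> props = new LinkedHashMap<>();
        props.put("ssl", "true"); // hypothetical URL property
        System.out.println(buildUrl(
                "examplecluster.abc123.us-west-2.redshift.amazonaws.com",
                5439, "dev", props));
        // prints jdbc:redshift://examplecluster.abc123.us-west-2.redshift.amazonaws.com:5439/dev?ssl=true
    }
}
```

In the account itself you normally enter the assembled URL directly in the JDBC Url field; the sketch only shows how the individual Account properties map onto that string.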
Troubleshooting
...
Account validation failed
...
The Pipeline ended before the batch could complete execution due to a connection error.
...
None.
...