Generic JDBC - Select
Overview
You can use this Snap to fetch data from the connected database by providing a table name and configuring the connection. This Snap also supports the SELECT DML operation when using the AWS Athena database. The Snap writes the retrieved records to its output view, where they can be processed by a downstream Snap.
Queries produced by the Snap are equivalent to the following format:
SELECT * FROM [table] WHERE [where clause] ORDER BY [ordering] LIMIT [limit] OFFSET [offset]
The WHERE clause can only use variables, not constants or Pipeline parameters.
A good example of a WHERE clause is SALARY = $SALARY, where $SALARY refers to the SALARY field of the input document.
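As a sketch of this behavior, assume a hypothetical EMPLOYEES table and an input document that carries a SALARY value; the table, column, and values below are illustrative only and show the shape of the query the Snap produces:

-- Input document: { "SALARY": 50000 }
-- Snap settings: Table name = EMPLOYEES, Where clause = SALARY = $SALARY,
--                Order by = SALARY, Limit rows = 10, Limit offset = 0
-- Effective query:
SELECT * FROM EMPLOYEES WHERE SALARY = 50000 ORDER BY SALARY LIMIT 10 OFFSET 0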
Snap Type
The Generic JDBC - Select Snap is a Read-type Snap that executes a SQL SELECT statement.
Prerequisites
None.
Support for Ultra Pipelines
Works in Ultra Pipelines. However, we recommend that you not use this Snap in an Ultra Pipeline.
Known Issues
The metadata output in the second output preview is not displayed in a table format when your target database is AWS Athena.
The suggestions list is not populated for the Table name field when your target database is AWS Athena.
When the Generic JDBC - Select Snap connects to a Sybase database to retrieve BigTime-type data, the Snap displays both the date and the time for this data type.
Limitations
None.
Snap Views
Type | Format | Number of Views | Examples of Upstream and Downstream Snaps | Description |
---|---|---|---|---|
Input | Document | | | Document that provides values for one or more properties of the Snap, or simply for pass-through purposes. This Snap has at most one document input view. If the input view is defined, then the WHERE clause can substitute incoming values for a given expression, such as a table name, or use them as variables within the WHERE clause. |
Output | Document | | | Document for each record retrieved. Special types such as TIMESTAMP, TIMESTAMPTZ, and TIMESTAMPLTZ are converted into SnapLogic internal date type representations, which downstream Snaps can consume like any other data type. This Snap has one document output view by default. A second view can be added to output the table metadata as a document. The metadata document can then be fed into the second input view of a database Insert or Bulk Load Snap so that the table is created in the target database with a schema similar to that of the source table. |
Error | | | | Error handling is a generic way to handle errors without losing data or failing the Snap execution. You can handle the errors that the Snap might encounter while running the Pipeline by choosing an error-handling option from the When errors occur list under the Views tab. Learn more about Error handling in Pipelines. |
Snap Settings
Asterisk (*): Indicates a mandatory field.
Suggestion icon: Indicates a list that is dynamically populated based on the configuration.
Expression icon: Indicates whether the value is an expression (if enabled) or a static value (if disabled). Learn more about Using Expressions in SnapLogic.
Add icon: Indicates that you can add fields in the field set.
Remove icon: Indicates that you can remove fields from the field set.
Field Name | Field Type | Description | |
---|---|---|---|
Label*  Default Value: Generic JDBC - Select | String | Specify the name for the Snap. You can modify this to be more specific, especially if you have more than one of the same Snap in your Pipeline. | |
Schema Name  Default Value: N/A | String/Expression | The database schema name. Selecting a schema filters the Table name list to show only those tables within the selected schema. | |
Table Name*  Default Value: N/A | String/Expression | Specify the table to execute the select query on. | |
Where Clause  Default Value: N/A | String/Expression | Specify the WHERE clause of the SELECT statement. This field supports document value substitution (for example, $person.firstname is substituted with the value found at that path in the incoming document). However, you cannot use value substitution immediately after the keyword IS. Examples of WHERE clauses with and without expressions are illustrated in the sketch after this table. Using expressions that join strings together to create SQL queries or conditions carries a potential SQL injection risk and is therefore unsafe. Ensure that you understand all implications and risks involved before concatenating strings with the '=' Expression toggle enabled. | |
Order By | | Use this fieldset to specify the columns by which to sort the result set, in order of precedence. If no columns are specified, the default database sort order is used. | |
Column Names  Default Value: N/A | String/Expression | Specify the column names. | |
Limit offset  Default Value: N/A | Integer/Expression | Specify the offset for the LIMIT clause, that is, the starting row of the result set. Some databases, such as Teradata, do not support OFFSET; for such databases, the Limit offset property is ignored. | |
Limit rows  Default Value: N/A | Integer/Expression | Specify the number of rows to return from the query. | |
Output fields | | Use this fieldset to specify the output fields for the SQL SELECT statement. | |
Output Field  Default Value: N/A | String/Expression | Specify or select the output field names for the SQL SELECT statement. To select all fields, leave this at its default. | |
Fetch Output Fields In Schema  Default Value: Deselected | Checkbox | Select this checkbox to include only the selected fields or columns in the Output Schema (second output view). If you do not provide any Output fields, all the columns are visible in the output. | |
Pass-through  Default Value: Selected | Checkbox | Select this checkbox to pass the input document through to the output view under the key 'original'. | |
Ignore empty result  Default Value: Deselected | Checkbox | Select this checkbox if you want no document to be written to the output view when a SELECT operation produces no results. If this property is not selected and the Pass-through property is selected, the input document is passed through to the output view. | |
Auto Commit  Default Value: False | Dropdown list | Select a value to override the state of the Auto commit property on the account. The Auto commit property at the Snap level has three values: True, False, and Use account setting. True and False override the account-level setting for this Snap, while Use account setting defers to the account configuration. | |
Match data types  Default Value: Deselected | Checkbox | This property applies only when the Output fields property contains one or more field values. If selected, the Snap tries to match the output data types to those produced when the Output fields property is empty (SELECT * FROM ...), so that the output preview is in the same format as when SELECT * FROM is implied and all the contents of the table are displayed. | |
Number of Retries  Default Value: 0 | Integer/Expression | Specify the maximum number of attempts to be made to receive a response. The request is terminated if the attempts do not result in a response. | |
Retry Interval (seconds)  Default Value: 1 | Integer/Expression | Specify the time interval between two successive retry requests. A retry happens only when the previous attempt resulted in an exception. | |
Staging mode  Default Value: In memory | Dropdown list | Required when the value in the Number of retries field is greater than 0. Specify the location where the Snap stores input documents between retries. | |
Snap Execution  Default Value: Validate & Execute | Dropdown list | Select one of the following three modes in which the Snap executes: Validate & Execute, Execute only, or Disabled. | |
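The following sketch, referenced from the Where Clause description above, contrasts plain value substitution with expression-based string concatenation. The column name and incoming value are hypothetical and only illustrate why concatenation is unsafe:

-- Safe: Where clause with the Expression toggle disabled
--   SALARY = $SALARY
-- The incoming SALARY value is substituted as a single value, as in the Overview sketch.

-- Risky: Where clause with the Expression toggle enabled, built by string concatenation
--   "SALARY = " + $SALARY
-- If the incoming value is the string "0 OR 1=1", the effective clause becomes:
--   SALARY = 0 OR 1=1
-- which matches every row; this is the SQL injection risk described in the warning.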
Examples
Read and Sort Records by Age
This example pipeline demonstrates how to read data from a database, sort the data, and insert it into an Oracle database.
Step 1: Configure the Generic JDBC - Select Snap to select ACTRESS records from the TECTONIC table and sort them in ascending order based on the AGE column. On validation, the Snap displays the selected ACTRESS records in the output.
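Assuming the TECTONIC table stores the record type in a hypothetical ROLE column (not shown in the original configuration), the query issued in this step would be roughly equivalent to:

-- Hypothetical equivalent of the Step 1 configuration:
SELECT * FROM TECTONIC WHERE ROLE = 'ACTRESS' ORDER BY AGE ASC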
   |
Step 2: Configure the Oracle - Insert Snap to insert the sorted data into the Oracle database. On validation, the Snap displays a confirmation message indicating that the data was successfully inserted into the database.
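As a rough sketch, assuming the sorted records are written to a hypothetical ACTRESS_RECORDS target table with NAME and AGE columns, the Oracle - Insert Snap issues one statement of this shape per incoming document:

-- Hypothetical statement shape for Step 2 (one per document):
INSERT INTO ACTRESS_RECORDS (NAME, AGE) VALUES ('Jane Doe', 34)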
 |
Downloads
Snap Pack History
Related Content