Overview
We are rolling out an important update to the JDBC Snap Pack. This update upgrades the jOOQ library to version 3.17.x, which contains several improvements, performance optimizations, and key changes that may impact your current integrations with SnapLogic. This FAQ document provides the reasons behind this update, the expected impact, the timelines for the upgrade, the affected databases, and the expected behavior changes.
What is happening?
The jOOQ library for the JDBC Snap Pack is being upgraded from v3.9.1 to v3.17.x.
Why are we upgrading the JDBC jOOQ library?
To improve security, compatibility, performance, stability, and reliability.
What is the timeline for this upgrade?
The upgrade to jOOQ 3.17.x for the JDBC Snap Pack will be rolled out according to the following timeline:
Early availability phase/Latest Release: This phase will allow early adopters to test the new version and provide feedback. You can choose to use the latest version of the Snap Pack for testing and reporting any issues you encounter.
General Availability (GA)/Stable Release: The full upgrade to jOOQ 3.17.x will be available to all with the GA release on February 12, 2025. The stable version is then pushed to all production environments.
Will I be impacted?
Yes, you may be impacted if you use the latest version. If you prefer not to proceed with the jOOQ upgrade, you can revert to the previous working version of the Snap Pack.
What are the key areas affected by this upgrade?
SQL generation and parsing
Database connection pooling
Query performance and execution time
Error and exception handling
Compatibility with some legacy database features
Which databases were tested for the jOOQ upgrade?
Given the wide variety of databases in use, we have tested against only specific databases. We recommend testing your pipelines in your lower-level environments (development or QA) to verify compatibility and functionality with your specific database before deploying to production.
MySQL | SQL Server | Teradata |
We recommend that you review your pipelines and check the Snaps downstream of JDBC Snaps for any of the affected scenarios. Update any dependencies the downstream Snaps have on the changed errors and outputs (especially in expressions) so that they conform to the new expected errors and outputs.
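During the transition, one defensive pattern for downstream logic is to match both the old (v3.9.1) and new (v3.17.x) wording of an error reason. The following is a minimal, hypothetical Java sketch; the helper name is illustrative and the example strings are taken from the behavior changes listed later in this document. It is not part of the Snap Pack API:

```java
// Hypothetical helper for the transition period: accept both the old (v3.9.1)
// and the new (v3.17.x) wording of an error reason, so downstream checks keep
// working regardless of which Snap Pack version produced the error.
public class ErrorReasonCompat {

    // Example: Sybase out-of-range inserts changed from
    // "Arithmetic overflow during implicit conversion of NUMERIC value"
    // to "Data truncation".
    public static boolean isOutOfRange(String reason) {
        if (reason == null) {
            return false;
        }
        return reason.contains("Arithmetic overflow during implicit conversion")
                || reason.contains("Data truncation");
    }

    public static void main(String[] args) {
        System.out.println(isOutOfRange("reason=Data truncation")); // prints true
    }
}
```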
Do I need to reconfigure my accounts?
Yes, you need to reconfigure your accounts if the current driver (JARs) is incompatible with the upgraded jOOQ version (v3.17.x). For more details, refer to the breaking changes.
What are the recommended JDBC drivers (JARs) for various databases?
Database | Latest certified and recommended drivers | Dependencies
SQL Server | mssql-jdbc-11.2.0.jre11.jar | For this driver version, you must set the URL property in the Generic Database Account.
MySQL | mysql-connector-java-5.1.47.jar (MySQL 5.x.x server version) |
Oracle | ojdbc10.jar |
Snowflake | snowflake-jdbc-3.13.25.jar |
Redshift | RedshiftJDBC42-1.1.17.1017.jar |
PostgreSQL | postgresql-42.7.2.jar |
Vertica | vertica-jdbc-11.0.0-0.jar |
Teradata | 17.20.12 |
Cassandra | v4.11.1 |
MariaDB | mariadb-java-client-2.6.0.jar |
Informix | ifxjdbc.jar |
Db2 | db2jcc4.jar |
SAP HANA | ngdbc-2.8.14.jar |
IBM Db2 i | jt400-20.0.7.jar |
Sybase | jconn4.jar |
AWS Athena | AthenaJDBC42_2.0.24.1000.jar |
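The exact URL value for the SQL Server dependency is not captured in the table above. As a hedged sketch only: Microsoft's mssql-jdbc drivers from version 10.2 onward enable TLS encryption by default, so a Generic Database Account URL for mssql-jdbc-11.2.0 typically makes the encryption settings explicit. The host, port, and database names below are placeholders:

```
jdbc:sqlserver://<host>:1433;databaseName=<db>;encrypt=true;trustServerCertificate=true
```

Confirm the required property value against your account's documentation before relying on this form.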
Breaking Changes
After the jOOQ library upgrade, we recommend that you use the SnapLogic-certified drivers for your target databases. If you do not use the certified drivers, the previously functioning pipelines might be impacted.
Additionally, we highly recommend that you migrate to dedicated Database Snap Packs (for example, Oracle, SQL Server, MySQL, Redshift) instead of using the Generic JDBC Snap Pack for your integration needs.
Behavior Changes
The following Snaps and scenarios have changed output messages; the detailed before-and-after messages follow below.
All databases - Execute: when calling stored procedures, the output now includes both the status and the message instead of only a success message.
IBM DB2: when the Snaps integrate with DB2 for the XML data type, the output now contains only the XML data.
PostgreSQL: when the Snaps integrate with PostgreSQL for the XML data type, the output now contains only the XML data.
The following Snaps and scenarios have changed error messages:
Sybase - Execute: when the Generic JDBC - Execute Snap inserts out-of-range values for int or money types.
Oracle - Execute, MySQL - Execute, and Redshift - Execute: when you pass a special character (') within the single quotes ('').
SQL Server - Insert: when the Generic JDBC - Insert Snap has a datatype mismatch.
Oracle - Insert and SAP HANA - Insert.
Vertica - Insert: when you attempt to insert null and there is a data type mismatch.
PostgreSQL: when you integrate with the PostgreSQL database and attempt to insert a String value into an Int type.
DB2: when you integrate DB2 and JDBC Snaps with the char data type.
Sybase - Insert: when you insert values beyond a column type's range, or insert varchar values into int, date, or tinyint columns.
Here’s a summary of the specific behavior changes in the Snap Pack that affect the output, success, and failure messages.
Previously, when you ran multiple statements in the Generic JDBC - Execute Snap, only the reason appeared in the error. If your target database is Snowflake, the error reason now includes the Snowflake exception class prepended at the beginning.
Before: reason=Actual statement count 2 did not match the desired statement count 1
After: reason=net.snowflake.client.jdbc.SnowflakeSQLException: Actual statement count 2 did not match the desired statement count 1
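If a downstream expression compares the bare reason text, the newly prepended exception class can be stripped before comparison. The helper and regular expression below are an illustrative sketch, not part of the Snap Pack:

```java
// Illustrative helper: remove a leading fully qualified exception class such as
// "net.snowflake.client.jdbc.SnowflakeSQLException: " from a reason string, so
// comparisons written against the old (v3.9.1) wording still match.
public class ReasonNormalizer {
    public static String stripExceptionPrefix(String reason) {
        // Matches a dotted class name ending in "Exception" or "Error", then ": ".
        return reason.replaceFirst("^[\\w$.]+(Exception|Error): ", "");
    }

    public static void main(String[] args) {
        System.out.println(stripExceptionPrefix(
                "net.snowflake.client.jdbc.SnowflakeSQLException: "
                        + "Actual statement count 2 did not match the desired statement count 1"));
    }
}
```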
When the Generic JDBC - Insert Snap inserts null into a Vertica table and there is a data type mismatch, the error message changes.
Before: reason=Error converting value bigintcol to type class java.lang.Integer, error=Error converting value
After: reason=[Vertica][VJDBC](3681) ERROR: Invalid input syntax for integer: "bigintcol", error code: 3681, SQL state: 22V02, error=Batch operation failed
Previously, when you integrated DB2 and JDBC Snaps with the char data type, the error returned SQLCODE=-433. Now, the error message is SQLCODE=-302.
Before: SQLCODE=-433
After: SQLCODE=-302
Previously, when you called stored procedures, the Execute Snap displayed only the success message in the output. Now, the Snap displays both the status and the message in the output for all databases.
Before: Message: Success
After: $UPDATE_COUNT=-1, 0, or 1 (based on the Snap Pack behavior) in the output.
Previously, when the Generic JDBC - Select and Generic JDBC - Execute Snaps were integrated with PostgreSQL or IBM DB2, the metadata was displayed along with the XML data. Now, you receive only the XML data.
Before: metadata + XML data
After: XML data
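The $UPDATE_COUNT values above correspond to standard JDBC update-count semantics (java.sql.Statement.getUpdateCount()): -1 means the current result is a result set or there are no more results, 0 means the statement affected no rows, and a positive value is the number of affected rows. A small interpretive sketch follows; the helper is hypothetical and not part of the Snap Pack:

```java
// Hypothetical helper mapping a JDBC update count to a human-readable meaning,
// following java.sql.Statement.getUpdateCount() semantics.
public class UpdateCountSemantics {
    public static String describe(int updateCount) {
        if (updateCount < 0) {
            return "no update count (result set returned, or no more results)";
        }
        if (updateCount == 0) {
            return "statement succeeded; no rows affected";
        }
        return updateCount + " row(s) affected";
    }
}
```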
When the Generic JDBC - Insert and Generic JDBC - Execute Snaps are integrated with the Sybase database and attempt to insert an out-of-range negative value for the money type, the error reason changes slightly.
Before: reason=Arithmetic overflow during implicit conversion of NUMERIC value
After: reason=Data truncation
When the Generic JDBC - Insert and Generic JDBC - Execute Snaps are integrated with the Sybase database and attempt to insert out-of-range values for int types such as bigint, smallint, and tinyint, the error reason changes slightly.
Before: reason=Arithmetic overflow during implicit conversion of NUMERIC value
After: reason=Data truncation
When the Generic JDBC - Insert Snap is integrated with the Sybase database and attempts to insert a varchar value into an int column, the error message and reason change slightly.
Before: error=Batch operation failed, reason=Implicit conversion from datatype 'VARCHAR' to 'INT' is not allowed. Use the CONVERT function to run this query. , error code: 257, SQL state: 37000, resolution=Please check for valid Snap properties and input data.
After: error=Error converting value, reason=Error converting value P to type class java.lang.Integer, resolution=Please provide input data in the appropriate data type., Reason: Expected and input data are not equal, Resolution: Please check expected and input data
When the Generic JDBC - Insert Snap is integrated with the Sybase database and attempts to insert a varchar value into a date column, the error message and reason change slightly.
Before: error=Batch operation failed, reason=Syntax error during implicit conversion of VARCHAR value 'Hello' to a DATETIME field. , error code: 249, SQL state: S1000, resolution=Please check for valid Snap properties and input data.
After: error=Error converting value, reason=Error converting value Hello to type class java.sql.Timestamp, resolution=Please provide input data in the appropriate data type. Reason: Expected and input data are not equal, Resolution: Please check expected and input data.
When the Generic JDBC - Insert Snap is integrated with the Sybase database and attempts to insert a varchar value into a tinyint column, the error reason changes slightly.
Before: reason=Error converting value P to type class java.lang.Byte
After: reason=Error converting value P to type class org.jooq.types.UByte
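The change from java.lang.Byte to org.jooq.types.UByte reflects a different target range: Byte is signed (-128 to 127), while jOOQ's UByte is unsigned (0 to 255), matching Sybase's unsigned tinyint. Below is a plain-Java sketch of the two ranges; the class and method names are illustrative, and no jOOQ dependency is needed for the sketch itself:

```java
// Sketch of the range difference behind the Byte -> UByte change.
// java.lang.Byte is signed: -128..127. org.jooq.types.UByte is unsigned: 0..255,
// which matches Sybase's tinyint.
public class TinyIntRanges {
    public static boolean fitsSignedByte(int v) {
        return v >= -128 && v <= 127;
    }

    public static boolean fitsUnsignedByte(int v) {
        return v >= 0 && v <= 255;
    }
}
```

A value such as 200 is a valid unsigned tinyint but does not fit in a signed Byte, which is why the reported conversion target changed.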
When the Generic JDBC - Execute Snap is integrated with the Oracle, MySQL, PostgreSQL, or Redshift databases and you pass a special character ('), the error reason changes slightly.
Example: Insert into table(ids) values ('windyi'eee')
Before: "reason": "Invalid query: Insert into TECTONIC.SNPQA2055 (ids) values ('windyi'eee')"
After: "reason": "error occurred during batching: ORA-00917: missing comma\n, error code: 17081",
When the Generic JDBC - Insert Snap is integrated with the PostgreSQL database and you attempt to insert a String value into an Int column, the error message and reason change.
Before: "error": "Invalid long integer: bigintcol", "reason": "The string value is not a long integer or too large for a long integer"
After: "error": "Error converting value", "reason": "Error converting value bigintcol to type class java.lang.Long"
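The new reason mirrors a client-side numeric conversion failing before the value reaches the database. A minimal sketch of the same failure mode follows; the helper is hypothetical, not Snap Pack code:

```java
// Minimal reproduction of the failure mode behind the new PostgreSQL error:
// a non-numeric string cannot be converted to a Long on the client side.
public class LongConversionSketch {
    public static String tryConvert(String value) {
        try {
            return "converted: " + Long.parseLong(value);
        } catch (NumberFormatException e) {
            return "Error converting value " + value + " to type class java.lang.Long";
        }
    }
}
```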