FAQ—JDBC jOOQ Upgrade from v3.9.1 to v3.17.x

Overview

We are rolling out an important update to the JDBC Snap Pack: an upgrade of the jOOQ library to version 3.17.x, which includes several improvements, performance optimizations, and key changes that may impact your current integrations with SnapLogic. This FAQ explains the reasons for the update, the expected impact, the upgrade timeline, the affected databases, and the expected behavior changes.

What is happening?

The jOOQ library for the JDBC Snap Pack is being upgraded from v3.9.1 to v3.17.x.

Why are we upgrading the JDBC jOOQ library?

To improve security, compatibility, performance, stability, and reliability.

What is the timeline for this upgrade?

The upgrade to jOOQ 3.17.x for the JDBC Snap Pack will be rolled out according to the following timeline:

  • Early Availability phase/Latest Release: This phase allows early adopters to test the new version and provide feedback. You can choose to use the latest version of the Snap Pack for testing and report any issues you encounter.

  • General Availability (GA)/Stable Release: The full upgrade to jOOQ 3.17.x will be available to all with the GA release on February 12, 2025. The stable version is then pushed to all production environments.

Will I be impacted?

Yes, you may be impacted if you use the latest version. If you prefer not to proceed with the jOOQ upgrade, you can revert to the previous working version of the Snap Pack.

What are the key areas affected by this upgrade?

  • SQL generation and parsing

  • Database connection pooling

  • Query performance and execution time

  • Error and exception handling

  • Compatibility with some legacy database features

Which databases were tested for the jOOQ upgrade?

Given the wide variety of databases in use, we have tested against only specific databases. However, we recommend testing your pipelines in your lower-level environments (development or QA) to verify compatibility and functionality with your specific database before planning the upgrade in your production environment.

  • MySQL
  • PostgreSQL
  • Redshift
  • Oracle
  • Snowflake
  • SQL Server
  • MariaDB
  • DB2
  • Informix
  • Cassandra
  • Teradata
  • Sybase ASE
  • Denodo
  • Vertica
  • SAP HANA

We recommend that you review your pipelines and check the Snaps downstream of JDBC Snaps for any of the affected scenarios. Update any dependencies that the downstream Snaps have on the changed errors and outputs (especially in expressions) so that they conform to the new expected errors and outputs.

Do I need to reconfigure my accounts?

Yes, you need to reconfigure your accounts if the current driver (JAR) is incompatible with the upgraded jOOQ version (v3.17.x). For more details, refer to the Potential Breaking Changes section below.
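If you are not sure which driver JAR a pipeline currently loads, one quick check outside of SnapLogic is to ask the driver itself through standard JDBC metadata. The following is only a minimal sketch; the PostgreSQL URL, host, database name, and credentials are placeholders, not values from this document.

```java
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;

public class DriverVersionCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder URL and credentials; point these at your own database.
        String url = "jdbc:postgresql://db.example.com:5432/sales";
        try (Connection conn = DriverManager.getConnection(url, "app_user", "app_password")) {
            DatabaseMetaData meta = conn.getMetaData();
            // Reports the driver name and version actually loaded, which you can
            // compare against the certified driver list in the next section.
            System.out.println(meta.getDriverName() + " " + meta.getDriverVersion());
        }
    }
}
```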

What are the recommended JDBC drivers (JARs) for various databases?

The latest certified and recommended drivers, with any dependencies, are:

  • SQL Server: mssql-jdbc-11.2.0.jre11.jar. For this driver version, you must set the URL property trustServerCertificate=true in the Generic Database Account (see the connection sketch after this list).
  • MySQL: mysql-connector-java-5.1.47.jar (for MySQL 5.x.x server versions) or mysql-connector-j-8.0.17.jar (for MySQL 8.x.x server versions)
  • Oracle: ojdbc10.jar
  • Snowflake: snowflake-jdbc-3.13.25.jar
  • Redshift: RedshiftJDBC42-1.1.17.1017.jar
  • PostgreSQL: postgresql-42.7.2.jar
  • Vertica: vertica-jdbc-11.0.0-0.jar
  • Teradata: driver version 17.20.12
  • Cassandra: driver version v4.11.1
  • MariaDB: mariadb-java-client-2.6.0.jar
  • Informix: ifxjdbc.jar
  • DB2: db2jcc4.jar
  • SAP HANA: ngdbc-2.8.14.jar
  • IBM Db2 for i: jt400-20.0.7.jar
  • Sybase: jconn4.jar
  • Amazon Athena: AthenaJDBC42_2.0.24.1000.jar
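As noted for SQL Server above, trustServerCertificate=true is a JDBC URL property. If you want to confirm a driver and URL combination outside of SnapLogic before updating the Generic Database Account, a minimal Java sketch along these lines can help; the host, database name, and credentials are placeholders.

```java
import java.sql.Connection;
import java.sql.DriverManager;

public class SqlServerUrlCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder host, database, and credentials; adjust for your environment.
        // Recent mssql-jdbc versions encrypt connections by default, which is why
        // trustServerCertificate=true is appended to the URL, mirroring the URL
        // property required in the Generic Database Account.
        String url = "jdbc:sqlserver://db.example.com:1433;"
                + "databaseName=sales;encrypt=true;trustServerCertificate=true";
        try (Connection conn = DriverManager.getConnection(url, "app_user", "app_password")) {
            System.out.println("Connected: " + !conn.isClosed());
        }
    }
}
```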

Potential Breaking Changes

  • After the jOOQ library upgrade, we recommend that you use the SnapLogic-certified drivers for your target databases. If you do not use the certified drivers, previously functioning pipelines might be impacted.

  • Additionally, we highly recommend that you migrate to dedicated Database Snap Packs (for example, Oracle, SQL Server, MySQL, Redshift) instead of using the Generic JDBC Snap Pack for your integration needs.

Behavior Changes

The following lists the changes in the output messages:

  • Generic JDBC - Execute (all databases): when calling stored procedures.
    Old (v3.9.1) output: Message: Success
    New (v3.17.x) output: Message: Success, plus $UPDATE_COUNT=-1, 0, or 1 (based on the Snap Pack behavior) in the output (see the JDBC sketch after this list).

  • Generic JDBC - Select and Generic JDBC - Execute (IBM DB2): when the Snaps integrate with DB2 for the XML data type.
    Old (v3.9.1) output:
    {
      "DATA": {
        "AS400JDBCSQLXMLLocator": {
          "binaryStream": { "AS400JDBCInputStream": {} },
          "string": "<root><element>Value</element></root>",
          "characterStream": {
            "ConvTableReader": {
              "ccsid": 1208,
              "byteCacheSize": 2562,
              "cacheSize": 1024,
              "encoding": "UTF-8"
            }
          }
        }
      }
    }
    New (v3.17.x) output:
    { "DATA": "<root><element>Value</element></root>" }

  • Generic JDBC - Select and Generic JDBC - Execute (PostgreSQL): when the Snaps integrate with PostgreSQL for the XML data type.
    Old (v3.9.1) output:
    "data": {
      "PgSQLXML": {
        "binaryStream": { "ByteArrayInputStream": {} },
        "characterStream": { "StringReader": {} },
        "string": "<book><title>PostgreSQL Essentials</title><author>John Doe</author></book>"
      }
    }
    New (v3.17.x) output:
    "data": "<book><title>PostgreSQL Essentials</title><author>John Doe</author></book>"

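For context on the new $UPDATE_COUNT field in the stored-procedure output above: plain JDBC reports the result of a call through Statement.getUpdateCount(), which returns the number of affected rows, or -1 when the current result is a ResultSet or there is no result. The sketch below illustrates that JDBC behavior only; the procedure name and connection setup are hypothetical, and the exact value the Snap emits still depends on the Snap Pack behavior, as noted above.

```java
import java.sql.CallableStatement;
import java.sql.Connection;

public class UpdateCountDemo {
    // "update_inventory" and the supplied Connection are placeholders.
    static void callProcedure(Connection conn) throws Exception {
        try (CallableStatement stmt = conn.prepareCall("{call update_inventory(?)}")) {
            stmt.setInt(1, 42);
            stmt.execute();
            // Per JDBC, getUpdateCount() returns the affected row count, or -1 when
            // the current result is a ResultSet or there are no more results: the
            // same -1 / 0 / 1 values surfaced as $UPDATE_COUNT in the Snap output.
            System.out.println("UPDATE_COUNT = " + stmt.getUpdateCount());
        }
    }
}
```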
The following lists the changes in the error messages:

  • Generic JDBC - Execute (Sybase - Execute): when the Snap inserts out-of-range values for int types such as BigInt, SmallInt, and TinyInt.
    Old (v3.9.1) error: error=Batch operation failed, reason=Arithmetic overflow during implicit conversion of NUMERIC value '922337203685478' to an int field. , error code: 0, SQL state: 22001, next exception: Arithmetic overflow during implicit conversion of NUMERIC value '922337203685478.5807' to a MONEY field. , resolution=Please check for valid Snap properties and input data.
    New (v3.17.x) error: error=Batch operation failed, reason=Data truncation, error code: 0, SQL state: 22001, next exception: Arithmetic overflow during implicit conversion of NUMERIC value '922337203685478' to an int field. , resolution=Please check for valid Snap properties and input data. Reason: Expected and input data are not equal, Resolution: Please check expected and input data.

  • Generic JDBC - Execute (Oracle - Execute): when you pass a special character (') within single quotes ('') (a parameter-binding alternative is sketched after this list).
    Old (v3.9.1) error: "reason": "Invalid query: INSERT INTO \"enron\".\"EMP7400\"(NAME) VALUES ('windyi'eee')
    New (v3.17.x) error: reason=You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '"enron"."EMP7400"(NAME) VALUES ('windyi'eee')' at line 1, error code: 1064, SQL state: 42000

  • Generic JDBC - Execute (MySQL - Execute): when you pass a special character (') within single quotes ('').
    Old (v3.9.1) error: "reason": "Invalid query: INSERT INTO \"enron\".\"EMP7400\"(NAME) VALUES ('windyi'eee');"
    New (v3.17.x) error: reason=You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '"enron"."EMP7400"(NAME) VALUES ('windyi'eee')' at line 1, error code: 1064, SQL state: 42000

  • Generic JDBC - Execute (Redshift - Execute): when you pass a special character (') within single quotes ('').
    Old (v3.9.1) error: "reason": "Invalid query: insert into singlequote values (101,'na'me')
    New (v3.17.x) error: "reason": "Invalid query: This may be raised due to limitations in the JDBC driver or syntax errors in the SQL statement",

  • Generic JDBC - Insert (SQL Server - Insert): when the Snap has a datatype mismatch.
    Old (v3.9.1) error: reason=Error occurred while converting the value of bigintcol field, error=Invalid value for number field definition
    New (v3.17.x) error: reason=Error converting value bigintcol to type class java.lang.Long, error=Error converting value

  • Generic JDBC - Insert (Oracle - Insert): when the Snap has a datatype mismatch.
    Old (v3.9.1) error: reason=ORA-01438: value larger than specified precision allowed for this column , error code: 1438, SQL state: 22003, error=SQL operation failed
    New (v3.17.x) error: [{"reason":"The string value is not a number","error":"Invalid integer: value10"}]

  • Generic JDBC - Insert (SAP HANA - Insert): when the Snap has a datatype mismatch.
    Old (v3.9.1) error: reason=The string value is not a long integer or too large for a long integer, error=Invalid long integer: bigintcol
    New (v3.17.x) error: reason=SQL [null]; SAP DBTech JDBC: Cannot convert data bigintcol to type long., error=SQL operation failed

  • Generic JDBC - Insert (Vertica - Insert): when you attempt to insert null and there is a datatype mismatch.
    Old (v3.9.1) error: reason=Error converting value bigintcol to type class java.lang.Integer, error=Error converting value
    New (v3.17.x) error: reason=[Vertica][VJDBC](3681) ERROR: Invalid input syntax for integer: "bigintcol", error code: 3681, SQL state: 22V02, error=Batch operation failed

  • Generic JDBC - Insert (PostgreSQL): when you integrate with the PostgreSQL database and attempt to insert a String value into an Int type.
    Old (v3.9.1) error: "error": "Invalid long integer: bigintcol", "reason": "The string value is not a long integer or too large for a long integer"
    New (v3.17.x) error: "error": "Error converting value", "reason": "Error converting value bigintcol to type class java.lang.Long"

  • Generic JDBC - Insert (DB2): when you integrate DB2 and JDBC Snaps with a char value that exceeds the prescribed range.
    Old (v3.9.1) error: SQLCODE=-433
    New (v3.17.x) error: SQLCODE=-302

  • Generic JDBC - Insert (Sybase - Insert): when you insert a value beyond the range of a signed Int.
    Old (v3.9.1) error: error=Batch operation failed, reason=Data truncation, error code: 0, SQL state: 22001, next exception: Arithmetic overflow during implicit conversion of DECIMAL value '2147483648' to a INT field.
    New (v3.17.x) error: error=Number overflow: 2147483648, reason=Error occurred while converting the value of int_signed_col field, resolution=Please correct the value to match the expected data type, Reason: Expected and input data are not equal

  • Generic JDBC - Insert (Sybase - Insert): when you insert a varchar value into an Int type.
    Old (v3.9.1) error: error=Batch operation failed, reason=Implicit conversion from datatype 'VARCHAR' to 'INT' is not allowed. Use the CONVERT function to run this query. , error code: 257, SQL state: 37000, resolution=Please check for valid Snap properties and input data.
    New (v3.17.x) error: error=Error converting value, reason=Error converting value P to type class java.lang.Integer, resolution=Please provide input data in the appropriate data type., Reason: Expected and input data are not equal, Resolution: Please check expected and input data.

  • Generic JDBC - Insert (Sybase - Insert): when you insert a varchar value into a date type.
    Old (v3.9.1) error: error=Batch operation failed, reason=Syntax error during implicit conversion of VARCHAR value 'Hello' to a DATETIME field. , error code: 249, SQL state: S1000, resolution=Please check for valid Snap properties and input data.
    New (v3.17.x) error: error=Error converting value, reason=Error converting value Hello to type class java.sql.Timestamp, resolution=Please provide input data in the appropriate data type. Reason: Expected and input data are not equal, Resolution: Please check expected and input data.

  • Generic JDBC - Insert (Sybase - Insert): when you insert a varchar value into a TinyInt Signed type.
    Old (v3.9.1) error: error=Error converting value, reason=Error converting value P to type class java.lang.Byte, resolution=Please provide input data in the appropriate data type.
    New (v3.17.x) error: error=Error converting value, reason=Error converting value P to type class org.jooq.types.UByte, resolution=Please provide input data in the appropriate data type. Reason: Expected and input data are not equal, Resolution: Please check expected and input data.

  • Generic JDBC - Execute and Generic JDBC - Insert (Sybase - Execute): when the Generic JDBC - Execute Snap inserts an out-of-range positive value for the money type.
    Old (v3.9.1) error: reason=Arithmetic overflow during implicit conversion of NUMERIC value '922337203685478.5807' to a MONEY field . , error code: 0, SQL state: 22001, next exception: Arithmetic overflow during implicit conversion of NUMERIC value '922337203685478.5807' to a MONEY field
    New (v3.17.x) error: reason=Data truncation, error code: 0, SQL state: 22001, next exception: Arithmetic overflow during implicit conversion of NUMERIC value '922337203685478.5807' to a MONEY field.
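Several of the single-quote scenarios above stem from literals such as 'windyi'eee' being embedded directly in the SQL text. Whatever error message the driver now returns, binding the value as a parameter avoids the quoting problem altogether. The following is only an illustrative sketch; the table name EMP7400 is reused from the examples above and the connection is assumed to exist.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;

public class QuoteSafeInsert {
    // EMP7400 mirrors the table in the error examples; the Connection is assumed.
    static void insertName(Connection conn, String name) throws Exception {
        // Binding the value lets the driver handle embedded single quotes
        // (for example, "windyi'eee") without manual escaping.
        String sql = "INSERT INTO EMP7400 (NAME) VALUES (?)";
        try (PreparedStatement stmt = conn.prepareStatement(sql)) {
            stmt.setString(1, name);
            stmt.executeUpdate();
        }
    }
}
```

If the literal must stay inline in the SQL text, the standard SQL escape is to double the embedded quote (for example, 'windyi''eee').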

Here’s a summary of the specific behavior changes in the Snap Pack that affect the output, success, and failure messages.

  • Previously, when you ran multiple statements in the Generic JDBC - Execute Snap, only the reason appeared in the error. If your target database is Snowflake, the error reason now includes the Snowflake exception class at the beginning.
    Before: reason=Actual statement count 2 did not match the desired statement count 1
    After: reason=net.snowflake.client.jdbc.SnowflakeSQLException: Actual statement count 2 did not match the desired statement count 1

  • When the Generic JDBC - Insert Snap inserts a null value into Vertica and there is a data type mismatch, the error message has changed.
    Before: reason=Error converting value bigintcol to type class java.lang.Integer, error=Error converting value
    After: reason=[Vertica][VJDBC](3681) ERROR: Invalid input syntax for integer: "bigintcol", error code: 3681, SQL state: 22V02, error=Batch operation failed

  • Previously, when you integrated DB2 and JDBC Snaps with a char value that exceeds the prescribed range, the error was SQLCODE=-433. Now, the error is SQLCODE=-302.
    Before: SQLCODE=-433
    After: SQLCODE=-302

  • Previously, when you called stored procedures, the Execute Snap displayed only the success message in the output. Now, the Snap displays both the status and the message in the output for all databases.
    Before: Message: Success
    After: $UPDATE_COUNT=-1, 0, or 1 (based on the Snap Pack behavior) in the output.

  • Previously, when the Generic JDBC - Select and Generic JDBC - Execute Snaps were integrated with PostgreSQL or IBM DB2, the metadata was displayed along with the XML data. Now, you receive only the XML data.
    Before: metadata + XML data
    After: XML data

  • When the Generic JDBC - Insert and Generic JDBC - Execute Snaps are integrated with the Sybase database and attempt to insert an out-of-range negative value for the money type, the error reason has changed slightly.
    Before: reason=Arithmetic overflow during implicit conversion of NUMERIC value
    After: reason=Data truncation

  • When the Generic JDBC - Insert and Generic JDBC - Execute Snaps are integrated with the Sybase database and attempt to insert out-of-range values for int types such as BigInt, SmallInt, and TinyInt, the error reason has changed slightly.
    Before: reason=Arithmetic overflow during implicit conversion of NUMERIC value
    After: reason=Data truncation

  • When the Generic JDBC - Insert Snap is integrated with the Sybase database and a varchar value is inserted into an int column, the error message and reason have changed slightly.
    Before: error=Batch operation failed, reason=Implicit conversion from datatype 'VARCHAR' to 'INT' is not allowed. Use the CONVERT function to run this query. , error code: 257, SQL state: 37000, resolution=Please check for valid Snap properties and input data.
    After: error=Error converting value, reason=Error converting value P to type class java.lang.Integer, resolution=Please provide input data in the appropriate data type., Reason: Expected and input data are not equal, Resolution: Please check expected and input data

  • When the Generic JDBC - Insert Snap is integrated with the Sybase database and a varchar value is inserted into a date column, the error message and reason have changed slightly.
    Before: error=Batch operation failed, reason=Syntax error during implicit conversion of VARCHAR value 'Hello' to a DATETIME field. , error code: 249, SQL state: S1000, resolution=Please check for valid Snap properties and input data.
    After: error=Error converting value, reason=Error converting value Hello to type class java.sql.Timestamp, resolution=Please provide input data in the appropriate data type. Reason: Expected and input data are not equal, Resolution: Please check expected and input data.

  • When the Generic JDBC - Insert Snap is integrated with the Sybase database and a varchar value is inserted into a tinyint column, the error reason has changed slightly.
    Before: reason=Error converting value P to type class java.lang.Byte
    After: reason=Error converting value P to type class org.jooq.types.UByte

  • When the Generic JDBC - Execute Snap is integrated with the Oracle, MySQL, PostgreSQL, or Redshift databases and you pass a special character (') within a quoted value, the error reason has changed slightly.
    Example: Insert into table(ids) values ('windyi'eee')
    Before: "reason": "Invalid query: Insert into TECTONIC.SNPQA2055 (ids) values ('windyi'eee')"
    After: "reason": "error occurred during batching: ORA-00917: missing comma\n, error code: 17081",

  • When the Generic JDBC - Insert Snap is integrated with the PostgreSQL database, and you attempt to insert a String value into Int types, the error message and reason change.
    Before: "error": "Invalid long integer: bigintcol", "reason": "The string value is not a long integer or too large for a long integer"
    After: "error": "Error converting value", "reason": "Error converting value bigintcol to type class java.lang.Long"