Kafka OAuth2 Account

This page is no longer maintained (Nov 12, 2025). For the most current information, go to https://docs.snaplogic.com/snaps/snaps-data/sp-kafka/kafka-oauth2-acct.html.

Overview

Use this account type to connect Kafka Snaps to Kafka clusters that authenticate with OAuth2.

Prerequisites

A registered OAuth application in the Confluent portal with appropriate permissions, which provides the following:

  • Client ID

  • Client secret

  • OAuth2 token endpoint

  • Scope

Limitations and Known Issues

None.

Account Settings

  • Asterisk (*): Indicates a mandatory field.

  • Suggestion icon: Indicates a list that is dynamically populated based on the configuration.

  • Expression icon: Indicates the value is an expression (if enabled) or a static value (if disabled). Learn more about Using Expressions in SnapLogic.

  • Add icon: Indicates that you can add fields in the field set.

  • Remove icon: Indicates that you can remove fields from the field set.

Field Name

Field Type

Description

Label*

String

Specify a unique label for the account.

Default Value: Kafka OAuth2 Account
Example: Kafka OAuth2 Account

Bootstrap servers*

Use this field set to specify the initial list of Kafka broker addresses that a Kafka client connects to during its initial bootstrap process.

Bootstrap server

String/Expression

Specify the host:port pairs to establish the initial connection to the Kafka cluster.

Default Value: N/A
Example: localhost:9092
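
The comma-separated host:port format can be illustrated with a small parsing sketch; the broker hostnames below are placeholders, not values from this account:

```python
# Minimal sketch: split a comma-separated Bootstrap servers value into
# (host, port) pairs. The hostnames are placeholders.
def parse_bootstrap_servers(value):
    pairs = []
    for entry in value.split(","):
        host, _, port = entry.strip().rpartition(":")
        pairs.append((host, int(port)))
    return pairs

parse_bootstrap_servers("broker1.example.com:9092, broker2.example.com:9092")
```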

Schema registry URL

String/Expression

Specify the schema registry server URL.

Default Value: N/A
Example: http://ec2-55-334-44-58.compute-1.amazonaws.com:8000

Advanced Kafka properties

Use this field set to specify any additional Kafka properties for connecting to the Kafka server that are not specifically provided in the Confluent Kafka Snaps.

To connect to a Confluent Cloud Kafka cluster instance using OAuth/OIDC credentials, you must define SASL extension properties for the pool ID of the Identity Pool and the ID of the Kafka cluster. Each extension property name must be lowercase and begin with extension_.

kafka-oauth2-adv-prop-ssl.png
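
The SASL extension entries above correspond to Key/Value rows in this field set. A minimal sketch, assuming the lowercase extension_ naming convention described above; the pool and cluster IDs are placeholder values, and the exact property names your cluster expects may differ:

```python
# Illustrative Key/Value rows for the Advanced Kafka properties field set.
# Both IDs below are placeholders, not real values.
advanced_properties = {
    "extension_identitypoolid": "pool-b6Yd",   # pool ID of the Identity Pool
    "extension_logicalcluster": "lkc-12345",   # ID of the Kafka cluster
}

# Every extension property name must be lowercase and begin with "extension_".
assert all(k == k.lower() and k.startswith("extension_")
           for k in advanced_properties)
```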

Key

String/Expression

Specify the key for any Kafka parameters that are not specifically supported by the Snaps.

Default Value: N/A
Example: max.message.size

Value

String/Expression/Integer

Specify the value for the corresponding key that is not specifically supported by the Snaps.

Default Value: N/A
Example: 5 MB

Security protocol

String/Expression/Suggestion

Select the security protocol from the dropdown list. The available options are:

  • SASL_SSL

  • SASL_PLAINTEXT

Default Value: SASL_PLAINTEXT
Example: SASL_SSL

Client ID*

String

Specify the client ID created during the application registration process. This ID enables the application to sign in to an identity provider, such as Azure Active Directory. The application ID, also known as the client ID, uniquely identifies your application. Learn more about the application configuration process in the Azure portal.

Default Value: N/A
Example: 8231b8a-jbc8-128-73ce-d021j2b279c8

Client secret*

String

Specify the client secret, which your application uses to securely acquire tokens. You can create the client secret by following the application provider's steps.

Default Value: N/A
Example: Value is encrypted

Scope

String/Expression

Specify the scope to provide a way to manage permissions to protected resources, such as your web API. Learn more about how to set up OAuth2 credentials in the Azure portal.

Default Value: N/A
Example: api://e0af525c-c373-44bc-ac99-5f5a-2782268d/default

OAuth2 token endpoint*

String/Expression

Specify the token endpoint to get the access token.

Default Value: N/A
Example: https://login.microsoftonline.com/2060acfg-89d9-423d-9514-eac46338ec05/oauth2/v2.0/token
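
As a rough sketch of what happens behind this field, the account exchanges the Client ID, Client secret, and Scope for an access token via an OAuth2 client-credentials request to this endpoint. The endpoint URL, credentials, and scope below are placeholders:

```python
import urllib.parse
import urllib.request

# Placeholder endpoint; substitute your identity provider's token endpoint.
TOKEN_ENDPOINT = "https://login.example.com/oauth2/v2.0/token"

def build_token_request(client_id, client_secret, scope):
    """Build the form-encoded OAuth2 client-credentials grant request."""
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": scope,
    }).encode()
    return urllib.request.Request(
        TOKEN_ENDPOINT,
        data=body,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
        method="POST",
    )

# On success, the endpoint responds with JSON containing an access token.
req = build_token_request("my-client-id", "my-client-secret",
                          "api://my-api/.default")
```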

Keystore filepath

String/Expression

Appears when the Security protocol is SASL_SSL.

Specify the keystore file location of the client.

Default Value: N/A
Example: server.keystore.jks

Keystore file password

String/Expression

Appears when the Security protocol is SASL_SSL.

Specify the keystore password to access the keystore file of the client.

Default Value: N/A
Example: KsP@ssw0rd123!

SSL key password

String/Expression

Appears when the Security protocol is SASL_SSL.

Specify the SSL key password.

Default Value: N/A
Example: SslK3yP@ssw0rd!

Truststore filepath

String/Expression

Appears when the Security protocol is SASL_SSL.

Specify the truststore file location of the client.

Default Value: N/A
Example: server.truststore.jks

Truststore password

String/Expression

Appears when the Security protocol is SASL_SSL.

Specify the password to access the truststore file, if used.

Default Value: N/A
Example: Value is encrypted

Schema registry authentication

Appears when you specify the Schema registry URL.

Use this field set to configure the schema registry details for authentication.

Registry cluster ID*

String/Expression

The Registry cluster ID uniquely identifies a Schema registry instance.

Specify the Cluster ID of the Schema Registry. The cluster ID begins with lsrc- (Logical Schema Registry Cluster). Learn more.

Default Value: N/A
Example: lsrc-Dfc93Xc9TzK5ZC6X0k7

Identity pool ID

String/Expression

A group of identities that are allowed to authenticate and interact with the Schema Registry.

Specify the ID of the Identity Pool with permissions (read/write/delete schema) to access the Schema Registry.

If this field is left blank, the account looks for an Identity Pool ID defined as a SASL extension property in the Advanced Kafka properties. If it finds one, it uses that as the identity pool. Learn more about creating an Identity Pool ID.

Default Value: N/A
Example: pool-b6Yd
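
The fallback behavior described above can be sketched as follows. This is not SnapLogic's actual implementation, and the extension property name is an assumption based on the lowercase extension_ convention described for the Advanced Kafka properties:

```python
# Illustrative sketch of the Identity Pool ID fallback: prefer the field
# value, otherwise look for a SASL extension property among the Advanced
# Kafka properties. The property name below is an assumption.
def resolve_identity_pool(field_value, advanced_properties):
    if field_value:
        return field_value
    return advanced_properties.get("extension_identitypoolid")

# The field value wins when both are present:
resolve_identity_pool("pool-b6Yd", {"extension_identitypoolid": "pool-Xy12"})
```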

 

Client ID

String/Expression

Specify the Client ID obtained from the OAuth/OIDC identity provider, used for authenticating to the Schema Registry.

Default Value: N/A
Example: LKC1234567890

Client secret

String/Expression

Specify the Client secret obtained through the client credentials grant from the OAuth/OIDC identity provider, used for authenticating to the Schema Registry. This value is encrypted.

Default Value: N/A
Example: abcdEfghIjklMnopQrStUvwxYz1234567890+/=

Scope

String/Expression

Specify the access token scope used to obtain a client credentials grant from an OAuth/OIDC identity provider.

Default Value: N/A
Example: api://trigger-task/.default

OAuth2 token endpoint

String/Expression

Specify the OAuth2 token endpoint to which you need to connect.

Default Value: N/A
Example: http://keycloak:8080/realms/cp/protocol/openid-connect/token

Troubleshooting

Error

Reason

Resolution

Failed to validate account

Invalid credentials for Schema Registry or connection failure.

Provide a valid registry URL.

Failed to retrieve OAuth token for Schema Registry.

The account failed to fetch an OAuth token for the Schema Registry.

Ensure that the Schema Registry is running and reachable by the Snaplex.

Client ID or secret is not defined

Invalid client credentials for Schema Registry.

Specify values for Schema Registry client credentials.

When you provide an invalid Schema registry URL, the account fails to validate.

negative-scenario.png

When you provide an invalid OAuth2 token endpoint URL in the Schema Registry authentication section, the account fails to validate.

invalid-oauth-token-endpoint.png


Example

Publish and consume messages in Kafka using Schema Registry authentication

This pipeline demonstrates a Kafka message flow using Schema Registry authentication. It creates sample user data, sends it to a Kafka topic using Avro serialization, extracts metadata information, and then consumes the message from the same topic to retrieve the original data along with Kafka metadata.

ex-schema-registry-pipe-overview.png

Pipeline summary:

  • JSON Generator: Generates sample JSON content containing user information with firstName "vim", lastName "snap", and age 1.

  • Kafka Producer: Publishes the JSON data to test-topic with Avro serialization for both key and value, using leader acknowledgement.

  • Mapper: Extracts and maps Kafka message metadata including offset, topic, and partition information from the producer output.

  • Kafka Consumer: Consumes messages from the Kafka topic using the metadata captured from the producer, seeking to the specific offset and partition with string deserialization.

  • Mapper: Extracts and maps the consumed message data including key, value, and topic information from the Kafka consumer output.
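
The metadata handoff between the producer and consumer steps above can be sketched as a pure function. The document shape below is an assumption for illustration, not the exact output format of the Kafka Producer Snap:

```python
# Illustrative sketch of the first Mapper step: extract the offset, topic,
# and partition from a producer output document so the Kafka Consumer can
# seek to that exact message. The document shape is assumed.
def extract_metadata(producer_output):
    meta = producer_output["metadata"]
    return {
        "offset": meta["offset"],
        "topic": meta["topic"],
        "partition": meta["partition"],
    }

doc = {"metadata": {"offset": 42, "topic": "test-topic", "partition": 0}}
result = extract_metadata(doc)
```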

Configure the Kafka OAuth2 Account as follows:

ex-schema-registry-config.png

With this configuration, the account validates successfully.
