Generic Hive Database Account

Overview

You can use this account type to connect Hive Snaps with data sources that use a generic Hive database.

Prerequisites

  • A Hive account.

Limitations

  • The Hive Snap Pack does not validate with Apache Hive JDBC v1.2.1 jars or earlier because of a defect in Hive. HDP 2.6.3 and HDP 2.6.1 run on Apache Hive JDBC v1.2.1 jars.

  • To validate Snaps that must work with HDP 2.6.3 and HDP 2.6.1, use JDBC v2.0.0 jars.

Known Issues

  • "Method not supported" error while validating Apache Hive JDBC v1.2.1 or earlier

Account Settings


 

  • Asterisk (*): Indicates a mandatory field.

  • Suggestion icon: Indicates a list that is dynamically populated based on the configuration.

  • Expression icon: Indicates whether the value is an expression (if enabled) or a static value (if disabled). Learn more about Using Expressions in SnapLogic.

  • Add icon: Indicates that you can add fields in the fieldset.

  • Remove icon: Indicates that you can remove fields from the fieldset.

Field Name

Field Type

Description

Label*

 

Default Value: N/A
Example: Generic Hive Database Account

String

Specify a unique label for the account.

Account properties*

Username

 

Default Value: N/A
Example: Snapuser 

String/Expressions

Specify the username that is allowed to connect to the database. This username is used as the default when retrieving connections and must be valid to set up the data source.

Password

 

Default Value: N/A
Example: Sn@pUser.3

String/Expressions

Specify the password used to connect to the data source. This password is used as the default when retrieving connections and must be valid to set up the data source.

JDBC URL

 

Default Value: N/A
Example: jdbc:hive://hostname/dbname:sasl.qop=auth-int

String/Expression

Specify the URL of the JDBC database.
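As a point of reference, the sketch below shows how such a URL is typically used directly with JDBC. The host, port, and database names are hypothetical, and the exact URL shape (for example, the jdbc:hive2:// prefix and semicolon-separated session properties used by the Apache driver) depends on the driver JARs you configure.

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class HiveJdbcUrlSketch {
        public static void main(String[] args) throws Exception {
            // Hypothetical HiveServer2 endpoint; the Apache driver typically uses the
            // jdbc:hive2:// prefix and session properties such as sasl.qop.
            String url = "jdbc:hive2://hive-host.example.com:10000/default;sasl.qop=auth-int";

            // The account's Username and Password fields correspond to the last two arguments.
            try (Connection conn = DriverManager.getConnection(url, "Snapuser", "Sn@pUser.3")) {
                System.out.println("Connected: " + !conn.isClosed());
            }
        }
    }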

JDBC JARs

Use this fieldset to specify the JDBC JAR files to be loaded. Each driver binary must have a unique name; the same name cannot be reused for a different driver. If this fieldset is left blank, a default JDBC driver is loaded.

Add the following JDBC JAR files to configure the Generic Hive Database account for your cluster.

For HDP

  • hive-jdbc-2.0.0.2.3.5.0-81-standalone.jar

  • zookeeper-3.4.6.jar (Use this for setting up Hive with Zookeeper)

For CDH

  • hive_metastore.jar

  • hive_service.jar

  • HiveJDBC4.jar

  • libfb303-0.9.0.jar

  • libthrift-0.9.0.jar

  • TCLIServiceClient.jar

  • zookeeper-3.3.6.jar (Use this for setting up Hive with Zookeeper)

For CDP/CDW

  • HiveJDBC42.jar

  • The JDBC driver can be uploaded through Designer or Manager and is stored on a per-project basis. That is, only users with access to that project can see the uploaded JDBC drivers. To provide access to all users of your org, place the driver in the /shared project.

  • See Advanced Configurations: Configuring Hive with Kerberos section below for a list of JAR files to be uploaded when configuring Hive with Kerberos.

JDBC Driver Class*

 

Default Value: org.apache.hive.jdbc.HiveDriver
Example: com.cloudera.hive.jdbc4.HS2Driver

String

Specify the JDBC Driver class name. 

For HDP Clusters: Enter the following value: org.apache.hive.jdbc.HiveDriver

For CDH/CDP Clusters: Enter the following value: com.cloudera.hive.jdbc4.HS2Driver
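As an illustration of what this class name refers to, the following sketch registers the driver class explicitly before opening a JDBC connection. The class name must match the driver JARs uploaded above; the URL and credentials here are hypothetical.

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class HiveDriverClassSketch {
        public static void main(String[] args) throws Exception {
            // Register the driver class named in the account:
            //   org.apache.hive.jdbc.HiveDriver      for HDP (Apache Hive JDBC JARs)
            //   com.cloudera.hive.jdbc4.HS2Driver    for CDH/CDP (Cloudera JDBC JARs)
            Class.forName("org.apache.hive.jdbc.HiveDriver");

            try (Connection conn = DriverManager.getConnection(
                    "jdbc:hive2://hive-host.example.com:10000/default", "Snapuser", "Sn@pUser.3")) {
                System.out.println(conn.getMetaData().getDriverName());
            }
        }
    }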

Advanced properties

Auto commit

 

Default Value: Selected

Checkbox/Expressions

Select this checkbox to commit each batch immediately after it executes; if the Snap fails, only the currently executing batch is rolled back. If you deselect this checkbox, a transaction is started for the Snap's run and committed when the run succeeds; the transaction is rolled back if the Snap fails.

For example, in a DB Execute Snap, assume that a stream of documents enters the Snap's input view and the SQL statement property has JSON paths in the WHERE clause. If the number of documents is large, the Snap executes them in multiple batches rather than one per document, with each batch containing a certain number of WHERE clause values. If Auto commit is selected, a failure rolls back only the records in the current batch. If Auto commit is deselected, the entire operation is rolled back. For a single execute statement (with no input view), this setting has no practical effect.
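To make the two behaviors concrete, the sketch below shows the underlying JDBC semantics that the Auto commit and Batch size settings control. The table, column layout, and batch size of 50 are hypothetical, and Hive's own transaction support varies by version and table format; the point is only the rollback scope.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    public class AutoCommitSketch {
        static void load(Connection conn, boolean autoCommit) throws SQLException {
            conn.setAutoCommit(autoCommit);
            try (PreparedStatement ps = conn.prepareStatement("INSERT INTO events VALUES (?, ?)")) {
                for (int i = 0; i < 200; i++) {
                    ps.setInt(1, i);
                    ps.setString(2, "row-" + i);
                    ps.addBatch();
                    if ((i + 1) % 50 == 0) {   // flush every "Batch size" statements
                        ps.executeBatch();     // with autoCommit=true, this batch is committed here
                    }
                }
                ps.executeBatch();
                if (!autoCommit) {
                    conn.commit();             // single transaction: committed only on success
                }
            } catch (SQLException e) {
                if (!autoCommit) {
                    conn.rollback();           // rolls back the whole operation, not just one batch
                }
                throw e;
            }
        }
    }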

Batch size*

 

Default Value: 50
Example: 10

Integer/Expressions

Specify the number of statements to execute at a time. Using a large batch size could use up the JDBC placeholder limit of 2100.

 

 

Fetch size*

 

Default Value: 100
Example: 100

Integer/Expressions

Specify the number of rows to fetch at a time when executing a query. Large values could cause the server to run out of memory.
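For reference, this field maps to the standard JDBC setFetchSize hint, as in the sketch below (hypothetical endpoint and table): it limits how many rows the driver buffers per round trip to HiveServer2.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class FetchSizeSketch {
        public static void main(String[] args) throws Exception {
            String url = "jdbc:hive2://hive-host.example.com:10000/default";
            try (Connection conn = DriverManager.getConnection(url, "Snapuser", "Sn@pUser.3");
                 Statement stmt = conn.createStatement()) {
                stmt.setFetchSize(100);  // rows retrieved per round trip; larger values use more memory
                try (ResultSet rs = stmt.executeQuery("SELECT * FROM events")) {
                    while (rs.next()) {
                        // process rows one at a time; only about one fetch-size worth is buffered
                    }
                }
            }
        }
    }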

 

Max pool size*

 

Default Value: 50
Example: 10

Integer/Expressions

Specify the maximum number of idle connections a pool will maintain at a time.

 

Max lifetime (minutes)*

 

Default Value: 30
Example: 25

Integer/Expressions

Specify the number of minutes a connection can remain in the pool before it is destroyed.

 

Idle Timeout (minutes)*

 

Default Value: 5
Example: 4

Integer/Expressions

Specify the number of minutes for a connection to remain idle before a test query is run. This helps keep database connections from timing out.

Checkout timeout (milliseconds)*

 

Default Value: 10000
Example: 10000

Integer/Expressions

Specify the number of milliseconds to wait for a connection to be available in the pool. A value of 0 waits forever. If the set time elapses, an exception is thrown and the pipeline fails.
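The four pool settings above correspond to standard connection-pool concepts. SnapLogic does not document which pooling library backs these fields, so purely to illustrate what each field controls, here is how the equivalent knobs look in HikariCP (note that HikariCP takes lifetimes and timeouts in milliseconds, while the account fields use minutes for Max lifetime and Idle Timeout):

    import com.zaxxer.hikari.HikariConfig;
    import com.zaxxer.hikari.HikariDataSource;

    public class PoolSettingsSketch {
        public static void main(String[] args) {
            HikariConfig config = new HikariConfig();
            config.setJdbcUrl("jdbc:hive2://hive-host.example.com:10000/default");
            config.setUsername("Snapuser");
            config.setPassword("Sn@pUser.3");

            config.setMaximumPoolSize(50);           // Max pool size
            config.setMaxLifetime(30L * 60 * 1000);  // Max lifetime: 30 minutes, in milliseconds
            config.setIdleTimeout(5L * 60 * 1000);   // Idle Timeout: 5 minutes, in milliseconds
            config.setConnectionTimeout(10_000);     // Checkout timeout: 10,000 milliseconds

            try (HikariDataSource pool = new HikariDataSource(config)) {
                // connections are borrowed with pool.getConnection() and returned on close()
            }
        }
    }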

 

 

Url properties

 

Use this fieldset to specify properties to use in the JDBC URL. These properties must be configured when setting up an SSL connection. See the Advanced Configurations: Configuring Hive with SSL section below for details.

Url property name

 

Default Value: N/A
Example: maxAllowedPacket

String/Expressions

Specify a name for the URL property to be used by the account.

Url property value

 

Default Value: N/A
Example: 1000

String/Expressions

Specify a value for the URL property name.
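Each name/value pair added here ends up appended to the JDBC URL. The sketch below shows the general idea with hypothetical SSL-related properties; the exact separator and property names depend on the driver (the Apache Hive driver uses semicolon-separated key=value pairs), so follow the SSL section referenced above for the authoritative list.

    import java.util.LinkedHashMap;
    import java.util.Map;

    public class UrlPropertiesSketch {
        public static void main(String[] args) {
            // Hypothetical URL properties, as they might be entered in this fieldset.
            Map<String, String> urlProperties = new LinkedHashMap<>();
            urlProperties.put("ssl", "true");
            urlProperties.put("sslTrustStore", "/opt/certs/truststore.jks");

            StringBuilder url = new StringBuilder("jdbc:hive2://hive-host.example.com:10000/default");
            for (Map.Entry<String, String> p : urlProperties.entrySet()) {
                url.append(';').append(p.getKey()).append('=').append(p.getValue());
            }
            // jdbc:hive2://hive-host.example.com:10000/default;ssl=true;sslTrustStore=/opt/certs/truststore.jks
            System.out.println(url);
        }
    }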

 

Hadoop properties

Authentication method*

 

Default Value: None
Example: Kerberos

Dropdown list

Select the Authentication method to use when connecting to the Hadoop service.  

  • None: Allows connection even without the Username and Password

  • Kerberos: Allows connection with Kerberos details such as Client Principal, Keytab file, and Service principal

  • User ID: Allows connection with Username only

  • User ID and Password: Allows connection with Username and Password

 

Use Zookeeper 

 

Default Value: Deselected

Checkbox

Select this checkbox if you want the Snap to use Zookeeper to locate the Hadoop service instead of a specific hostname. When selected, Zookeeper resolves the location of the database instead of the hostname field in the standard block.

Zookeeper URL


Default Value: N/A
Example: hostname1:port,hostname2:port/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2

String

Specify the URL of the Zookeeper service. Zookeeper URL formats are different for CDH and HDP (see the sketch after this list).

  • For HDP:

    • Format: hostname1:port,hostname2:port/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2

    • Example: na77sl-ihdc-ux02011.clouddev.snaplogic.com:2181,na77sl-ihdc-ux02012.clouddev.snaplogic.com:2181,na77sl-ihdc-ux02013.clouddev.snaplogic.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2

  • For CDH:

    • Format: zk=hostname1:port,hostname2:port/hiveserver2

    • Example: jdbc:hive2://cdhclusterqa-1-1.clouddev.snaplogic.com:2181,cdhclusterqa-1-2.clouddev.snaplogic.com:2181,cdhclusterqa-1-3.clouddev.snaplogic.com:2181/hiveserver2
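As a rough sketch of how the HDP-style value is consumed, the Zookeeper ensemble simply replaces the fixed HiveServer2 host in the JDBC URL, and the serviceDiscoveryMode and zooKeeperNamespace properties tell the driver to look up an active HiveServer2 in Zookeeper (hostnames below are placeholders):

    public class ZookeeperUrlSketch {
        public static void main(String[] args) {
            // Zookeeper ensemble and discovery properties, as entered in the account (HDP format).
            String zookeeperUrl = "hostname1:2181,hostname2:2181,hostname3:2181"
                    + "/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2";

            // With service discovery, no fixed HiveServer2 hostname appears in the URL;
            // the driver resolves one of the registered HiveServer2 instances from Zookeeper.
            String jdbcUrl = "jdbc:hive2://" + zookeeperUrl;
            System.out.println(jdbcUrl);
        }
    }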

Hive properties

JDBC Subprotocol*

 

Default Value: Hive
Example: Impala

Dropdown list

Specify the JDBC Subprotocol to be used. This is required when the Authentication method is Kerberos. The available options are:

  • Hive

  • Impala

Kerberos properties

Use this fieldset to configure the information required for Kerberos authentication. These properties must be configured if you select Kerberos in the Authentication method field.

Client Principal

 

Default Value: N/A
Example: hiveclient@EXAMPLE.COM

String

Specify the principal used to authenticate to the Kerberos KDC (Key Distribution Center, the network service used by clients and servers for authentication).

Keytab File

 

Default Value: N/A
Example: /etc/krb5.keytab

String

Specify the Keytab file (file used to store encryption keys) used to authenticate to Kerberos KDC.

Service Principal

 

Default Value: N/A
Example:  hive/host@REALM or impala/host@REALM

String

Specify the principal used by an instance of a service. For example:

  • If you are connecting to a specific server, use: hive/host@REALM or impala/host@REALM

  • If you are connecting to any compliant host (more common for the Snap; see the Use Zookeeper property's description), use: hive/_HOST@REALM or impala/_HOST@REALM

The sketch after this list shows how the Kerberos properties fit together in a JDBC connection.
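To show how these three fields relate, the sketch below performs a keytab-based Kerberos login with Hadoop's UserGroupInformation and then passes the service principal in the JDBC URL. This mirrors what a standalone JDBC client would do; the Snap performs the equivalent steps internally from the account fields, and all hostnames, principals, and paths here are placeholders.

    import java.sql.Connection;
    import java.sql.DriverManager;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.security.UserGroupInformation;

    public class KerberosLoginSketch {
        public static void main(String[] args) throws Exception {
            // Switch the Hadoop security layer to Kerberos.
            Configuration conf = new Configuration();
            conf.set("hadoop.security.authentication", "kerberos");
            UserGroupInformation.setConfiguration(conf);

            // Client Principal and Keytab File, as configured in the account (placeholder values).
            UserGroupInformation.loginUserFromKeytab("hiveclient@EXAMPLE.COM", "/etc/krb5.keytab");

            // Service Principal: _HOST lets the driver substitute the HiveServer2 host it connects to.
            String url = "jdbc:hive2://hive-host.example.com:10000/default;principal=hive/_HOST@EXAMPLE.COM";
            try (Connection conn = DriverManager.getConnection(url)) {
                System.out.println("Kerberos-authenticated connection: " + !conn.isClosed());
            }
        }
    }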

Snap Pack History


Related Content