Configuring eXtreme Execute Accounts


You must configure eXtreme Execute accounts to enable the PySpark Script and JAR Submit Snaps to connect to Azure Databricks or AWS EMR clusters.

The eXtreme Execute account has the following account types:

See Accounts to learn more about SnapLogic Accounts.

Known Issue

  • When ELT and Spark SQL 2.x Snap account credentials (such as user names, passwords, client secrets, auth codes and tokens, secret keys, and keystores) are auto-filled by the Google Chrome browser, the accounts, and hence the Pipelines, fail. This happens because the browser overwrites the field values with its own encrypted values, which the SnapLogic Platform cannot read. SnapLogic recommends that you do not auto-save your Snap account credentials in the Chrome browser. 
  • Ensure that you delete any credentials that the browser has already saved, and then perform ONE of the following actions:
    • Option 1: Click the icon that appears in the address bar after you submit your login credentials, and then click Never.
    • Option 2: Disable the Offer to save passwords option at chrome://settings/passwords while working with your SnapLogic Pipelines. If you disable this option, your Chrome browser will not offer to save your passwords on any other website either.

Snap Compatibility

Configuring eXtreme Execute Accounts

You can configure your eXtreme Execute account in the SnapLogic UI using either the Designer or Manager.

Using SnapLogic Designer

  1. Drag the PySpark Script or JAR Submit Snap to the Canvas.

  2. Click the Snap to open its settings.

  3. Click the Account tab.

  4. Click the Account Reference drop-down list and select an existing account, or click Add Account to create a new account.

  5. Select the Location in which you want to create the account, select the Account Type, and click Continue to view details that you must provide for the account that you want to create.

  6. Enter the required account details. For detailed guidance on the settings associated with each account type, see the corresponding account type page.
  7. Optionally, enter additional information on this account in the Notes field of the Info tab.

  8. Click Validate to verify the account information if the account type you are creating supports validation. 

  9. Click Apply to complete configuring the account.

Using SnapLogic Manager

  1. Navigate to the project with which you want to associate the new account and click  > Account > eXtreme Execute, followed by the appropriate account type.
    The settings pop-up associated with the selected account type appears.

  2. Enter the required account details in the fields provided.

  3. Optionally, enter additional information on this account in the Notes field of the Info tab.

  4. Click Validate to verify the account information. 

  5. Click Apply to complete configuring the account to the project.

Avoid updating account credentials while Pipelines that use the account are executing. Doing so can cause unexpected results, including locking your AWS or Azure Databricks account.

See Also

Snap Pack History


4.27 (main12833)

  • No updates made.

4.26 (main11181)

  • No updates made.

4.25 (main9554)

  • No updates made.

4.24 (main8556)

4.23 (main7430)

  • Accounts support validation. You can click Validate in the account settings dialog to verify that your account is configured correctly. 

4.22 (main6403)

  • No updates made.

4.21 Patch 421patches5928

  • Adds Hierarchical Data Format v5 (HDF5) support in AWS EMR. With this enhancement, you can read HDF5 files and parse them into JSON files for further data processing. See Enabling HDF5 Support for details.
  • Adds support for Python virtual environments to the PySpark Script Snap to enable reading HDF5 files in the S3 bucket. You can specify the path to the virtual environment's ZIP file in the Snap's settings.
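To illustrate the kind of processing this enables, the sketch below reads every dataset in an HDF5 file and serializes the contents as JSON. It is a minimal sketch, not SnapLogic's implementation: it assumes the h5py and numpy libraries are available on the cluster (for example, packaged in the virtual environment ZIP), and the file path and dataset names are illustrative.

```python
# Hedged sketch: flatten the datasets of an HDF5 file into a JSON string.
# Assumes h5py is available; the path and dataset names are illustrative.
import json

def datasets_to_json(datasets):
    """Serialize a {dataset name: list of values} mapping as JSON."""
    return json.dumps(
        {name: list(values) for name, values in datasets.items()},
        sort_keys=True,
    )

def read_hdf5(path):
    """Collect every dataset in an HDF5 file into a plain dict."""
    import h5py  # imported lazily; assumed present on the EMR cluster

    out = {}
    with h5py.File(path, "r") as f:
        def collect(name, obj):
            # visititems walks groups recursively; keep only datasets.
            if isinstance(obj, h5py.Dataset):
                out[name] = obj[()].tolist()
        f.visititems(collect)
    return out

# On the cluster, something like (illustrative local path):
#   print(datasets_to_json(read_hdf5("/mnt/data/sample.h5")))
```

The pure serialization step is kept separate from the h5py read so the JSON shape can be checked without an HDF5 file on hand.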

4.21 Patch 421patches5851

  • Optimizes Spark engine execution on AWS EMR, requiring fewer compute resources.

4.21 (snapsmrc542)

  • No updates made.

4.20 (snapsmrc535)

  • Introduced a new account type, Azure Databricks Account. This enhancement makes account configuration mandatory for the PySpark Script and JAR Submit Snaps.
  • Enhanced the PySpark Script Snap to display the Pipeline Execution Statistics after a Pipeline with the Snap executes successfully.

4.19 (snapsmrc528)

  • No updates made.

4.18 (snapsmrc523)

  • No updates made.

4.17 Patch ALL7402

  • Pushed automatic rebuild of the latest version of each Snap Pack to SnapLogic UAT and Elastic servers.

4.17 (snapsmrc515)

  • No updates made. Automatic rebuild with a platform release.

4.16 (snapsmrc508)

  • New Snap Pack. Execute Java Spark and PySpark applications through the SnapLogic platform. Snaps in this Snap Pack are:
    • JAR Submit: Upload your existing Spark Java JAR programs as eXtreme Pipelines.
    • PySpark Script: Upload your existing PySpark scripts as eXtreme Pipelines.
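As a rough sketch, a script uploaded through the PySpark Script Snap is an ordinary Spark driver program. The example below is a minimal, hypothetical word count; the app name and S3 input path are assumptions, and the Spark master is supplied by the cluster rather than the script.

```python
# Minimal sketch of a PySpark driver script of the kind the PySpark Script
# Snap uploads. The app name and input path are illustrative assumptions.
from operator import add

def count_words(lines):
    """Pure-Python word count, mirroring the Spark job below."""
    counts = {}
    for line in lines:
        for word in line.split():
            counts[word] = counts.get(word, 0) + 1
    return counts

def main():
    # Imported here so the helper above can be used without Spark installed.
    from pyspark import SparkContext

    sc = SparkContext(appName="WordCount")            # master set by the cluster
    pairs = (sc.textFile("s3://my-bucket/input.txt")  # illustrative input path
               .flatMap(lambda line: line.split())
               .map(lambda word: (word, 1))
               .reduceByKey(add))
    for word, n in pairs.collect():
        print(word, n)
    sc.stop()

# A cluster entry point would call main(); it is not invoked here so the
# pure helper can be exercised locally.
```

The same transformation is written twice on purpose: `count_words` expresses the logic in plain Python, and `main` expresses it as the equivalent flatMap/map/reduceByKey chain that runs distributed on the cluster.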