HDFS Writer

Overview

This Snap reads a binary data stream from its input view and writes a file to HDFS (Hadoop Distributed File System). It also helps you pick a file by suggesting a list of directories and files. For the HDFS protocol, use a SnapLogic on-premises Groundplex and ensure that its instance is within the Hadoop cluster and that SSH authentication has already been established. The Snap also supports writing to a Kerberized cluster through the HDFS protocol. This Snap supports the HDFS, ADL (Azure Data Lake), ABFS (Azure Data Lake Storage Gen 2), and WASB (Azure Storage) protocols. HDFS 2.4.0 is supported for the HDFS protocol. The Snap also supports writing to HDFS encryption zones.

Snap Type

The HDFS Writer Snap is a Write-type Snap.

Prerequisites

None.

Support for Ultra Pipelines

Works in Ultra Pipelines.

Limitations

  • Append (File action) is supported only for the ADL protocol.

  • File names with the following special characters are not supported in the HDFS Writer Snap: '+', '?', '/', ':'.

Snap Views

Type: Input

Format: Binary

Number of views: Min: 1, Max: 1

Examples of upstream Snaps: CSV Formatter, JSON Formatter, XML Formatter

Type: Output

Format: Document

Number of views: Min: 0, Max: 1

Examples of downstream Snaps: Mapper, File Reader

The following is an example of the output document map data:

{
    "filename": "hdfs://ec2-54-198-212-134.compute-1.amazonaws.com:8020/user/john/input/sample.csv",
    "fileAction": "overwritten"
}

The value of the "fileAction" field can be "overwritten", "created", or "ignored". The value "ignored" indicates that the Snap did not overwrite the existing file because the File action property is set to IGNORE.
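For context only, a downstream consumer outside SnapLogic could inspect this output document with any JSON library. The following minimal Java sketch (an illustration, not part of the Snap) uses Jackson to read the "fileAction" field from a document like the one above.

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class FileActionField {
    public static void main(String[] args) throws Exception {
        // Document shaped like the example above.
        String doc = "{\"filename\": \"hdfs://ec2-54-198-212-134.compute-1.amazonaws.com:8020/user/john/input/sample.csv\","
                   + " \"fileAction\": \"overwritten\"}";
        JsonNode node = new ObjectMapper().readTree(doc);
        // Prints "overwritten", "created", or "ignored", as described above.
        System.out.println(node.path("fileAction").asText());
    }
}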

Type: Error

Error handling is a generic way to handle errors without losing data or failing the Snap execution. You can handle the errors that the Snap might encounter while running the Pipeline by choosing one of the following options from the When errors occur list under the Views tab:

  • Stop Pipeline Execution: Stops the current pipeline execution if the Snap encounters an error.

  • Discard Error Data and Continue: Ignores the error, discards that record, and continues with the remaining records.

  • Route Error Data to Error View: Routes the error data to an error view without stopping the Snap execution.

Learn more about Error handling in Pipelines.

Snap Settings

  • Asterisk (*): Indicates a mandatory field.

  • Suggestion icon: Indicates a list that is dynamically populated based on the configuration.

  • Expression icon: Indicates the value is an expression (if enabled) or a static value (if disabled). Learn more about Using Expressions in SnapLogic.

  • Add icon: Indicates that you can add fields in the field set.

  • Remove icon: Indicates that you can remove fields from the field set.

  • Upload icon: Indicates that you can upload files.

Field

Field Types

Description

Label*


Default Value: HDFS Writer
Example: HDFS Writer

String

Specify a unique name for the Snap.

Directory


Default Value: hdfs://<hostname>:<port>/
Example:

  • hdfs://ec2-54-198-212-134.compute-1.amazonaws.com:8020/user/john/input/

  • wasb:///snaplogic/testDir/

  • wasbs:///snaplogic/testDir/

  • $dirname

  • adl://snapqa/ 

  • abfs(s):///filesystem2/dir1

  • abfs(s)://filesystem2@snaplogicaccount.dfs.core.windows.net/dir1

String/Expression/Suggestion

Specify the URL of the directory to write to. The URL must start with the appropriate file protocol, in one of the following formats:

  • hdfs://<hostname>:<port>/<path to directory>/

  • wasb:///<container name>/<path to directory>/

  • wasbs:///<container name>/<path to directory>/

  • adl://<container name>/<path to directory>/ 

  • abfs(s):///filesystem/<path>/

  • abfs(s)://filesystem@accountname.endpoint/<path>

The Directory property is not used in Pipeline execution or preview; it is used only in the Suggest operation. When you click the Suggestion icon, the Snap displays a list of subdirectories under the given directory. It generates the list by applying the value of the File filter property.

SnapLogic automatically appends azuredatalakestore.net to the store name you specify when using Azure Data Lake; therefore, you do not have to add azuredatalakestore.net to the URI while specifying the directory.
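Each of these directory values is a standard URI made up of a scheme (the protocol), an optional authority (host and port, or filesystem and account), and a path. As a minimal illustration only (not part of the Snap), the following Java sketch decomposes the HDFS example above with java.net.URI.

import java.net.URI;

public class DirectoryUriParts {
    public static void main(String[] args) {
        // Example directory value from the table above.
        URI dir = URI.create("hdfs://ec2-54-198-212-134.compute-1.amazonaws.com:8020/user/john/input/");
        System.out.println("scheme: " + dir.getScheme()); // hdfs
        System.out.println("host:   " + dir.getHost());   // ec2-54-198-212-134.compute-1.amazonaws.com
        System.out.println("port:   " + dir.getPort());   // 8020
        System.out.println("path:   " + dir.getPath());   // /user/john/input/
    }
}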

File filter


Default Value: *



String/Expression

Specify the glob filter pattern.

Use glob patterns to display a list of directories or files when you click the Suggest icon for the Directory or File property. A complete glob pattern is formed by combining the value of the Directory property with the value of the File filter property. If the value of the Directory property does not end with "/", the Snap appends one so that the filter is applied to the directory specified by the Directory property.

 

The following rules are used to interpret glob patterns:

The * character matches zero or more characters of a name component without crossing directory boundaries. For example, the *.csv pattern matches a path that represents a file name ending in .csv, and *.* matches all file names that contain a period.

The ** characters match zero or more characters across directories; therefore, it matches all files or directories in the current directory and in its subdirectories. For example, /home/** matches all files and directories in the /home/ directory.

The ? character matches exactly one character of a name component. For example, 'foo.?' matches file names that start with 'foo.' and are followed by a single-character extension.

The \ character is used to escape characters that would otherwise be interpreted as special characters. The expression \\ matches a single backslash, and \{ matches a left brace, for example.

The ! character is used to exclude matching files from the output. 

The [ ] characters form a bracket expression that matches a single character of a name component out of a set of characters. For example, '[abc]' matches 'a', 'b', or 'c'. The hyphen (-) may be used to specify a range, so '[a-z]' specifies a range that matches from 'a' to 'z' (inclusive). These forms can be mixed, so '[abce-g]' matches 'a', 'b', 'c', 'e', 'f' or 'g'. If the character after the [ is a ! then it is used for negation, so '[!a-c]' matches any character except 'a', 'b', or 'c'.

Within a bracket expression, the '*', '?', and '\' characters match themselves. The '-' character matches itself if it is the first character within the brackets, or the first character after the !, if negating.

The '{ }' characters form a group of sub-patterns; the group matches if any sub-pattern in the group matches. The ',' character is used to separate sub-patterns. Groups cannot be nested. For example, the pattern '*.{csv,json}' matches file names ending with '.csv' or '.json'.

Leading dot characters in a file name are treated as regular characters in match operations. For example, the '*' glob pattern matches file name ".login".

All other characters match themselves.

Examples:

'*.csv' matches all files with a csv extension in the current directory only.

'**.csv' matches all files with a csv extension in the current directory and in all its subdirectories.

*[!{.pdf,.tmp}] excludes all files with the extension PDF or TMP.
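The rules above follow standard glob syntax. As a minimal illustration of that syntax (not the Snap's internal matching code), the following Java sketch checks some of the example patterns with the java.nio glob matcher; note that the '!' exclusion form described above is specific to the Snap and is not part of standard java.nio glob syntax.

import java.nio.file.FileSystems;
import java.nio.file.Path;
import java.nio.file.PathMatcher;

public class GlobFilterExamples {
    public static void main(String[] args) {
        // '*.csv' matches csv files in the current directory only.
        PathMatcher csvOnly = FileSystems.getDefault().getPathMatcher("glob:*.csv");
        System.out.println(csvOnly.matches(Path.of("sample.csv")));          // true
        System.out.println(csvOnly.matches(Path.of("tmp/another.csv")));     // false, '*' does not cross directories

        // '**.csv' also matches csv files in subdirectories.
        PathMatcher csvAnywhere = FileSystems.getDefault().getPathMatcher("glob:**.csv");
        System.out.println(csvAnywhere.matches(Path.of("tmp/another.csv"))); // true

        // '*.{csv,json}' matches either extension.
        PathMatcher csvOrJson = FileSystems.getDefault().getPathMatcher("glob:*.{csv,json}");
        System.out.println(csvOrJson.matches(Path.of("data.json")));         // true
    }
}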

 

File

 

Default Value: N/A
Example:

  • sample.csv

  • tmp/another.csv

  • $filename

String/Expression/Suggestion

Specify the filename or a relative path to a file under the directory given in the Directory property. It should not start with the URL separator "/". The File property can be a JavaScript expression, which is evaluated with values from the input view document. When you click the Suggest icon, the Snap displays a list of regular files under the directory in the Directory property. It generates the list by applying the value of the File filter property.


Flush interval (kB)

Default Value: -1
Example: 0

Interval

Specify the flush interval in kilobytes. The Snap flushes the output stream each time the specified amount of data has been written to the target file server during the file upload.
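In other words, the setting behaves like flushing an output stream after every fixed amount of data. The following hypothetical Java sketch illustrates that idea only (it is not the Snap's code); flushKb stands in for the Flush interval (kB) value.

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class IntervalFlushCopy {
    // Copy the input stream, flushing the output every flushKb kilobytes.
    static void copyWithIntervalFlush(InputStream in, OutputStream out, int flushKb) throws IOException {
        long threshold = flushKb * 1024L;
        long bytesSinceFlush = 0;
        byte[] buffer = new byte[8192];
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
            bytesSinceFlush += read;
            if (flushKb > 0 && bytesSinceFlush >= threshold) {
                out.flush();            // push buffered bytes to the target
                bytesSinceFlush = 0;
            }
        }
        out.flush();                    // final flush at the end of the stream
    }
}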

Number Of Retries

Default Value: 0
Example: 1

Integer/Expression

Specify the maximum number of attempts to be made to receive a response. 

Retry Interval (seconds)

Default Value: 1
Example: 5

Integer/Expression

Specify the time interval between two successive retry requests. A retry happens only when the previous attempt resulted in an exception.
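Taken together, Number Of Retries and Retry Interval describe a retry-on-exception loop. The following hypothetical Java sketch illustrates that pattern (it is not the Snap's implementation); maxRetries and retryIntervalSeconds stand in for the two settings.

public class RetrySketch {
    // Run a write attempt, retrying up to maxRetries times and waiting
    // retryIntervalSeconds between attempts; a retry happens only after
    // the previous attempt threw an exception.
    static void writeWithRetries(Runnable writeAttempt, int maxRetries, int retryIntervalSeconds)
            throws InterruptedException {
        for (int attempt = 0; ; attempt++) {
            try {
                writeAttempt.run();
                return;                                  // success, stop retrying
            } catch (RuntimeException e) {
                if (attempt >= maxRetries) {
                    throw e;                             // retries exhausted, surface the error
                }
                Thread.sleep(retryIntervalSeconds * 1000L);
            }
        }
    }
}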

File action*


Default Value: Overwrite
Example: Append 



Dropdown list

Select an action to perform if the specified file already exists:

  • Overwrite - The Snap writes the file without first checking for the file's existence (for better performance), and the "fileAction" field is "overwritten" in the output view data.

  • Append - The Snap appends records in the incoming documents to the existing file.

  • Ignore - If the file already exists, the Snap does not throw an exception and does not overwrite the file, but writes an output document indicating that the file has been 'ignored'.

  • Error - If the file already exists, the Snap fails and the error appears in the Pipeline Run Log.
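For readers familiar with the Hadoop client API, the four actions map roughly onto the following sketch against org.apache.hadoop.fs.FileSystem. It is an illustration of the semantics under that assumption, not the Snap's source, and requires the hadoop-client dependency.

import java.io.IOException;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class FileActionSketch {
    // Illustrative mapping of the File action options onto Hadoop FileSystem calls.
    static void apply(FileSystem fs, Path target, String fileAction) throws IOException {
        switch (fileAction) {
            case "Overwrite":
                fs.create(target, true).close();         // write without checking existence first
                break;
            case "Append":
                fs.append(target).close();               // add to the existing file (ADL only in this Snap)
                break;
            case "Ignore":
                if (!fs.exists(target)) {
                    fs.create(target, false).close();    // create only when absent; otherwise do nothing
                }
                break;
            case "Error":
                if (fs.exists(target)) {
                    throw new IOException("File already exists: " + target);
                }
                fs.create(target, false).close();
                break;
        }
    }
}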

File permissions for various users

Use this field set to select the user and the desired file permissions.

User type

Default Value: N/A
Example:  owner, group, others

String/Expression/Suggestion

Specify the user type: 'owner', 'group', or 'others'. Each row can have only one user type, and each user type should appear only once. Select a value from the suggested list.

 

File permissions

Default Value: N/A
Example:  read, write, execute, read+write, read+write+execute

String/Expression/Suggestion

Specify any combination of read, write, and execute, separated by the '+' character. Select a value from the suggested list.
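Each row pairs a user type (owner, group, or others) with a '+'-separated combination of read, write, and execute, which together describe a standard rwx permission set. The following hypothetical Java sketch only illustrates that notation; it is not the Snap's code.

import java.util.LinkedHashSet;
import java.util.Set;

public class PermissionCombination {
    // Convert a '+'-separated combination such as "read+write" into rwx form.
    static String toRwx(String combination) {
        Set<String> parts = new LinkedHashSet<>();
        for (String part : combination.split("\\+")) {
            parts.add(part.trim());
        }
        return (parts.contains("read") ? "r" : "-")
             + (parts.contains("write") ? "w" : "-")
             + (parts.contains("execute") ? "x" : "-");
    }

    public static void main(String[] args) {
        // owner: read+write, group: read, others: none -> "rw-r-----"
        System.out.println(toRwx("read+write") + toRwx("read") + "---");
    }
}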

 

User Impersonation


Default Value: Deselected

Checkbox

Select this check box to enable user impersonation.

Output for each file written


Default Value: Deselected

Checkbox

Select this checkbox to produce a separate output document for each file written. If the Snap receives multiple binary inputs and the File property is an expression that evaluates dynamically to a filename (for example, by using the Content-Location field from the input metadata), each binary input can be written to a different target file.

By default, the Snap produces only one output document, with a filename that corresponds to the last file written.

 

Snap Execution


Default Value: Validate & Execute
Example: Execute only

Dropdown list

Select one of the following Snap execution modes: Validate & Execute, Execute only, or Disabled.

Troubleshooting


Snap Pack History

Hadoop Snap Pack