This adapter sample illustrates the use of the TIBCO StreamBase® Text File Reader Adapter for Apache Hadoop Distributed File System (HDFS) by reading a file and emitting a tuple that contains the file's contents in a string field.
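Conceptually, the adapter reads a whole file and emits one tuple whose string field holds the file's contents. As a rough local analogy (plain Python, reading from the local filesystem rather than HDFS; the field name `FileContents` is illustrative, not the adapter's actual schema):

```python
# Minimal local analogy of the adapter's behavior (not the adapter itself):
# read an entire file and produce one "tuple" (here a dict) whose string
# field holds the file's contents. The field name is a made-up placeholder.

def read_file_as_tuple(path):
    with open(path, encoding="utf-8") as f:
        contents = f.read()
    return {"FileContents": contents}
```

The real adapter performs the equivalent read against HDFS and emits the result as a StreamBase tuple on its output stream.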
The MyFile.txt file used in this sample must be placed on your HDFS file system before the sample can run.
You must also open the filereader.sbapp file in the src/main/eventflow/ folder, select the Parameters tab, and edit the HDFS_USER value to match your HDFS setup and to allow access to the required files.
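Assuming a working Hadoop client is on your PATH and your HDFS home directory follows the common /user/&lt;your-user&gt; layout (both assumptions; adjust for your cluster), the sample file could be copied into place with the standard hdfs dfs commands:

```shell
# Create your HDFS home directory if it does not already exist
# (the /user/<your-user> layout is an assumption; adjust for your cluster).
hdfs dfs -mkdir -p /user/$(whoami)

# Copy the sample input file from the local filesystem into HDFS.
hdfs dfs -put MyFile.txt /user/$(whoami)/MyFile.txt

# Verify the file is in place and readable.
hdfs dfs -cat /user/$(whoami)/MyFile.txt
```

These commands require a running HDFS cluster and appropriate permissions, so they are shown here only as a setup sketch.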
In StreamBase Studio, import this sample with the following steps:
From the top-level menu, select File>Import Samples and Community Content.
Enter hdfs to narrow the list of options.
Select hdfsfilereader from the Large Data Storage and Analysis category.
StreamBase Studio creates a project for this sample.
In the Project Explorer view, open the sample you just loaded.
If you see red marks on a project folder, wait a moment for the project to load its features.
If the red marks do not resolve themselves after a minute, select the project, right-click, and select Maven>Update Project from the context menu.
Open the filereader.sbapp file and click the Run button. This opens the SB Test/Debug perspective and starts the module.
In the Manual Input view, click Send Data to send the default tuple.
In the Output Streams view, observe tuples emitted on the Status and Data output streams, the latter of which contains the contents of the configured default file, MyFile.txt.
Press F9 or click the Terminate EventFlow Fragment button.
When you load the sample into StreamBase Studio, Studio copies the sample project's files to your Studio workspace, which is normally part of your home directory, with full access rights.
Load this sample in StreamBase Studio, and thereafter use the Studio workspace copy of the sample to run and test it, even when running from the command prompt.
Using the workspace copy of the sample avoids permission problems. The default workspace location for this sample is under your studio-workspace directory; see Default Installation Directories for the default location of studio-workspace on your system.