This adapter sample illustrates the use of the TIBCO StreamBase® Text File Reader Adapter for Apache Hadoop Distributed File System (HDFS) by reading a file and emitting a tuple that contains the file's contents in a string field.
The file used in this sample, MyFile.txt, must be placed on your HDFS file system before the sample can run. You must also open filereader.sbapp, select the Parameters tab, and edit the HDFS_FILE_PATH and HDFS_USER values to match your HDFS setup so the adapter can access the required file.
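If you prefer to stage MyFile.txt on HDFS programmatically rather than with the Hadoop command-line tools, the short Java sketch below uses the Hadoop FileSystem API to copy the file up. The namenode URI, the user name hdfsuser, and the target path /user/hdfsuser/MyFile.txt are placeholders for illustration only; substitute values that match your cluster and the HDFS_FILE_PATH and HDFS_USER parameters you set in filereader.sbapp.

    // Sketch only: stages the sample file on HDFS under an assumed user and path.
    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class StageSampleFile {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Connect as the same user you plan to set in HDFS_USER (placeholder shown).
            FileSystem fs = FileSystem.get(
                    new URI("hdfs://namenode.example.com:8020"), conf, "hdfsuser");

            // Copy the local sample file to the location you plan to set in HDFS_FILE_PATH.
            fs.copyFromLocalFile(
                    new Path("MyFile.txt"),
                    new Path("/user/hdfsuser/MyFile.txt"));
            fs.close();
        }
    }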
- In the Package Explorer, double-click to open the filereader.sbapp application. Make sure the application is the currently active tab in the EventFlow Editor.
- Click the Run button. This opens the SB Test/Debug perspective and starts the application.
- In the Manual Input view, click Send Data to send the default null tuple.
- In the Application Output view, observe tuples emitted on the Status and Data output streams, the latter of which contains the contents of the configured default file, MyFile.txt.
- Press F9 or click the Stop Running Application button.
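To verify independently what the Data tuple should contain, the hedged Java sketch below reads the same file back through the Hadoop FileSystem API and prints it as a single string, mirroring the adapter's string-field output. The URI, user, and path are again illustrative placeholders, not values supplied by the sample.

    // Sketch only: reads the configured file from HDFS into one string.
    import java.io.ByteArrayOutputStream;
    import java.net.URI;
    import java.nio.charset.StandardCharsets;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IOUtils;

    public class ReadSampleFile {
        public static void main(String[] args) throws Exception {
            FileSystem fs = FileSystem.get(
                    new URI("hdfs://namenode.example.com:8020"),
                    new Configuration(), "hdfsuser");

            // Copy the file's bytes into memory, closing both streams when done.
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            IOUtils.copyBytes(fs.open(new Path("/user/hdfsuser/MyFile.txt")), out, 4096, true);

            // This is the same text you should see in the Data tuple's string field.
            System.out.println(new String(out.toByteArray(), StandardCharsets.UTF_8));
            fs.close();
        }
    }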
When you load the sample into StreamBase Studio, Studio copies the sample project's files to your Studio workspace, which is normally part of your home directory, with full access rights.
Important
Load this sample in StreamBase Studio, and thereafter use the Studio workspace copy of the sample to run and test it, even when running from the command prompt.
Using the workspace copy of the sample avoids permission problems. The default workspace location for this sample is:
studio-workspace/sample_adapter_embedded_hdfsfilereader

See Default Installation Directories for the default location of studio-workspace on your system.
In the default TIBCO StreamBase installation, this sample's files are initially installed in:
streambase-install-dir/sample/adapter/embedded/hdfsfilereader

See Default Installation Directories for the default location of streambase-install-dir on your system.