Amazon S3 File System Access via HDFS Adapters

About The Samples

These samples illustrate how to access the Amazon S3 file system using the TIBCO StreamBase® File Writer and File Reader for Apache Hadoop Distributed File System (HDFS).

Importing This Sample into StreamBase Studio

In StreamBase Studio, import this sample with the following steps:

  • From the top menu, select File > Load StreamBase Sample.

  • Type s3 to narrow the list of options.

  • Select S3 file system access via HDFS from the Large Data Storage and Analysis category.

  • Click OK.

StreamBase Studio creates a single project containing the sample files.

Initial Setup

Open the sample application, FileBasic.sbapp or FileAdvanced.sbapp, select the Parameters tab, and edit the parameter values to specify the following (an example value appears after this list):

  • Your current S3 bucket

  • Where you would like to store the sample data
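
For example, if your bucket is named my-bucket and you keep the sample data under a sample folder, the target location is typically written as an s3a URI such as s3a://my-bucket/sample (my-bucket is a placeholder; substitute your own bucket name).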

You must also open the sample S3 configuration file, sample_s3a.conf, and enter the security access keys for your S3 file system. For other ways to authenticate with S3, see Authenticating with S3.
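
The access keys correspond to the standard Hadoop S3A connector properties fs.s3a.access.key and fs.s3a.secret.key. As a minimal illustrative sketch of the equivalent programmatic setup in Java (this is not the adapters' actual code; the bucket name and key values are placeholders, and hadoop-common plus hadoop-aws must be on the classpath):

    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;

    public class S3ConnectionSketch {
        public static void main(String[] args) throws Exception {
            // Standard Hadoop S3A credential properties; sample_s3a.conf
            // supplies the same information to the HDFS adapters.
            Configuration conf = new Configuration();
            conf.set("fs.s3a.access.key", "YOUR_ACCESS_KEY");  // placeholder
            conf.set("fs.s3a.secret.key", "YOUR_SECRET_KEY");  // placeholder

            // "my-bucket" is a placeholder; use your own bucket name.
            try (FileSystem fs = FileSystem.get(URI.create("s3a://my-bucket"), conf)) {
                System.out.println("Connected to " + fs.getUri());
            }
        }
    }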

Running The Basic Sample in StreamBase Studio

  1. In the Package Explorer, double-click to open the FileBasic.sbapp application. Make sure the application is the currently active tab in the EventFlow Editor.

  2. Click the Run button. This opens the SB Test/Debug perspective and starts the application.

  3. In the Manual Input view, switch the Stream to Data, then enter a string value such as test.

  4. Click Send Data to send a data tuple to be written to the file. Repeat for as many lines as you wish.

  5. In the Application Output view, observe tuples emitted on the Status output streams indicating actions performed on the file.

  6. In the Manual Input view, switch the Stream to WriteControl, then enter Close into the Command field.

  7. Click Send Data to send a control tuple, which closes the current file for writing.

  8. In the Manual Input view, switch the Stream to ReadControl, then click Send Data to send a control tuple, which reads the default file.

  9. Press F9 or click the Stop Running Application button.

  10. The sample has now created a file in your S3 file system named sample/Sample.txt, containing the lines of data you submitted. A sketch for reading this file back follows these steps.
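
To confirm the result outside Studio, you can read the file back with the plain Hadoop FileSystem API. The following is a minimal sketch, assuming your credentials are configured as in Initial Setup; the bucket name is a placeholder:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ReadSampleFile {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.set("fs.s3a.access.key", "YOUR_ACCESS_KEY");  // placeholder
            conf.set("fs.s3a.secret.key", "YOUR_SECRET_KEY");  // placeholder

            try (FileSystem fs = FileSystem.get(URI.create("s3a://my-bucket"), conf);
                 BufferedReader reader = new BufferedReader(new InputStreamReader(
                         fs.open(new Path("s3a://my-bucket/sample/Sample.txt"))))) {
                // Print each line the basic sample wrote.
                String line;
                while ((line = reader.readLine()) != null) {
                    System.out.println(line);
                }
            }
        }
    }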

Running The Advanced Sample in StreamBase Studio

  1. In the Package Explorer, double-click to open the FileAdvanced.sbapp application. Make sure the application is the currently active tab in the EventFlow Editor.

  2. Click the Run button. This opens the SB Test/Debug perspective and starts the application.

  3. In the Application Output view, observe tuples emitted on the Status output streams indicating actions performed on the files.

  4. Press F9 or click the Stop Running Application button.

  5. The sample has now created multiple files in your S3 file system (a verification sketch follows this list):

    1. sample/Sample.gz: a GZip-compressed file created from SampleIn.txt.

    2. sample/Sample.bz2: a BZip2-compressed file created from SampleIn.txt.

    3. sample/Sample.zip: a Zip-compressed file created from SampleIn.txt.

    4. sample/SampleOut.txt: an uncompressed file created from SampleIn.txt.
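
To spot-check one of the compressed outputs, you can layer a standard Java decompression stream over the Hadoop input stream. The sketch below reads sample/Sample.gz; as before, the bucket name and credentials are placeholders:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.URI;
    import java.util.zip.GZIPInputStream;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ReadCompressedSample {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.set("fs.s3a.access.key", "YOUR_ACCESS_KEY");  // placeholder
            conf.set("fs.s3a.secret.key", "YOUR_SECRET_KEY");  // placeholder

            try (FileSystem fs = FileSystem.get(URI.create("s3a://my-bucket"), conf);
                 // Wrap the S3 stream in GZIPInputStream to decompress on the fly.
                 BufferedReader reader = new BufferedReader(new InputStreamReader(
                         new GZIPInputStream(
                                 fs.open(new Path("s3a://my-bucket/sample/Sample.gz")))))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    System.out.println(line);
                }
            }
        }
    }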

Sample Location

When you load the sample into StreamBase Studio, Studio copies the sample project's files to your Studio workspace, which is normally part of your home directory, with full access rights.

Important

Load this sample in StreamBase Studio, and thereafter use the Studio workspace copy of the sample to run and test it, even when running from the command prompt.

Using the workspace copy of the sample avoids permission problems. The default workspace location for this sample is:

studio-workspace/sample_adapter_embedded_hdfsS3

See Default Installation Directories for the default location of studio-workspace on your system.