Shared Tables Across Containers Sample

The Shared Tables Across Containers sample application demonstrates how to access the same StreamBase Query Table from modules running in different containers.

This sample includes three peer EventFlow modules; that is, none of the three is a top-level module that references the other two. Instead, the module TableHome.sbapp contains a Query Table, Writer.sbapp writes to that table, and Reader.sbapp reads from it. Each module runs in its own container, so table access always crosses container boundaries.
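
Writer.sbapp and Reader.sbapp each reach the shared table through a local Query Table whose Data Location is Defined by connection path, which resolves at runtime to the table hosted in the default container. As a minimal sketch of the form such a path takes, assuming the shared table in TableHome.sbapp were named Table1 (a hypothetical name; see TableHome.sbapp for the actual one), the container-qualified path would be:

  default.Table1

This container-name.entity-name form is the same addressing used for cross-container streams, such as reader.OutputStream later in this sample.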

This Sample's Files

The cross-container shared table sample includes the following files:

cross-table.sbdeploy
    A StreamBase deployment file that specifies loading each EventFlow module into its own named container. Use this file to run this sample from StreamBase Studio and from the command line.

TableHome.sbapp
    An EventFlow module consisting of a single element, a shared Query Table. The sbdeploy file specifies running this module in the container named default.

Writer.sbapp
    An EventFlow module that writes to the shared Query Table defined in TableHome.sbapp. This module includes a Query Table with a Data Location setting of Defined by connection path. This module runs in a container named writer.

fswrite.sbfs
    A StreamBase feed simulation file used by the embedded Feed Simulation Adapter in the Writer.sbapp module. The feed simulation generates a stream of transaction entries for products with one of seven product codes (SKUs) across four possible store locations.

Reader.sbapp
    An EventFlow module that reads from the shared Query Table defined in TableHome.sbapp. This module also includes a Query Table with its Data Location set as above. This module runs in a container named reader.

fsread.sbfs
    A StreamBase feed simulation file used by the embedded Feed Simulation Adapter in the Reader.sbapp module. The feed simulation generates a stream of random reads of the Query Table, using one of the seven valid SKU product codes as the row key. Each read returns the last-written transaction record for the product code entered.

This Sample's Launch Configuration

The Shared Tables Across Containers sample's launch configuration maps each application module to its container as follows (a sketch of the corresponding deployment file follows the table):

Application Module  Container
TableHome.sbapp     default
Writer.sbapp        writer
Reader.sbapp        reader
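
These mappings are expressed in cross-table.sbdeploy. The following is a rough sketch of what such a deployment file can look like; the element and attribute names here are illustrative only, so treat the sample's actual cross-table.sbdeploy as the authoritative syntax:

  <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
  <deploy xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:noNamespaceSchemaLocation="http://www.streambase.com/schemas/sbdeploy.xsd">
    <runtime>
      <!-- One application entry per module, each assigned its own named container -->
      <application container="default" module="TableHome.sbapp"/>
      <application container="writer" module="Writer.sbapp"/>
      <application container="reader" module="Reader.sbapp"/>
    </runtime>
  </deploy>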

Running This Sample in StreamBase Studio

  1. In the Package Explorer, open the sample_cross-containers-table project folder.

  2. Right-click cross-table.sbdeploy. In the resulting context menu, select Run As, then select cross-containers-table.

    This opens the StreamBase Test/Debug perspective and starts the sample with three containers in one StreamBase Server instance.

    The Feed Simulation Adapter in the Writer.sbapp module automatically starts writing transaction records to the shared Query Table resident in the TableHome.sbapp module. Another Feed Simulation Adapter in the Reader.sbapp module sends one of the seven valid SKU values, randomly selected, to a Query Read component that reads from the same Query Table. Each SKU value sent in a read query emits the last-written transaction record for the specified SKU.

  3. Observe transaction data like the following streaming in the Application Output view at a rate of about one entry per second.

    Time      Output Stream        Fields
    17:08:57  reader.OutputStream  sku=EFGH, category=electronics, storeLocation=NYC, quantityRemaining=3, lastSoldPrice=19.5
    17:08:58  reader.OutputStream  sku=WXYZ, category=snacks, storeLocation=PHX, quantityRemaining=6, lastSoldPrice=18.5
  4. Notice that there are no entries for the writer.out:FeedSimulation_1 stream except at startup. The writes performed by the writer are not logged to an output stream, but you can see evidence that they are happening by observing the changing data emitted on the reader.OutputStream stream.

  5. [Optional] You can manually enter an sku value for Reader.sbapp in the Manual Input view:

    1. In the Application Output view, select reader.OutputStream in the Stream drop-down list.

    2. In the Manual Input view, select reader.FSControl from the Stream drop-down list.

    3. Enter the tuple stop, null and click Send Data. This stops the feed simulation-driven stream of reads against the shared Query Table in Reader.sbapp; writes to the Query Table from the Writer.sbapp module continue unchanged. (Both FSControl control tuples are summarized after these sub-steps.)

    4. In the Manual Input view, select reader.ManualFeed from the Stream drop-down list.

    5. Enter one of the following case-sensitive values in the sku field:

      SKU Literal  Represents
      ABCD         SKU in the toys category
      EFGH         SKU in the electronics category
      LMNO         SKU in the books category
      QRST         SKU in the greeting cards category
      WXYZ         SKU in the snacks category
      IJKL         SKU in the sporting goods category
      PQRS         SKU in the games category
    6. In the Manual Input view, click Send Data. Observe that an entry appears in the Application Output view for the sku you entered.

    7. If desired, enter start, null in the reader.FSControl stream to restart the feed-generated reads of the Query Table.
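
    For reference, the reader.FSControl stream accepts two control tuples:

      stop, null    pauses the feed simulation-driven reads of the Query Table
      start, null   resumes them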

  6. When done, press F9 or click the Stop Running Application button.

Running This Sample at the Command Prompt

To run this sample from a Linux or OS X terminal prompt, or from a StreamBase Command Prompt on Windows, use the following steps.

  1. Open three terminal windows or StreamBase Command Prompts. In each window, navigate to the directory in your Studio workspace that contains your copy of this sample.

  2. In window 1, run this sample's three applications, each in its own container, by launching the provided sbdeploy file:

    sbd cross-table.sbdeploy
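
    Optionally, to confirm that all three containers are up, query the server from another window with the sbc client. The exact entity-type argument can vary by StreamBase release, so this is a best-effort form; consult the sbc command's help if it is rejected:

    sbc list container

    The listing should include the default, writer, and reader containers.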

  3. In window 2, enter the following command to dequeue tuples from the output stream of Reader.sbapp. Tuples are emitted right away because the Reader.sbapp module has an auto-starting Feed Simulation Adapter that issues a random sequence of valid sku values.

    sbc dequeue reader.OutputStream
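
    Dequeued tuples print in CSV form, one tuple per line at about one per second, resembling the following (the values vary with the feed simulation's random data):

    EFGH,electronics,NYC,3,19.5
    WXYZ,snacks,PHX,6,18.5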

  4. To stop the automatic read stream and enter sku values manually, follow these sub-steps:

    1. In window 2, press Ctrl+C to stop the dequeue process.

    2. In window 2, enter the following command to open the control stream in Reader.sbapp:

      sbc enq reader.FSControl

    3. In window 2, enter the following command to stop the automatic stream of query reads:

      stop, null

    4. In window 2, press Ctrl+C to close the enqueuer session.
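
      Taken together, sub-steps 2 through 4 form this short window 2 session, with the stop tuple typed as CSV on the enqueuer's standard input (^C denotes Ctrl+C):

      sbc enq reader.FSControl
      stop, null
      ^C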

    5. In window 2, restart the dequeue on the Reader module's OutputStream, using the following command. This time, no tuples emit automatically.

      sbc dequeue reader.OutputStream

    6. In window 3, open an enqueue stream for manual input:

      sbc enq reader.ManualFeed

    7. In window 3, enter one of the following valid sku values, one per line. The values are case-sensitive:

      SKU Literal  Represents
      ABCD         SKU in the toys category
      EFGH         SKU in the electronics category
      LMNO         SKU in the books category
      QRST         SKU in the greeting cards category
      WXYZ         SKU in the snacks category
      IJKL         SKU in the sporting goods category
      PQRS         SKU in the games category
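
      For example, typing ABCD in window 3 yields a tuple in window 2 resembling the following; the store, quantity, and price fields reflect whatever the writer last recorded for that sku, so your values will differ:

      ABCD,toys,NYC,5,12.0
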
    8. In window 2, observe the last-written record for the sku value entered, read from the shared Query Table.

  5. In windows 2 and 3, press Ctrl+C to close the enqueuer and dequeuer. Then run sbadmin shutdown to close the server.

Importing This Sample into StreamBase Studio

In StreamBase Studio, import this sample with the following steps:

  • From the top menu, select File > Load StreamBase Sample.

  • In the search field, type cross to narrow the list of samples.

  • Select Create a shared table across containers from the Data Constructs and Operators category.

  • Click OK.

StreamBase Studio creates a single project containing this sample's files.

Sample Location

When you load the sample into StreamBase Studio, Studio copies the sample project's files to your Studio workspace, which is normally part of your home directory, with full access rights.

Important

Load this sample in StreamBase Studio, and thereafter use the Studio workspace copy of the sample to run and test it, even when running from the command prompt.

Using the workspace copy of the sample avoids permission problems. The default workspace location for this sample is:

studio-workspace/cross-container-table

See Default Installation Directories for the default location of studio-workspace on your system.

In the default TIBCO StreamBase installation, this sample's files are initially installed in:

streambase-install-dir/sample/cross-container-table

See Default Installation Directories for the default location of streambase-install-dir on your system.