The Shared Tables Across Containers sample application demonstrates how to use the same StreamBase Query Table from different modules in different containers. This sample includes three peer EventFlow modules; that is, none of the three is a top-level module that references the other two. Instead, the module TableHome.sbapp contains a Query Table, Writer.sbapp writes to that table, and Reader.sbapp reads from it. Each module runs in its own container, so table access is always across container boundaries.
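Because each module runs in its own container, every runtime entity in this sample is addressed by a container-qualified name (container.entity). The procedures in this document rely on that naming; for example, the command-line steps later on dequeue from and enqueue to the reader container's streams like this:

    sbc dequeue reader.OutputStream
    sbc enq reader.ManualFeed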
The cross-container shared table sample includes the following files:
| File | Purpose |
|---|---|
| cross-table.sbdeploy | A StreamBase deployment file that specifies loading each EventFlow module into its own named container. Use this file to run this sample from StreamBase Studio and from the command line. |
| TableHome.sbapp | An EventFlow module consisting of a single element, a shared Query Table. The sbdeploy file specifies running this module in the container named default. |
| Writer.sbapp | An EventFlow module that performs writes to the shared Query Table defined in TableHome.sbapp. This module includes a Query Table with a Data Location setting of Defined by connection path. This module runs in a container named writer. |
| fswrite.sbfs | A StreamBase feed simulation file used by the embedded feed simulation adapter in the Writer.sbapp module. The feed simulation generates a stream of transaction entries for products with one of seven product codes (SKUs) across four possible store locations. |
| Reader.sbapp | An EventFlow module that performs reads on the shared Query Table defined in TableHome.sbapp. This module also includes a Query Table with a Data Location setting as above. This module runs in a container named reader. |
| fsread.sbfs | A StreamBase feed simulation file used by the embedded feed simulation adapter in the Reader.sbapp module. The feed simulation generates a stream of random reads of the Query Table using one of the seven valid SKU product codes as row keys. This returns the last-written transaction record for the product code entered. |
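The transaction records generated by fswrite.sbfs, and returned by the reader, carry the fields shown in the sample output later in this document: sku, category, storeLocation, quantityRemaining, and lastSoldPrice. For reference, one such record written in the comma-separated tuple form that the sbc client uses would look roughly like the following; the values are copied from the sample output shown later, and the field order is an assumption based on that output:

    EFGH,electronics,NYC,3,19.5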
The Shared Tables Across Containers sample's launch configuration includes the following application module to container mappings:
| Application Module | Container |
|---|---|
| TableHome.sbapp | default |
| Writer.sbapp | writer |
| Reader.sbapp | reader |
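These module-to-container assignments are declared in cross-table.sbdeploy, so they apply whether you launch the sample from StreamBase Studio or from the command line. From a StreamBase Command Prompt, the entire three-container configuration starts with the single command used in the command-line procedure below:

    sbd cross-table.sbdeploy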
To run this sample in StreamBase Studio, follow these steps:

- In the Package Explorer, open the sample_cross-containers-table project folder.
- Right-click cross-table.sbdeploy. In the resulting context menu, select Run As, then select the StreamBase launch item. This opens the StreamBase Test/Debug perspective and starts the sample with three containers in one StreamBase Server instance.
  The Feed Simulation Adapter in the Writer.sbapp module automatically starts writing transaction records to the shared Query Table resident in the TableHome.sbapp module. Another Feed Simulation Adapter in the Reader.sbapp module sends one of the seven valid SKU values, randomly selected, to a Query Read component that reads from the same Query Table. Each SKU value sent in a read query emits the last-written transaction record for the specified SKU.
- Observe transaction data like the following in the Application Output view, streaming at a rate of about one entry per second.

  | Time | Output Stream | Fields |
  |---|---|---|
  | 17:08:57 | reader.OutputStream | sku=EFGH, category=electronics, storeLocation=NYC, quantityRemaining=3, lastSoldPrice=19.5 |
  | 17:08:58 | reader.OutputStream | sku=WXYZ, category=snacks, storeLocation=PHX, quantityRemaining=6, lastSoldPrice=18.5 |
- Notice that there are no entries for the writer.out:FeedSimulation_1 stream except at startup time. The writes performed by the writer are not logged to an output stream, but you can see evidence that they are happening by observing the changing data output by the reader.OutputStream stream (you can also read the shared table directly from a command prompt; see the note after these steps).
- [Optional] You can manually enter a sku value into Reader.sbapp using the Manual Input view:
  - In the Application Output view, select reader.OutputStream in the Stream drop-down list.
  - In the Manual Input view, select reader.FSControl from the Stream drop-down list.
  - Enter the tuple stop, null and click Send Data. This stops the stream of read operations on the shared Query Table driven by the feed simulation in Reader.sbapp. However, writes to the Query Table from the Writer.sbapp module continue unchanged.
  - In the Manual Input view, select reader.ManualFeed from the Stream drop-down list.
  - Enter one of the following case-sensitive values in the sku field:
    | SKU Literal | Represents |
    |---|---|
    | ABCD | SKU in the toys category |
    | EFGH | SKU in the electronics category |
    | LMNO | SKU in the books category |
    | QRST | SKU in the greeting cards category |
    | WXYZ | SKU in the snacks category |
    | IJKL | SKU in the sporting goods category |
    | PQRS | SKU in the games category |
  - In the Manual Input view, click Send Data. Observe that an entry appears in the Application Output view for the sku you entered.
  - If desired, enter start, null in the reader.FSControl stream to restart the feed-generated reads of the Query Table.
- When done, press F9 or click the Stop Running Application button.
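While the sample is running, you can also inspect the shared Query Table directly from a StreamBase Command Prompt instead of inferring its contents from reader.OutputStream. The sketch below assumes the server is listening on the default local URI, that your release's sbc client includes the readtable subcommand, and that the Query Table in TableHome.sbapp is named TransactionTable; check the module in Studio for the table's actual name:

    sbc readtable default.TransactionTable

Repeating the command should show quantityRemaining and lastSoldPrice changing as Writer.sbapp continues to update the table.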
To run this sample from the Linux or OS X terminal prompt, or from a StreamBase Command Prompt on Windows, use the following steps.
- Open three terminal windows or StreamBase Command Prompts. In each window, navigate to the directory in your Studio workspace that contains your copy of this sample.
- In window 1, run this sample's three applications, each in its own container, by launching the provided sbdeploy file:

    sbd cross-table.sbdeploy
- In window 2, enter the following command to dequeue tuples from the output stream of Reader.sbapp. Tuples are emitted right away because the Reader.sbapp module has an auto-starting Feed Simulation Adapter that issues a random sequence of valid sku values.

    sbc dequeue reader.OutputStream
- To stop the automatic read stream and enter sku values manually, follow these sub-steps:
  - In window 2, press Ctrl+C to stop the dequeue process.
  - In window 2, enter the following command to open the control stream in Reader.sbapp:

      sbc enq reader.FSControl
  - In window 2, enter the following tuple to stop the automatic stream of query reads:

      stop, null
  - In window 2, press Ctrl+C to close the enqueuer session.
  - In window 2, restart the dequeue on the Reader app's OutputStream, using the following command. This time, no tuples will automatically emit.

      sbc dequeue reader.OutputStream
  - In window 3, open an enqueue stream for manual input:

      sbc enq reader.ManualFeed
  - In window 3, enter one of the following valid sku values, one per line. The values are case-sensitive:

    | SKU Literal | Represents |
    |---|---|
    | ABCD | SKU in the toys category |
    | EFGH | SKU in the electronics category |
    | LMNO | SKU in the books category |
    | QRST | SKU in the greeting cards category |
    | WXYZ | SKU in the snacks category |
    | IJKL | SKU in the sporting goods category |
    | PQRS | SKU in the games category |
  - In window 2, observe the last-written record for the sku value entered, read from the shared Query Table.
- In windows 2 and 3, press Ctrl+C to close the enqueuer and dequeuer. Then run sbadmin shutdown to close the server.
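The sbc and sbadmin commands in this procedure assume a StreamBase Server running on the default local URI. If you launched sbd on a different host or port, pass that URI explicitly to each client command; the URI shown here is only an example:

    sbc -u sb://myhost:10000 dequeue reader.OutputStream
    sbadmin -u sb://myhost:10000 shutdown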
In StreamBase Studio, import this sample with the following steps:

- From the top menu, select the command that loads StreamBase samples.
- In the search field, type cross to narrow the list of samples.
- Select Create a shared table across containers from the Data Constructs and Operators category.
- Click OK.
StreamBase Studio creates a single project containing this sample.
When you load the sample into StreamBase Studio, Studio copies the sample project's files to your Studio workspace, which is normally part of your home directory, with full access rights.
Important
Load this sample in StreamBase Studio, and thereafter use the Studio workspace copy of the sample to run and test it, even when running from the command prompt.
Using the workspace copy of the sample avoids permission problems. The default workspace location for this sample is:
studio-workspace/cross-container-table

See Default Installation Directories for the default location of studio-workspace on your system.
In the default TIBCO StreamBase installation, this sample's files are initially installed in:
streambase-install-dir/sample/cross-container-table

See Default Installation Directories for the default location of streambase-install-dir on your system.