LiveView Adapters Sample

Introduction

This sample uses the LiveView adapters to query, publish to, delete from, and create or drop tables on a running LiveView server from an EventFlow application. This sample assumes you are running the Hello LiveView sample on localhost. A module parameter named LIVEVIEW_URI lets you point all of the adapters at a different LiveView server.
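
The parameter value is a LiveView URI. The example below shows the form such a value takes; the host name and port here are placeholders (the sample's default targets the Hello LiveView server running on localhost):

lv://myserver:11080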

Importing These Samples into StreamBase Studio

In StreamBase Studio, import this sample and the companion Hello LiveView sample with the following steps:

  • From the top-level menu, select File>Import Samples and Community Content.

  • Open the Spotfire LiveView category.

  • Hold Ctrl (Windows) or command (Mac) to make multiple selections, then select the Hello LiveView and LiveView Adapters samples.

  • Click Import Now.

StreamBase Studio creates a separate project for each sample.

Running the LiveView Adapter Samples

  1. First, run the Hello LiveView sample:

    • When this sample loads, it opens the LiveView Project Viewer, showing four icons. To start the sample, click the Run button in the upper right corner of the Project Viewer (or select the project's name in the Project Explorer view, right-click, and select Run As>LiveView Fragment).

    • The Console view shows several messages as the LiveView Server compiles the project and starts. Wait for the console message All tables have been loaded before proceeding to the next step.

  2. In the Project Explorer view, navigate back to the sample_adapter_embedded_lv-sbd folder.

  3. Open the src/main/eventflow/packageName folder.

  4. Double-click to open the lv2sbd.sbapp module.

  5. Make sure the module is the currently active tab in the EventFlow Editor, then click the Run button in Studio's top-level menu. This opens the SB Test/Debug perspective and starts the module.

  6. In the Output Streams view, observe tuples emitted by various streams.

    • The ReadyStatus, QueryStatus, PublishStatus, DeleteStatus, and AlertStatus streams all show the LiveView Server status as CONNECTED.

    • The QueryStatus stream shows the status of the two preconfigured queries:

      select * from ItemsSales where category=='book'
      select avg(LastSoldPrice) as AvgSoldPrice, category from ItemsSales where true group by category
    • The QueryOut stream shows the output from the first query.

    • The JSONQueryOut stream shows the output from the second query.
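
    To cross-check these results outside Studio, you can run the same two queries with the lv-client command-line tool against the Hello LiveView server. The -p value below assumes the server is listening on port 11080, as mentioned in step 8; adjust it to match your server's client port:

      lv-client -p 11080 live "select * from ItemsSales where category=='book'"
      lv-client -p 11080 live "select avg(LastSoldPrice) as AvgSoldPrice, category from ItemsSales where true group by category"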

  7. Using the QueryIn, PublishIn, DeleteIn, and CreateDropIn streams, you can register additional queries, publish data, delete data, or create or drop tables on the configured LiveView server. See each adapter's documentation for the meaning of the fields on these input streams.
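
    For example, LiveQL statements such as the following (illustrative only, using the Hello LiveView ItemsSales table) are the kind of query and delete text you might supply through those streams; each adapter's documentation describes which input field carries the statement or predicate:

      select * from ItemsSales where LastSoldPrice > 100.0
      delete from ItemsSales where category=='book'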

  8. Adapters with the suffix RuntimeURI have the Use Runtime URIs feature enabled. They do not connect to LiveView on startup; instead, they wait for a URI to be sent through their command input ports. If Hello LiveView is running on lv://localhost:11080, you can send a tuple with Connect in the ControlAction field and lv://localhost:11080 in the URI field to tell the adapter to connect to that LiveView server. Adapters that share a common Connection Key also share a connection. This sample features a Query adapter and a Publish adapter that share a connection, and a Delete adapter with a different Connection Key value that does not share that connection. Sending a tuple with Disconnect as the ControlAction disconnects all adapters in a shared connection pool.
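
    For example, to connect the RuntimeURI adapters to the local Hello LiveView server and later disconnect them, you can send tuples with field values like these from Studio's Manual Input view (ControlAction and URI are the field names described above; see the adapter documentation for any remaining fields):

      ControlAction: Connect      URI: lv://localhost:11080
      ControlAction: Disconnect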

  9. When done:

    1. Press F9 or click the Terminate EventFlow Fragment button to stop the EventFlow fragment.

    2. Press Ctrl+F9 (Windows) or command+F9 (Mac), or click the Terminate LiveView Fragment button to stop the Hello LiveView server.

Importing the Reliable Publish Sample

To run this sample, you must also import the Recovery, Kafka sample, then edit and run it separately. Prepare to run the Reliable Publish sample with the following steps:

  • From the top-level menu, click File>Import Samples and Community Content.

  • Open the Spotfire LiveView category.

  • Hold Ctrl (Windows) or command (Mac) to make multiple selections, then select the LiveView Adapters and Recovery, Kafka samples.

  • Click Import Now.

StreamBase Studio creates separate projects for each sample. The Recovery, Kafka sample comes in as seven separate Studio projects.

Running the Reliable Publish Sample

The Reliable Publish sample demonstrates one aspect of the LiveView Client reliable publish interface: a long-lived publisher can deliver data to a LiveView server across a server interruption, whether a communications failure or a server crash, and ensure that all the data it sent is present in the recovered LiveView table. To do this, the publisher must store the tuples it sends until the server acknowledges them. That task is often handled by some form of persistent message bus, but this sample uses only a feed simulation data source and an in-memory Query Table to hold the tuples waiting to be acknowledged.

For more information on reliable publishing and LiveView server recovery from failure, see LiveView Data Recovery.

  1. First, prepare the Recovery, Kafka sample.

    • Of the seven project folders imported by this sample, you only deal with the one named lv_sample_kafka_recovery.

    • To avoid the dependencies required by Kafka, and to simplify this Reliable Publish sample, delete the LVPublisher.lvconf file from the src/main/liveview folder in the lv_sample_kafka_recovery project.

    • If red error markers remain on any of the seven folders, right-click any of them and select Maven>Update Project. Select the six project folders that begin with lv_sample_kafka and click OK.

    • Select the lv_sample_kafka_recovery project's name in the Project Explorer view, right-click and select Run As>LiveView Fragment.

    • The Console view shows several messages as the LiveView Server compiles the project and starts. Wait for the console message All tables have been loaded before proceeding to the next step.

  2. Using the lv-client command line tool, query the Orders table:

    lv-client -p 10087 live "select * from Orders"
  3. In the Project Explorer view, navigate to the sample_adapter_embedded_lv-sbd folder.

  4. Open the src/main/eventflow/packageName folder.

    Select the ReliablePublish.sbapp module, right-click and select Run As>EventFlow Fragment.

  5. The publisher begins publishing to the LiveView server. In the Console view, notice that the OrderID field is an incrementing long value. The OrderID number matches the Row# field in the output of the lv-client command.

  6. In the Debug view, select the lv_sample_kafka_recovery project, right-click and select Terminate. This disconnects and stops output from the lv-client command.

    Note

    Do not use the usual Terminate LiveView Fragment button to stop the server for this test, because doing so would remove the entire node directory for the LiveView project.

  7. Re-run the lv_sample_kafka_recovery sample as a LiveView Fragment.

  8. When the LiveView recovery sample is ready, the Publish adapter named ReliablePublisher connects to the restarted LiveView server, identifies the highest sequence number it has stored, and replays all missing tuples from that point forward.

    You can start and stop the LiveView recovery sample many times, or use the lv-client killsession command to kill the publisher. In all cases, rows are never dropped. One easy way to validate this is to confirm that the highest OrderID number in the Console view matches the total number of rows in the table shown by the lv-client command.
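
    After the server restarts, you can re-issue the live query from step 2 to watch the recovered table and compare its row count with the publisher's console output:

      lv-client -p 10087 live "select * from Orders"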

  9. When done:

    1. Press F9 or click the Terminate EventFlow Fragment button to stop the EventFlow fragment.

    2. Press Ctrl+F9 (Windows) or command+F9 (Mac), or click the Terminate LiveView Fragment button to stop the LiveView server.

Sample Location

When you load the sample into StreamBase® Studio, Studio copies the sample project's files to your Studio workspace, which is normally part of your home directory, with full access rights.

Important

Load this sample in StreamBase® Studio, and thereafter use the Studio workspace copy of the sample to run and test it, even when running from the command prompt.

Using the workspace copy of the sample avoids permission problems. The default workspace location for this sample is:

studio-workspace/sample_adapter_embedded_lv-sbd

See Default Installation Directories for the default location of studio-workspace on your system.