EventFlow: Communication with Containerized Apps

It is a best practice to use docker exec to communicate with your app while it runs in Docker. This command asks Docker to run your epadmin commands inside the Docker container and return the results.
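
In general, running epadmin through docker exec takes the following form, where containerName, nodeName, command, and target are placeholders for the concrete values shown later on this page:

docker exec containerName epadmin [--servicename=nodeName] command target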

Prerequisites

This page assumes you have the docker_1st application running in Docker as described in EventFlow: Creating Docker Images and EventFlow: Run and Manage with Docker Commands.

To confirm, run the docker ps command and look for a container with the alias name firstapp:

CONTAINER ID IMAGE             COMMAND                 CREATED        STATUS        PORTS  NAMES
3c9ff715ada5 docker_1st:1.0.0  "/bin/sh -c ${STREAM…"  15 minutes ago Up 15 minutes        firstapp
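
If you have many containers running, you can optionally narrow the listing to this container with a standard docker ps name filter:

docker ps --filter name=firstapp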

Run epadmin Using docker exec Commands

You can run any command available inside your Docker container, as long as you know the container name you assigned with the --name option of the docker run command. This lets you run any StreamBase epadmin command by prefixing it with docker exec aliasName. For example, use the following command to see the StreamBase Runtime environment in the container:

docker exec firstapp epadmin display services

To run most other epadmin commands inside the container, you must know the node name you assigned with the STREAMING_NODENAME variable. For example:

docker exec firstapp epadmin --servicename=A.cluster display node
docker exec firstapp epadmin --servicename=A.cluster display engine
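
For reference, both the firstapp container name and the A.cluster node name were assigned when the image and container were created on the previous pages. If you were supplying the node name at run time, a docker run command might look like the following; this is illustrative only, so use the actual command shown on the previous page:

docker run -d --name firstapp -e STREAMING_NODENAME=A.cluster docker_1st:1.0.0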

Sending Data Manually

Run epadmin commands in the Docker container so that the containerized app receives data sent from the Docker host and returns the results:

  1. Open a StreamBase Command Prompt (Windows) or StreamBase-configured shell prompt (macOS). This is window 1.

  2. To verify that you are not running a StreamBase application locally on your host machine, run epadmin display services. The command returns silently if no services are running.

    (This command might return a list of services on other machines in your local subnet. In this case, run:

    epadmin display services --servicename=clustername

    where clustername is usually your system login name. See Clusters.)

  3. Connect to the StreamBase node running in the Docker container, and run the same display services command, prefixed with docker exec firstapp.

    docker exec firstapp epadmin display services
  4. Compare the output of the two epadmin display services commands. This confirms that you are not running a StreamBase application locally but are running one in the Docker container.

  5. In window 1, set up a dequeue for the first output stream. You must use the -it options to specify an interactive TTY session:

    docker exec -it firstapp epadmin --servicename=A.cluster dequeue stream --path=BigTrades

    (Remember that the stream names are defined in the firstapp.sbapp EventFlow module now running inside the container.)

  6. Open window 2 and set up a dequeue for the other output stream.

    docker exec -it firstapp epadmin --servicename=A.cluster dequeue stream --path=AllTheRest

    Notice that no output is produced yet. The command windows are waiting to show output from the two output streams.

  7. Open window 3 and set up an enqueue to the app's input stream. There is only one input stream in this EventFlow module, so it does not need to be named:

    docker exec -it firstapp epadmin --servicename=A.cluster enqueue stream
  8. Send data manually. In window 3, send the following tuples:

    • IBM, 4000

    • MSFT, 5600

    • GOOG, 12000

  9. Observe the following in window 1:

    • [A.cluster] Tuple = GOOG,12000

    And the following in window 2:

    • [A.cluster] Tuple = IBM,4000

    • [A.cluster] Tuple = MSFT,5600

  10. Type Ctrl+C in window 3 to stop the enqueue process. Leave the two dequeue command windows running as-is.

Sending Data with a Feed Simulation

Next, run a feed simulation to enqueue tuples:

  1. In window 3, run:

    docker exec -it firstapp epadmin --servicename=A.cluster start playback 
      --sim=/var/opt/tibco/streambase/node/A.cluster/application/fragments/
           com.tibco.sb.sample.firstapp/firstapp-enum.sbfs

    This command is split onto three lines for clarity. Remember to enter it as one long line, closing up the pathname to the feed simulation file so that it contains no spaces.

  2. Data now flows to the two output streams, and therefore to the dequeue sessions in windows 1 and 2. Only trades with stock quantities over 10000 go to the BigTrades stream dequeued in window 1.

  3. To end the feed simulation, run:

    docker exec -it firstapp epadmin --servicename=A.cluster stop playback
  4. Type Ctrl+C in windows 1 and 2 to stop the dequeue processes. Close the three windows as needed.

  5. When done, you can stop and remove the container as described in Stopping and Removing the Container on the previous page.
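
    Assuming the container is still named firstapp, those commands generally take the following form:

    docker stop firstapp
    docker rm firstapp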