Java VM Memory Settings

This page discusses the settings that affect the amount of memory usable by EventFlow and LiveView fragments and by StreamBase Studio.

No Default Memory Settings

Unlike previous releases, StreamBase releases starting with 10.0 do not have internally set default Java VM memory settings for fragments and nodes. In the absence of explicit settings, the default values used are those of the JVM engine that runs the fragment. You can determine your JVM's defaults with a command like the following for macOS and Linux:

java -XX:+PrintFlagsFinal -version | grep HeapSize

Or use this version on Windows:

java -XX:+PrintFlagsFinal -version | findstr HeapSize

Also consult Oracle documentation on this subject.

In most cases, you only need to be concerned with the JVM settings for fragment launches as discussed in the following sections. For editing very large applications in Studio, you might need to separately increase Studio's memory footprint, as described in JVM Memory for StreamBase Studio.

Remember that increasing the JVM memory for a fragment can also increase the size of the fragment's node directory on disk. This is usually not an issue on a server deployment, but might be a concern on a development machine.

JVM Memory for EventFlow Fragments

Configure JVM memory for an EventFlow fragment in a HOCON configuration file of type javaengine, or of type sbengine, which is a superset of the javaengine type.

To increase the JVM heap memory for an EventFlow fragment, include a configuration file like the following in the src/main/configurations folder of your Studio project.

name = "sbengine"
version = "1.0.0"
type = "com.tibco.ep.streambase.configuration.sbengine"
configuration = {
  StreamBaseEngine = {
    ...
    jvmArgs = [
      "-Xmx2048m"
      "-Xms512m"
      "-XX:+UseG1GC"
      "-XX:MaxGCPauseMillis=500"
      "-XX:ConcGCThreads=1"
    ]
    streambase = {
    ...
    }
  }
}

JVM Memory for LiveView Fragments

LiveView fragments require at least 3072 MB of JVM heap memory and can use 8192 MB or more; the suggested minimum is 4096 MB.

Configure JVM memory for a LiveView fragment in a HOCON configuration file of type javaengine, or of type ldmengine, which is a superset of the javaengine type.

All LiveView samples included in StreamBase have a configuration file like the following example, and by default, when you create a LiveView fragment, Studio generates a similar file in the src/main/configurations folder of your Studio project. To increase the JVM heap memory for a LiveView fragment, increase the -Xmx setting in that file.

name = "ldmengine"
version = "1.0.0"
type = "com.tibco.ep.ldm.configuration.ldmengine"

configuration = {
  LDMEngine = {
    // Recommended minimum JVM 1.8 flags for LiveView
    jvmArgs = [
      "-Xmx3g"
      "-Xms512m"
      "-XX:+UseG1GC"
      "-XX:MaxGCPauseMillis=500"
      "-XX:ConcGCThreads=1"
    ]
    ldm = {
    }
  }
}

JVM Memory for StreamBase Studio

By default, StreamBase Studio allocates -Xms256m -Xmx1024m. In general, StreamBase users do not need to adjust the JVM memory settings for running Studio. When Studio runs or debugs a fragment on the same machine, it launches nodes as separate JVM processes with their own memory settings.

When Studio launches a fragment, it honors the jvmArgs settings in configuration files, as described above. Thus, Studio running with its default memory settings can launch nodes with larger memory settings defined in configuration, as long as there is enough system memory (including virtual memory) to support the launch.

An exception exists for those editing very large StreamBase modules with hundreds of components and many sub-modules. In this case, typechecking time and system response time can improve with larger Studio memory settings.

Adjust Studio's JVM memory settings by setting the STREAMBASE_STUDIO_VMARGS environment variable for the environment in which Studio starts, as described in STREAMBASE_STUDIO_VMARGS. For example, the following setting provides an increase in the default values:

STREAMBASE_STUDIO_VMARGS=-Xms1024M -Xmx2048M ...

Verify that your machine has enough system memory to support simultaneous editing and launching of large applications. 64-bit Windows and macOS systems should have at least 4 GB of RAM to take advantage of large JVM settings, with 8 GB the recommended minimum for editing and running LiveView projects.

Follow these rules when changing Studio JVM memory settings:

  • You can use the STREAMBASE_STUDIO_VMARGS environment variable to set Java system properties for Studio as well as its JVM memory settings, as described in STREAMBASE_STUDIO_VMARGS. Remember to always specify the memory settings when you add a property setting to the variable; this usually means re-specifying the default settings (see the example after this list).

  • Raise the -Xmx setting incrementally, stopping to test the results. Try adding 512M at a time to the -Xmx value, then run Studio to test its responsiveness:

    -Xms512M
    -Xmx1536M

    Next, try:

    -Xms1024M
    -Xmx2048M

    and so on.
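
Continuing the first rule above, the following setting adds a Java system property while re-specifying the default memory settings. The property name here is only a hypothetical placeholder; substitute the property you actually need:

STREAMBASE_STUDIO_VMARGS=-Xms256M -Xmx1024M -Dexample.property=true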

Native Code Competition for Memory Resources

Remember that StreamBase Studio runs in one Java VM and launches fragments into nodes, each of which runs in its own separate JVM engine process with its own memory constraints. Increasing only Studio's memory settings does not help the nodes hosting a large fragment; in fact, it can hamper them.

In a sense, Studio and Server must compete for memory resources on memory-constrained development systems. For this reason, only increase Studio's JVM memory settings by the minimum amount that supports acceptable response times for typechecking and editing large fragments. For example, on an 8 GB 64-bit system used for editing a large EventFlow module with dozens of subordinate modules, you might allocate up to 2 GB for Studio, and leave the rest for node launches.
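
As a rough illustration of that split (the values shown are only assumptions to adapt to your own project), you might start Studio with 2 GB:

STREAMBASE_STUDIO_VMARGS=-Xms1024M -Xmx2048M

and give the fragment's engine 4 GB in its sbengine configuration:

    jvmArgs = [
      "-Xmx4g"
      "-Xms512m"
    ]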

Any memory allocated by native code in the node startup process is allocated before the JVM engine starts. Thus, memory allocated by native code components is outside the JVM heap and competes for overall memory resources with both Studio and launched nodes. Native code memory allocations include any memory allocated by native code operators and adapters, and by native code DLL, .dylib, or .so libraries called by operators or adapters.

It bears repeating that Studio is designed for authoring, testing, and debugging fragments, but not for hosting high-performance runs of fragments and nodes, or for benchmarking node performance, as discussed in StreamBase Studio Performance.

Diagnostic Settings and Tools

You can configure StreamBase Studio to display its own Java heap memory usage. Open Window>Preferences, select the top-level General page, and select the Show heap status check box. Thereafter, Studio shows current and maximum reserved heap memory usage on the far right of the status bar, in the lower right corner of the Studio window.

To diagnose and troubleshoot memory usage with StreamBase Server, consider adding these additional settings to the engine configuration's jvmArgs array. These settings cause additional information to be reported to the server console (or to the Console view in Studio) unless specifically overridden.

  • To view information about the Java Just-In-Time (JIT) compiler (HotSpot), try adding:

    -XX:+PrintCompilation
  • If the JVM is using an excessive amount of memory, or spending too much time performing garbage collection, try adding this argument to get more specific information:

    -verbose:gc

    With this argument, the JVM periodically generates output telling you how large the Java heap size is and how much time is being spent in garbage collection.
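
For example, a minimal sketch of an sbengine configuration with both diagnostic flags added to the jvmArgs array might look like the following (the heap values are placeholders; keep whatever values your project already uses):

name = "sbengine"
version = "1.0.0"
type = "com.tibco.ep.streambase.configuration.sbengine"
configuration = {
  StreamBaseEngine = {
    jvmArgs = [
      "-Xmx2048m"
      "-Xms512m"
      // Report JIT (HotSpot) compilation activity
      "-XX:+PrintCompilation"
      // Report garbage collection activity and heap sizes
      "-verbose:gc"
    ]
  }
}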

You can also use the following tools, provided by StreamBase or included as part of its Java installation, to inspect memory usage. The tools are listed in order from module-level to Java object-level inspection; example commands follow the list.

  • Use StreamBase profiling to review operator and queue changes over time.

  • Use the Java JConsole utility to see the JVM-level memory use pattern, real-time objects, and threads.

  • Use the Java jmap utility to see what Java objects are on the heap at any point in time. You can also use the Eclipse plug-in MAT (Memory Analyzer Tool) to show the output of jmap in graphical form.

  • Use the Java jstack utility to see what Java threads are working at any point in time.
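
As a rough sketch, assuming you know the process ID of the node's JVM engine (shown here as the placeholder <pid>), you might run the JDK tools like this from a terminal:

jconsole <pid>
jmap -histo <pid>
jmap -dump:format=b,file=node-heap.hprof <pid>
jstack <pid>

The heap dump file written by jmap (node-heap.hprof is just a name chosen for this example) can then be opened in MAT for graphical analysis.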