Google Protocol Buffer Input Adapter

Introduction

The TIBCO StreamBase® Input Adapter for Google Protocol Buffers (Protobuf for short) parses Protobuf-encoded byte arrays into tuples, according to a message descriptor.

You can either provide a .desc file describing the desired message type, or, by default, let the adapter generate an internal descriptor automatically from the input tuple's schema.

Adapter Properties

This section describes the properties you can set for this adapter, using the various tabs of the Properties view in StreamBase Studio.

General Tab

Name: Use this required field to specify or change the name of this instance of this component, which must be unique in the current EventFlow module. The name must contain only alphabetic characters, numbers, and underscores, and no hyphens or other special characters. The first character must be alphabetic or an underscore.

Adapter: A read-only field that shows the formal name of the adapter.

Class name: Shows the fully qualified class name that implements the functionality of this adapter. If you need to reference this class name elsewhere in your application, you can right-click this field and select Copy from the context menu to place the full class name in the system clipboard.

Start options: This field provides a link to the Cluster Aware tab, where you configure the conditions under which this adapter starts.

Enable Error Output Port: Select this check box to add an Error Port to this component. In the EventFlow canvas, the Error Port shows as a red output port, always the last port for the component. See Using Error Ports to learn about Error Ports.

Description: Optionally enter text to briefly describe the component's purpose and function. In the EventFlow Editor canvas, you can see the description by pressing Ctrl while the component's tooltip is displayed.

Adapter Properties Tab

Property Type Description
File Name resource chooser A compiled Google Protobuf descriptor file (.desc).
Message Index int A descriptor file can contain more than one message type; this index specifies which message type to use. If no file is provided above, this property is disabled.
Protobuf Field Name string The field name of the corresponding Protobuf-type byte array (optional).
Timestamp Format string Determines the format used for values of the StreamBase Timestamp data type.
Enable Pass Through Fields check box Select this check box to pass all input fields through with the output.
Pass Through Fields Field Name string The name of the field that holds the pass-through fields.
Log Level drop-down list Controls the level of verbosity the adapter uses to issue informational traces to the console. This setting is independent of the containing application's overall log level. Available values, in increasing order of verbosity, are: OFF, ERROR, WARN, INFO, DEBUG, TRACE.
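To illustrate the Message Index property, here is a hypothetical .proto file (file and message names are invented for this sketch) that declares two top-level message types. After compilation, the descriptor set contains both, and the index selects which one the adapter uses, presumably in declaration order:

```proto
// Hypothetical file, stocks.proto; names invented for this sketch.
// Compile with: protoc stocks.proto --descriptor_set_out=stocks.desc
syntax = "proto3";

message Quote {        // presumably selected by Message Index 0
  string symbol = 1;
  double price = 2;
}

message Trade {        // presumably selected by Message Index 1
  string symbol = 1;
  int64 quantity = 2;
}
```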

Edit Schemas Tab

Property Type Description
Proto Output Schema schema The schema of the output tuples produced from the Protobuf-type byte array.

Cluster Aware Tab

Use the settings in this tab to allow this operator or adapter to start and stop based on conditions that occur at runtime in a cluster with more than one node. During initial development of the fragment that contains this operator or adapter, and for maximum compatibility with TIBCO Streaming releases before 10.5.0, leave the Cluster start policy control in its default setting, Start with module.

Cluster awareness is an advanced topic that requires an understanding of StreamBase Runtime architecture features, including clusters, quorums, availability zones, and partitions. See Cluster Awareness Tab Settings on the Using Cluster Awareness page for instructions on configuring this tab.

Concurrency Tab

Use the Concurrency tab to specify parallel regions for this instance of this component, or multiplicity options, or both. The Concurrency tab settings are described in Concurrency Options, and dispatch styles are described in Dispatch Styles.

Caution

Concurrency settings are not suitable for every application, and using these settings requires a thorough analysis of your application. For details, see Execution Order and Concurrency, which includes important guidelines for using the concurrency options.

Data Input Port

The data port is the default input port for the Protobuf2Tuple adapter, and is always enabled. Use the data port to receive the Protobuf-type byte arrays that the adapter converts into tuples.

The following table shows how StreamBase data types map to Protobuf data types:

StreamBase Data Type Protobuf Data Type
STRING STRING
DOUBLE DOUBLE/FLOAT
BOOL BOOLEAN/STRING
LONG INT64
DOUBLE INT64
TUPLE MESSAGE
INT INT32
BLOB BYTE_STRING
TIMESTAMP LONG/STRING
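To make the scalar rows of this mapping concrete, the following standalone Python sketch (not part of the adapter; field numbers and values are invented) hand-encodes an INT32 value as a varint and a DOUBLE as a 64-bit little-endian float, which is what the corresponding bytes look like on the wire inside a Protobuf-type byte array:

```python
# Standalone sketch of Protobuf wire encoding for the scalar mappings above.
# Field numbers and values are hypothetical; this is not the adapter's code.
import struct

def varint(n: int) -> bytes:
    """Encode a non-negative integer as a Protobuf base-128 varint."""
    out = bytearray()
    while True:
        b = n & 0x7F
        n >>= 7
        if n:
            out.append(b | 0x80)  # high bit set: more bytes follow
        else:
            out.append(b)
            return bytes(out)

def tag(field_number: int, wire_type: int) -> bytes:
    """A field's key: (field_number << 3) | wire_type, itself a varint."""
    return varint((field_number << 3) | wire_type)

# INT32/INT64 values use wire type 0 (varint):
int_field = tag(1, 0) + varint(150)                # hypothetical field 1 = 150
# DOUBLE values use wire type 1 (64-bit little-endian IEEE 754):
double_field = tag(2, 1) + struct.pack("<d", 1.5)  # hypothetical field 2 = 1.5

print(int_field.hex())     # 089601
print(double_field.hex())  # 11000000000000f83f
```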

Data Output Port

  • With a descriptor file: the output tuple's schema is generated from the provided file.

  • Without a descriptor file: an internal descriptor is generated from the input data's schema.

  • Pass Through Fields: if enabled, all input data is passed through together with the output tuple.

Customized Descriptor

When you want to customize the output schema for the message, you must provide a descriptor file compiled from a Google Protobuf .proto file (both proto2 and proto3 are supported).

You can generate the descriptor file from the command line:

protoc protoFile.proto --descriptor_set_out=descFile.desc

If your .proto file imports other definitions (as the example below imports descriptor.proto), add the --include_imports option so that the imported files are bundled into the descriptor set.

Below is a proto file example:

syntax = "proto3";
import "google/protobuf/descriptor.proto";

message Input {
  string name = 1;
  int32 id = 2;  // Unique ID number for this person.
  Address add = 3;
  repeated PhoneNumber phones = 4;  // List

  message PhoneNumber {
    string number = 1;
  }

  message Address {
    string zip = 1;
    string street = 2;
  }
}
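As a rough illustration of the byte array this example describes, the following standalone Python sketch (values invented; not the adapter's implementation) hand-encodes the name, id, and add fields of the Input message using the Protobuf wire format:

```python
# Hand-encoding of a byte array matching the example Input message above.
# The values ("Ann", 42, "02139") are invented; this sketch is not the
# adapter's implementation, just the Protobuf wire format in plain Python.

def varint(n: int) -> bytes:
    """Encode a non-negative integer as a Protobuf base-128 varint."""
    out = bytearray()
    while True:
        b = n & 0x7F
        n >>= 7
        if n:
            out.append(b | 0x80)  # high bit set: more bytes follow
        else:
            out.append(b)
            return bytes(out)

def tag(field_number: int, wire_type: int) -> bytes:
    """A field's key: (field_number << 3) | wire_type, itself a varint."""
    return varint((field_number << 3) | wire_type)

def ld(field_number: int, payload: bytes) -> bytes:
    """Length-delimited field (wire type 2): strings and nested messages."""
    return tag(field_number, 2) + varint(len(payload)) + payload

name = ld(1, b"Ann")            # string name = 1
ident = tag(2, 0) + varint(42)  # int32 id = 2 (varint, wire type 0)
add = ld(3, ld(1, b"02139"))    # Address add = 3, with zip = "02139"
message = name + ident + add

print(message.hex())  # 0a03416e6e102a1a070a053032313339
```

A byte array with this content, arriving on the adapter's data port alongside the descriptor compiled from the example, would decode into a tuple with those field values.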