This topic describes how to run the sample applications for the Apache Kafka Adapter Suite, and illustrates how to use the Kafka adapters when connecting a StreamBase application to a Kafka message broker.
The first sample, kafka.sbapp, demonstrates a complete process of connecting to a Kafka broker with a consumer and producer and sending messages. The second sample, kafkaCustomSerialize.sbapp, is a similar example but uses the custom serializer to serialize and deserialize tuple messages to and from the broker using a JSON string format. A third sample, kafkaAdmin.sbapp, demonstrates the use of the Kafka Admin adapter, which can be used to create and delete topics, list all topics on the cluster, and list all brokers on the cluster.
All samples assume a Kafka cluster is accessible from the host machine. The provided default is localhost on port 9092, but you can change this by editing the corresponding adapter properties in each sample.
The Apache Kafka adapter suite is implemented against the version of the Kafka libraries listed on the Supported Configurations page.
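The samples do not start a Kafka broker for you. If you want to confirm that your broker is reachable before running them, a minimal standalone check using the Kafka Java client's AdminClient might look like the following sketch. The class name, the bootstrap address (the samples' default of localhost:9092), and the 10-second timeout are illustrative assumptions, not part of the samples.

    import java.util.Properties;
    import java.util.concurrent.TimeUnit;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;

    // Standalone check that a Kafka cluster is reachable before running the samples.
    // Replace localhost:9092 with your own broker address if it differs.
    public class BrokerCheck {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            try (AdminClient admin = AdminClient.create(props)) {
                // describeCluster() times out if no broker answers at that address.
                int brokerCount = admin.describeCluster().nodes()
                        .get(10, TimeUnit.SECONDS).size();
                System.out.println("Reachable brokers: " + brokerCount);
            }
        }
    }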
In StreamBase Studio, import this sample with the following steps:
1. From the top-level menu, select File > Import Samples and Community Content.
2. Enter Kafka to narrow the list of options.
3. Select Apache Kafka Producer, Consumer, and Admin adapters from the StreamBase Messaging Adapters category.
4. Click Import Now.
StreamBase Studio creates a single project for the Kafka adapter samples in your current Studio workspace.
To run the kafkaAdmin.sbapp sample:
1. In the Project Explorer view, open the project you just loaded. If you see red marks on a project folder, wait a moment for the project to load its features. If the red marks do not resolve themselves after a minute, select the project, right-click, and select Maven > Update Project from the context menu.
2. Open the src/main/eventflow/packageName folder.
3. Open the kafkaAdmin.sbapp file. On the canvas, select the Kafka Admin adapter and edit its Bootstrap Servers (Admin adapter) property to correctly access your Kafka cluster.
4. Click the Run button. This opens the SB Test/Debug perspective and starts the module.
5. In the Manual Input view, select the AdminCommand input stream.
6. Enter createTopic in the command field and topic1 in the topic field, and click Send Data to create this topic. Note that if this topic already exists in your Kafka cluster, the command fails harmlessly (as evidenced by a tuple on the ConsumerStatus port). A standalone Kafka AdminClient sketch of the createTopic and topics operations appears after this list.
7. Still in the AdminCommand input stream, enter topics into the command field (all other fields are ignored for this command).
8. Click Send Data.
9. Observe the output on the TopicConfigs output stream, listing the topic1 topic created in step 6 (along with any other topics already present in your cluster).
10. When done, press F9 or click the Terminate EventFlow Fragment button.
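Outside of StreamBase, a roughly equivalent version of the createTopic and topics commands can be issued with the plain Kafka AdminClient, as in the sketch below. The topic name matches the procedure above, but the class name, partition count, and replication factor are assumptions suited to a single-broker test cluster, not settings taken from the adapter.

    import java.util.Collections;
    import java.util.Properties;
    import java.util.Set;
    import java.util.concurrent.ExecutionException;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.NewTopic;
    import org.apache.kafka.common.errors.TopicExistsException;

    public class AdminCommands {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            try (AdminClient admin = AdminClient.create(props)) {
                // createTopic: one partition, replication factor 1 (single-broker setup).
                NewTopic topic = new NewTopic("topic1", 1, (short) 1);
                try {
                    admin.createTopics(Collections.singleton(topic)).all().get();
                } catch (ExecutionException e) {
                    // Mirrors the sample's behavior: creating an existing topic is not fatal.
                    if (!(e.getCause() instanceof TopicExistsException)) throw e;
                }

                // topics: list every topic name known to the cluster.
                Set<String> names = admin.listTopics().names().get();
                names.forEach(System.out::println);
            }
        }
    }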
To run the kafka.sbapp sample:
1. In the Project Explorer view, open the project you just loaded. If you see red marks on a project folder, wait a moment for the project to load its features. If the red marks do not resolve themselves after a minute, select the project, right-click, and select Maven > Update Project from the context menu.
2. Open the src/main/eventflow/packageName folder.
3. Open the kafka.sbapp file. On the canvas, select each Kafka adapter and edit its Brokers property to correctly access your Kafka cluster.
4. Click the Run button. This opens the SB Test/Debug perspective and starts the module.
5. In the Manual Input view, select the ConsumerControl input stream.
6. Enter subscribe in the command field and topic1 in the topic field, and click Send Data to subscribe to this topic.
7. In the Manual Input view, select the PublishIn input stream.
8. Enter topic1 into the topic field and any message text into the message field.
9. Click Send Data to send the message to Kafka.
10. Observe your message emitted on both the PublishData and KafkaMessage output streams. A plain Kafka producer and consumer sketch of this publish-and-subscribe round trip appears after this list.
11. When done, press F9 or click the Terminate EventFlow Fragment button.
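For reference, the same round trip the sample performs can be reproduced with the plain Kafka Java client, subscribing first and then publishing a message to topic1. This is an illustrative sketch only: the class name, consumer group id, poll timeout, and message text are assumptions, and the sample's adapters handle these details for you.

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class ProduceConsumeRoundTrip {
        public static void main(String[] args) {
            String brokers = "localhost:9092";

            // Subscribe before publishing, as the sample does, so the message is not missed.
            Properties cProps = new Properties();
            cProps.put("bootstrap.servers", brokers);
            cProps.put("group.id", "kafka-sample-demo");
            cProps.put("auto.offset.reset", "earliest");
            cProps.put("key.deserializer", StringDeserializer.class.getName());
            cProps.put("value.deserializer", StringDeserializer.class.getName());
            KafkaConsumer<String, String> consumer = new KafkaConsumer<>(cProps);
            consumer.subscribe(Collections.singleton("topic1"));

            // Publish one message, as the PublishIn stream does in the sample.
            Properties pProps = new Properties();
            pProps.put("bootstrap.servers", brokers);
            pProps.put("key.serializer", StringSerializer.class.getName());
            pProps.put("value.serializer", StringSerializer.class.getName());
            try (KafkaProducer<String, String> producer = new KafkaProducer<>(pProps)) {
                producer.send(new ProducerRecord<>("topic1", "hello from the sample"));
            }

            // Read it back; a real application would poll in a loop.
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(10));
            for (ConsumerRecord<String, String> record : records) {
                System.out.println("Received: " + record.value());
            }
            consumer.close();
        }
    }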
To run the kafkaCustomSerialize.sbapp sample:
1. Continuing in the same project, open the kafkaCustomSerialize.sbapp file, edit its adapters' Brokers property to point to your Kafka cluster, and click the Run button. This opens the SB Test/Debug perspective and starts the module.
2. In the Manual Input view, select the ConsumerControl input stream.
3. Enter subscribe in the command field and topic1 in the topic field, and click Send Data to subscribe to this topic.
4. In the Manual Input view, select the PublishIn input stream.
5. Enter topic1 into the topic field and any test values into the Field1 and Field2 subfields of the message field.
6. Click Send Data.
7. Observe your message emitted on the KafkaMessage output stream. A minimal JSON serializer and deserializer sketch appears after this list.
8. When done, press F9 or click the Terminate EventFlow Fragment button.
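The custom serializer sample converts tuple messages to and from a JSON string format. The StreamBase-specific serializer class the sample uses is not shown here; as a rough illustration of the same idea, the sketch below implements Kafka's generic Serializer and Deserializer interfaces for a hypothetical two-field message using the Jackson JSON library. The Message class, its field names (mirroring the sample's Field1 and Field2 subfields), and the Jackson dependency are all assumptions.

    import java.nio.charset.StandardCharsets;
    import com.fasterxml.jackson.databind.ObjectMapper;
    import org.apache.kafka.common.serialization.Deserializer;
    import org.apache.kafka.common.serialization.Serializer;

    // Hypothetical message type mirroring the sample's Field1/Field2 subfields.
    class Message {
        public String Field1;
        public int Field2;
    }

    // Minimal JSON serializer using Kafka's generic Serializer interface.
    class MessageJsonSerializer implements Serializer<Message> {
        private final ObjectMapper mapper = new ObjectMapper();
        @Override
        public byte[] serialize(String topic, Message data) {
            try {
                return mapper.writeValueAsString(data).getBytes(StandardCharsets.UTF_8);
            } catch (Exception e) {
                throw new RuntimeException("Could not serialize message", e);
            }
        }
    }

    // Matching JSON deserializer for the same message type.
    class MessageJsonDeserializer implements Deserializer<Message> {
        private final ObjectMapper mapper = new ObjectMapper();
        @Override
        public Message deserialize(String topic, byte[] data) {
            try {
                return mapper.readValue(data, Message.class);
            } catch (Exception e) {
                throw new RuntimeException("Could not deserialize message", e);
            }
        }
    }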
When you load the sample into StreamBase® Studio, Studio copies the sample project's files to your Studio workspace, which is normally part of your home directory, with full access rights.
Important
Load this sample in StreamBase® Studio, and thereafter use the Studio workspace copy of the sample to run and test it, even when running from the command prompt.
Using the workspace copy of the sample avoids permission problems. The default workspace location for this sample is:
studio-workspace/sample_adapter_embedded_kafka
See Default Installation Directories for the default location of studio-workspace on your system.