Build a real-time monitoring system with a Kafka broker and a WebSocket-powered dashboard. Here is a complete tutorial covering the warp10-kafka-plugin, the Warp 10 accelerator feature, and a Discovery dashboard.

By following this tutorial, you will learn how to use our Kafka plugin, the Warp 10 accelerator, and WebSockets within Discovery dashboards.
In this tutorial, we will assume that your computer's IP address is 192.168.1.1 (you can use ipconfig on Windows or ifconfig on Linux to find your own IP).
Start a local Kafka broker
For convenience, we will use Docker images and a docker-compose file to start our broker. Of course, do not use this setup in production; it is for test purposes only.
Create a file named docker-compose.yml (customize it with your IP address):
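The compose file itself was not preserved here; the sketch below is one way to run a single-node broker, using the wurstmeister Kafka image and the official ZooKeeper image (the images and the topic-free setup are assumptions, the original post may use different ones):

```yaml
version: '3'
services:
  zookeeper:
    image: zookeeper
    ports:
      - "2181:2181"
  kafka:
    image: wurstmeister/kafka
    ports:
      - "9092:9092"
    environment:
      # Advertise your own IP so clients outside the compose network can connect
      KAFKA_ADVERTISED_HOST_NAME: 192.168.1.1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
```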
and start it with:
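The start command was stripped from this copy; from the directory containing docker-compose.yml it is the usual:

```shell
# Start the broker in the background (newer Docker versions: `docker compose up -d`)
docker-compose up -d
```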
The Kafka broker is up and running.
Start and configure a local Warp 10 instance
Also for convenience, we will use our Docker image. In this tutorial, we use a persistent volume in /opt/warp10_3.0 on our local computer.
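The run command was not preserved; as a sketch (the image tag, exposed ports, and the in-container mount point are assumptions to check against the Warp 10 Docker documentation):

```shell
# Run Warp 10 with a persistent volume on the host
docker run -d --name warp10 \
  -p 8080:8080 -p 8081:8081 \
  -v /opt/warp10_3.0:/data \
  warp10io/warp10:3.0
```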
Warp 10 is up and running.
Tokens generation
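The commands for this step were stripped. As a hedged sketch (the script path and the envelope file name are assumptions; check the Warp 10 token documentation), tokens are generated with the bundled tokengen tool, fed with a .mc2 envelope describing the application, owner, and validity of the READ and WRITE tokens:

```shell
# Generate a read/write token pair inside the running container
# (paths and envelope file are placeholders for illustration).
docker exec -it warp10 /opt/warp10/bin/warp10.sh tokengen /opt/warp10/tokens/demo-tokengen.mc2
```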
You should get a result like this:
Now, we have to install our Kafka plugin.
For convenience, we have to change the permissions of the /opt/warp10_3.0/warp10 directory. Do not do this in production. The user ID inside the Warp 10 Docker image is 942; you can also create a local user with the same UID.
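A sketch of the permission change (again: test setups only):

```shell
# 942 is the UID of the warp10 user inside the Docker image
sudo chown -R 942:942 /opt/warp10_3.0
```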
Kafka plugin
You can do this manually or take advantage of our WarpFleet Repository mechanism.
The WarpFleet way
First, you need Gradle on your computer.
Read more about the WarpFleet Gradle plugin.
In /opt/warp10_3.0/warp10, create a file named build.gradle:
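The file content was not preserved; the sketch below is an assumption — the plugin id and version must be checked against the WarpFleet Gradle plugin documentation:

```groovy
// build.gradle sketch: pull in the WarpFleet Gradle plugin
// (plugin id and version are assumptions, verify them in the WarpFleet docs)
plugins {
  id 'io.warp10.warpfleet' version '0.0.7'
}
```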
And run:
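The exact task invocation was stripped; the task name below is an assumption taken from memory of the WarpFleet Gradle plugin — list the available tasks with `gradle tasks` before running:

```shell
# Install the Kafka plugin artifact via WarpFleet (task name is an assumption)
gradle wfInstall
```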
As the output says, we have to configure the plugin in /opt/warp10_3.0/warp10/etc/conf.d/99-io.warp10-warp10-plugin-kafka.conf; we will do that later.
The manual way
This is a less convenient way to install an extension or a plugin: plugins or extensions may need jar dependencies that you will have to handle yourself. Luckily, the Kafka plugin has no such dependencies.
Download the Kafka plugin jar into /opt/warp10_3.0/warp10/lib; you can find the link here.

Create an empty file named 99-io.warp10-warp10-plugin-kafka.conf in /opt/warp10_3.0/warp10/etc/conf.d:
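A sketch of the commands, keeping the warp10 UID used by the Docker image:

```shell
sudo touch /opt/warp10_3.0/warp10/etc/conf.d/99-io.warp10-warp10-plugin-kafka.conf
sudo chown 942:942 /opt/warp10_3.0/warp10/etc/conf.d/99-io.warp10-warp10-plugin-kafka.conf
```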
Kafka Plugin configuration
Create a new directory with the same user and rights as the others:
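This is the directory the plugin will scan for consumer specs; a sketch:

```shell
# Directory watched by the Kafka plugin for .mc2 consumer specs
sudo mkdir -p /opt/warp10_3.0/warp10/kafka
sudo chown 942:942 /opt/warp10_3.0/warp10/kafka
```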
Edit /opt/warp10_3.0/warp10/etc/conf.d/99-io.warp10-warp10-plugin-kafka.conf:
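The configuration content was stripped; the sketch below follows the warp10-plugin-kafka README as I recall it (the plugin class name and property keys should be verified there). Note that kafka.dir is the path as seen inside the container, which the volume maps to /opt/warp10_3.0/warp10/kafka on the host:

```
// Load the Kafka plugin (class name per the plugin README, to be verified)
warp10.plugin.kafka = io.warp10.plugins.kafka.KafkaWarp10Plugin
// Directory scanned for .mc2 Kafka consumer specs (in-container path)
kafka.dir = /opt/warp10/kafka
// Scan period in milliseconds
kafka.period = 1000
```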
The Kafka plugin installation is done!
Warp 10 Accelerator
The Warp 10 Accelerator is an option of the standalone version of Warp 10 that adds an in-memory cache to a Warp 10 instance. This cache covers a certain period of time and can be used to store data for ultra-fast access. Learn more about this feature.
Create and edit /opt/warp10_3.0/warp10/etc/conf.d/99-accelerator.conf:
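The file content was not preserved; as a sketch matching the "3 chunks of 1 hour" setup described below (property names per the Warp 10 accelerator documentation, and the chunk length unit — platform time units, microseconds by default — should be double-checked):

```
// Enable the in-memory accelerator
accelerator = true
// 3 chunks of 1 hour each (1 h = 3600000000 µs in the default time unit)
accelerator.chunk.count = 3
accelerator.chunk.length = 3600000000
```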
We create 3 chunks of 1 hour each.
The Accelerator configuration is done!
Warp 10 restart
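To load the new configuration and the plugin, restart the container (assuming it is named warp10):

```shell
docker restart warp10
```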
A dummy producer
We will create a small NodeJS script that will produce dummy random data.
In a directory (e.g. /home/me/workspace/kafka-producer), run:
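The setup commands were stripped; a sketch, assuming the kafkajs client (the original post may use a different Node Kafka library):

```shell
npm init -y
npm install kafkajs
```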
Then, create /home/me/workspace/kafka-producer/index.js:
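The script content was not preserved; below is a minimal sketch using the kafkajs client (the library choice, the topic name test, and the payload shape are assumptions). It sends one random temperature reading per second:

```javascript
// Dummy producer sketch. Assumptions: kafkajs client, topic 'test',
// JSON payload { ts, value } with ts in microseconds (Warp 10's default unit).
const BROKER = '192.168.1.1:9092';
const TOPIC = 'test';

// Generate one dummy temperature reading between 20 and 25.
function dummyReading() {
  return {
    ts: Date.now() * 1000, // ms -> µs
    value: Math.round((20 + Math.random() * 5) * 100) / 100,
  };
}

async function produce() {
  const { Kafka } = require('kafkajs'); // npm install kafkajs
  const kafka = new Kafka({ clientId: 'dummy-producer', brokers: [BROKER] });
  const producer = kafka.producer();
  await producer.connect();
  // Send one random reading per second, forever.
  setInterval(async () => {
    const reading = dummyReading();
    await producer.send({
      topic: TOPIC,
      messages: [{ value: JSON.stringify(reading) }],
    });
    console.log('sent', reading);
  }, 1000);
}

module.exports = { dummyReading, produce };
```

Add a `produce();` call at the bottom of the file to start it from the command line.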
And run it:
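Assuming the producer script is saved as index.js in the current directory:

```shell
node index.js
```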
Kafka consumer
In /opt/warp10_3.0/warp10/kafka, create a file named test.mc2 (customize it with your IP address):
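The spec content was stripped; the sketch below follows the consumer-spec map of the warp10-plugin-kafka README as I recall it (key names, the message map layout, and the topic name are assumptions to verify there). This first version only logs each message to warp10.log:

```
{
  'topics' [ 'test' ]           // topic name is an assumption
  'parallelism' 1
  'config' {
    'bootstrap.servers' '192.168.1.1:9092'
    'group.id' 'warp10.kafka'
  }
  // Macro invoked for each message; the message is assumed to be a map
  // whose 'value' entry holds the raw payload bytes.
  'macro' <%
    'message' STORE
    $message 'value' GET 'UTF-8' BYTES-> LOGMSG
  %>
  'timeout' 10000
}
```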
At this stage, you will see the following in /opt/warp10_3.0/warp10/logs/warp10.log:
Now, modify test.mc2 to convert our data to a GTS and insert it into Warp 10:
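A sketch of the modified macro (same assumptions as before on the spec layout; WRITE_TOKEN is a placeholder for the write token generated earlier, and the payload is assumed to be the { ts, value } JSON produced by the NodeJS script):

```
{
  'topics' [ 'test' ]
  'parallelism' 1
  'config' { 'bootstrap.servers' '192.168.1.1:9092' 'group.id' 'warp10.kafka' }
  'macro' <%
    'message' STORE
    // Decode the JSON payload
    $message 'value' GET 'UTF-8' BYTES-> JSON-> 'data' STORE
    // Build a GTS holding one datapoint and push it
    NEWGTS 'sensor.temperature' RENAME
    $data 'ts' GET NaN NaN NaN $data 'value' GET ADDVALUE
    'WRITE_TOKEN' UPDATE   // replace with your write token
  %>
  'timeout' 10000
}
```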
Open WarpStudio to test our data by fetching the last minute of data:
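The WarpScript snippet was stripped; a sketch (READ_TOKEN is a placeholder, and the class name matches the consumer sketch above only by assumption):

```
// Fetch the last minute of data
[ 'READ_TOKEN' 'sensor.temperature' {} NOW 1 m ] FETCH
// Report whether the fetch was served by the accelerator
ACCEL.REPORT
```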
You should have a chart like this:

As we are using the Warp 10 accelerator, fetching the last minute of data reads it from RAM, and ACCEL.REPORT should return "status":true,"accelerated":true.
A real-time dashboard displaying live data
Now, it is time to display our live data.
Building a dashboard in WarpStudio
Open WarpStudio, use the "discovery empty dashboard" snippet, and modify it to add a line chart (the "discovery-tile line, chart" snippet):
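The dashboard definition was not preserved; as a sketch of a Discovery dashboard with one auto-refreshing line tile (tile keys and the autoRefresh option follow the Discovery documentation as I recall it; READ_TOKEN and the class name are placeholders):

```
{
  'title' 'Kafka live data'
  'tiles' [
    {
      'type' 'line'
      'x' 0 'y' 0 'w' 12 'h' 4
      'options' { 'autoRefresh' 2 }   // refresh every 2 seconds
      'macro' <%
        [ 'READ_TOKEN' 'sensor.temperature' {} NOW 1 m ] FETCH
      %>
    }
  ]
}
```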
Execute it and open the "Discovery" tab. You should see something like this:

The Web version of the dashboard
In your workspace, create a file named index.html:
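The page content was stripped; the sketch below embeds the same dashboard definition in the Discovery web component (the CDN URL for @senx/discovery-widgets and the component attributes are assumptions to check against the Discovery documentation):

```html
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <!-- Discovery web components (CDN URL is an assumption) -->
  <script type="module"
          src="https://unpkg.com/@senx/discovery-widgets/dist/discovery/discovery.esm.js"></script>
</head>
<body>
<!-- The dashboard definition is executed against the Warp 10 /exec endpoint -->
<discovery-dashboard url="http://192.168.1.1:8080/api/v0/exec">
{
  'title' 'Kafka live data'
  'tiles' [ {
    'type' 'line' 'x' 0 'y' 0 'w' 12 'h' 4
    'options' { 'autoRefresh' 2 }
    'macro' <% [ 'READ_TOKEN' 'sensor.temperature' {} NOW 1 m ] FETCH %>
  } ]
}
</discovery-dashboard>
</body>
</html>
```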
Open it in your favorite browser to see your live data.

Going further
You can customize your Kafka handler by increasing the parallelism.
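For instance, in the consumer spec in test.mc2 (key name per the Kafka plugin README, to be verified):

```
// Consume the topic with several parallel consumers instead of one
'parallelism' 4
```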
You can also pimp your dashboard:

Here is a complete sample.
Read more
Industry of the future: data on the critical path - Part 3
Edge computing: Build your own IoT Platform
Health data analysis made easy with Warp 10

Senior Software Engineer