
Harnessing the Power of HiveMQ Cloud and Confluent Cloud

by Magi Erber
20 min read

In the world of real-time data processing and messaging systems, MQTT and Kafka have emerged as two powerful technologies that work exceptionally well together, especially as deployments scale. MQTT, a lightweight and efficient protocol designed for constrained devices and unreliable networks, finds its perfect counterpart in Kafka, a distributed streaming platform built for high-throughput and fault-tolerant data processing.

By choosing HiveMQ Cloud for MQTT and Confluent Cloud for Kafka, organizations gain access to feature-rich platforms that optimize MQTT communication and enhance Kafka deployments for mission-critical applications. This powerful combination enables seamless integration between MQTT-based IoT devices and Kafka’s stream processing capabilities, unlocking the full potential of IoT infrastructure for scalable and future-ready solutions.

This post will focus on how to get started quickly and easily to integrate IoT data using HiveMQ Cloud and Confluent Cloud. While the use cases are varied, we’ll consider a smart home environment as an example, where various IoT devices such as temperature sensors, motion detectors, and smart thermostats are deployed throughout a home. The goal is to collect real-time data from these smart home devices, process it, and take appropriate actions based on the data. In this example, MQTT and Kafka work together to enable the collection, processing, and actioning of IoT data in real-time. MQTT efficiently captures the data from devices, while Kafka ensures reliable and scalable stream processing, enabling smarter and more automated systems in the context of a smart home or any IoT-driven environment.

Connecting HiveMQ Cloud with Confluent Cloud for MQTT and Kafka Integration

HiveMQ Cloud integrates easily and seamlessly with Confluent Cloud. A simple configuration enables efficient data streaming between HiveMQ Cloud and Confluent Cloud clusters for bidirectional message exchange without the operational burden.

For this example, we are using the HiveMQ Cloud (Starter Plan) to connect a HiveMQ Cloud cluster with a Confluent Cloud Basic Cluster. If you already have an existing account for both services, you can skip ahead.

Setting up a Confluent Cloud Cluster

If you do not yet have an account with Confluent Cloud, visit confluent.io and sign up for Confluent Cloud. New signups receive $400 to spend during their first 30 days.

1. Create a Confluent Cloud cluster:

Once signed up, create your first Confluent Cloud cluster (choose the plan that suits you best) and follow the instructions provided by Confluent Cloud. The default options are sufficient for this example.

2. Generate an API key:

After you have created your first Confluent Cloud cluster, navigate inside the cluster context to API Keys and select Create Key. Generate an API key with global access and store the generated key in a safe place. This API key will be used to authenticate your HiveMQ Cloud broker with your Confluent Cloud cluster and enable the data flow between the two.

Create API Key

3. Define the Kafka topic:

Next, we need to create a Kafka topic to store all the data produced by the MQTT devices. To do this, select Topics from the side navigation and create a topic with the default settings. For this example, we simply changed the default name topic_0 to device_data. You can skip the creation of a schema for this tutorial.

Define Kafka Topic

Select the topic you just created and move to the Messages tab to see any new incoming messages. Keep the browser (tab) open, so you can return later to see any new messages arriving.

Messages Arriving

Now your Confluent Cloud cluster is set up and ready for data ingestion.

Setting up HiveMQ Cloud

4. Create a HiveMQ Cloud Starter cluster:

You will also need to create a cluster with HiveMQ Cloud. Visit https://console.hivemq.cloud and select the Starter Plan from the list. Follow the cluster creation wizard and start the cluster.

Now that your HiveMQ Cloud cluster is created, we need to enable the integration with Confluent Cloud. To do this, move into the cluster management view by selecting Manage Cluster from the overview page.

Manage Clusters

5. Select the Confluent Cloud integration:

To enable the integration, select the Confluent Cloud integration from the Integrations tab in the top menu.

Select Confluent Cloud Integration

Selecting Configure reveals the configuration options of the integration. They fall into two distinct areas:

  • Connection configuration parameters

  • Topic mapping parameters

6. Configure the connection to Confluent Cloud:

To establish a connection between the two services we need only two configuration parameters from Confluent: the bootstrap servers and the API key that we have created earlier. This information is shared by Confluent in the file you downloaded while creating the API key. The file looks similar to this example:

Configure the connection to Confluent Cloud

Open this file and copy the bootstrap server information from it. Make sure you copy the entire string in the host:port format and paste it into the corresponding input field. Next, copy and paste the API key and the API secret from your file into the corresponding input fields to secure the connection between HiveMQ Cloud and Confluent Cloud.

Connection between HiveMQ Cloud and Confluent Cloud

7. Establish the topic mapping:

Now that the clusters know each other, we need to map the MQTT topic(s) to the Kafka topic. Below the connection configuration, you find the topic mappings for both directions: HiveMQ to Confluent and Confluent to HiveMQ. You can configure both directions, but you must configure at least one.

Let’s start with gathering the sensor data from our smart home devices, which moves data from HiveMQ to Confluent.

For the topic mapping, define an MQTT topic filter in the Source Topic input field for the MQTT topics whose messages you want to forward to your Kafka cluster. The integration forwards any message published to an MQTT topic matching this filter. You can only specify one topic filter, but wildcards are supported. For easy demonstration, we are using the topic filter livingroom/+/temperature to forward all MQTT messages from temperature-measuring devices in the living room. The single-level wildcard + matches exactly one topic level, so this filter matches all topics that start with livingroom and end with temperature, letting us simultaneously capture all temperature sensors in the living room, e.g. livingroom/sensor-a7jd8nfb/temperature. You can learn more about MQTT wildcards in this article.
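To make the wildcard semantics concrete, here is a minimal, simplified sketch in Python of how an MQTT topic filter with + and # wildcards matches a concrete topic (real brokers handle additional edge cases, such as $-prefixed system topics):

```python
def matches_filter(topic_filter: str, topic: str) -> bool:
    """Check whether an MQTT topic matches a topic filter.

    Supports the single-level (+) and multi-level (#) wildcards.
    A simplified illustration of the matching rules, not a full
    implementation of the MQTT specification.
    """
    filter_levels = topic_filter.split("/")
    topic_levels = topic.split("/")
    for i, level in enumerate(filter_levels):
        if level == "#":
            # The multi-level wildcard matches everything from here on.
            return True
        if i >= len(topic_levels):
            # The topic has fewer levels than the filter.
            return False
        if level not in ("+", topic_levels[i]):
            # A literal level must match exactly; + matches any one level.
            return False
    # All levels matched; the topic must not have extra trailing levels.
    return len(filter_levels) == len(topic_levels)

print(matches_filter("livingroom/+/temperature",
                     "livingroom/sensor-a7jd8nfb/temperature"))  # True
print(matches_filter("livingroom/+/temperature",
                     "kitchen/sensor-1/temperature"))            # False
```

Note that livingroom/+/temperature does not match the two-level topic livingroom/temperature: the + wildcard requires exactly one level in its position.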

Note: You can also use the multi-level wildcard (#) as the topic filter to collect all MQTT messages published to HiveMQ. Please note that the HiveMQ team publishes some measurement messages to monitor the health of your cluster; because every published MQTT topic matches this wildcard, those messages will also be forwarded to Confluent.

Now let's define the destination of the data on the Confluent Cloud cluster.

Enter the Kafka topic you just created with Confluent Cloud in the destination topic input field. In our case, this is device_data.

Topic Mapping

8. Enable the integration:

Messages published to matching MQTT topics on HiveMQ Cloud will now be forwarded to your Confluent Cloud Kafka cluster. Enable the integration by selecting Enable at the bottom of the configuration page. It can take a couple of seconds until the connection is established.

Publish IoT Device Data

Now that both systems are set up, we can start publishing data from the smart home devices. For this example, we are using the HiveMQ Web Client to generate example sensor data, but you can use any other debugging tool or a real device.

Client Connection Settings

9. Connect your Smart Home devices

The HiveMQ Cloud Web Client can be found in the top navigation bar of the cluster under Web Client. Connect the Web Client with generated credentials. This option automatically creates random credentials that the client uses to connect to the HiveMQ Cloud broker. If you want to connect another client, you first need to create access credentials under the Access Management tab in the top navigation bar; the Web Client takes care of this step automatically.

10. Publish an MQTT message

Now specify the parameters for an MQTT message of an example temperature sensor in the living room of your smart home.

Define the topic you want to publish the message to. In this example, we use livingroom/sensor-a7jd8nfb/temperature, which matches the livingroom/+/temperature filter configured earlier. The payload of the message contains a couple of details like the sensor_id, a timestamp, and the measured temperature. Copy and paste this example into the message field and publish the message.

{
  "sensor_id": "12345",
  "timestamp": 1654312345,
  "temperature": 25.5,
  "unit": "Celsius"
}

Publish Message

Now the message is sent from the Web Client (or in your case maybe the IoT device) to HiveMQ Cloud and forwarded through the Confluent Cloud integration to the Kafka cluster running on Confluent Cloud.
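If your device is a program rather than the Web Client, the same publish could be sketched in Python. The payload format matches the example above; the broker hostname, port, and credentials are placeholders you would replace with your HiveMQ Cloud connection settings, and the example assumes the Eclipse paho-mqtt client (version 2.x) is installed:

```python
import json
import time


def build_reading(sensor_id: str, temperature: float, unit: str = "Celsius") -> str:
    """Serialize a temperature reading in the payload format shown above."""
    return json.dumps({
        "sensor_id": sensor_id,
        "timestamp": int(time.time()),
        "temperature": temperature,
        "unit": unit,
    })


if __name__ == "__main__":
    # Publishing with the Eclipse paho-mqtt client (pip install paho-mqtt).
    # Hostname and credentials are placeholders -- use the values from your
    # HiveMQ Cloud cluster's connection settings and access management.
    import paho.mqtt.client as mqtt

    client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
    client.username_pw_set("<username>", "<password>")
    client.tls_set()  # HiveMQ Cloud requires TLS; uses the system CA store
    client.connect("<your-cluster>.hivemq.cloud", 8883)
    client.loop_start()
    client.publish("livingroom/sensor-a7jd8nfb/temperature",
                   build_reading("12345", 25.5), qos=1)
    client.loop_stop()
    client.disconnect()
```

QoS 1 is used here so the broker acknowledges receipt before the client considers the reading delivered.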

11. View the messages arriving on Confluent Cloud:

To see the same message now arriving in Confluent Cloud, move back to the topics view of your Confluent Cloud cluster that we left open after configuring it. The message will already be displayed in this view. You can now send more data via the Web Client and watch it arrive there.

From here, the data is integrated with your Confluent Cloud cluster and you can leverage all the stream processing services offered by Kafka, e.g. ksqlDB.

Send Controls Back to IoT Devices

Now that you have gathered data from your IoT devices and ingested it into Confluent Cloud, you might want to send some controls back to the IoT devices, for example to change the room temperature.

12. Create a new topic on Confluent:

Move back to Confluent and create a new topic with the default settings. You can name the topic whatever you want; we use thermal_control in this case. Open the Messages tab again and switch to HiveMQ Cloud.

New Topic on Confluent Cloud

13. Update Confluent Cloud integration configuration:

In the Confluent Cloud integration's topic mapping configuration, switch the direction from HiveMQ to Confluent to Confluent to HiveMQ and enter the Confluent Cloud Kafka topic you just created. Fill out the destination topic input field with the MQTT topic where the controls for the IoT device should be published. As an example, we are using livingroom/heatings here.

Topic Mapping on Confluent

14. Subscribe to the commands topic with your IoT device:

You can use the Web Client again as an example IoT device, but of course you can follow these steps with any device or tool you like. Switch to the Web Client and subscribe to all topics. Keep the tab open to watch the arriving messages.

Topic Subscriptions
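For a programmatic IoT device, subscribing to the commands topic and reacting to incoming messages might look like the following sketch. It again assumes the paho-mqtt client (version 2.x); hostname, credentials, and the simple setpoint logic are illustrative placeholders, not part of the HiveMQ or Confluent APIs:

```python
import json


def apply_command(current_setpoint: float, payload: bytes) -> float:
    """Apply an 'increase' command payload to a thermostat setpoint.

    Illustrative command handling only; unknown actions are ignored.
    """
    command = json.loads(payload)
    if command.get("action") == "increase":
        return current_setpoint + float(command.get("increment", 0))
    return current_setpoint


if __name__ == "__main__":
    # Subscribing with paho-mqtt (pip install paho-mqtt). Hostname and
    # credentials are placeholders for your HiveMQ Cloud settings.
    import paho.mqtt.client as mqtt

    setpoint = 21.0

    def on_message(client, userdata, msg):
        global setpoint
        setpoint = apply_command(setpoint, msg.payload)
        print(f"{msg.topic}: setpoint is now {setpoint}")

    client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
    client.username_pw_set("<username>", "<password>")
    client.tls_set()
    client.on_message = on_message
    client.connect("<your-cluster>.hivemq.cloud", 8883)
    client.subscribe("livingroom/heatings")
    client.loop_forever()
```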

15. Produce a command that should be forwarded to your IoT device:

Let’s produce a command from a backend service that is delivered via Confluent and HiveMQ to your IoT devices. We can use Confluent’s test data generator to demonstrate this message flow; it is somewhat hidden at the top of the Messages view.

Produce Command

You can produce data with the suggested payload, or define a payload that reflects your use case. We are using this data:

{
  "sensor_id": "12345",
  "timestamp": 1654312345,
  "temperature": 25.5,
  "unit": "Celsius",
  "action": "increase",
  "increment": 2.5
}

Select Produce and see how the message is published on the Kafka topic.
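Instead of the browser-based data generator, a backend service could produce the same command programmatically. A sketch assuming the confluent-kafka Python client, with bootstrap servers and API key/secret as placeholders taken from the file downloaded during API key creation (the payload here is abbreviated from the example above):

```python
import json


def build_command(sensor_id: str, action: str, increment: float) -> str:
    """Serialize a thermostat command (abbreviated example payload)."""
    return json.dumps({
        "sensor_id": sensor_id,
        "action": action,
        "increment": increment,
    })


if __name__ == "__main__":
    # Producing to Confluent Cloud with the confluent-kafka client
    # (pip install confluent-kafka). Bootstrap servers and API key/secret
    # are placeholders -- use the values from your downloaded key file.
    from confluent_kafka import Producer

    producer = Producer({
        "bootstrap.servers": "<host>:<port>",
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",
        "sasl.username": "<api-key>",
        "sasl.password": "<api-secret>",
    })
    producer.produce("thermal_control",
                     value=build_command("12345", "increase", 2.5))
    producer.flush()  # block until the message is delivered
```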

16. See the message arriving at your IoT device:

You can double-check that the message arrived on your IoT devices by moving back to HiveMQ Cloud and searching for the message in the Web Client.

We have explored the powerful combination of MQTT and Kafka, demonstrating how they can work together to create a robust and scalable data processing pipeline. By leveraging MQTT’s lightweight and efficient data collection capabilities and Kafka’s high-throughput, fault-tolerant streaming capabilities, we have unlocked new possibilities for handling real-time data streams.

With HiveMQ Cloud and Confluent Cloud working together, you can implement a wide range of use cases, including IoT applications, smart home solutions, industrial automation, real-time analytics, and more. The solution seamlessly integrates MQTT and Kafka for efficient data ingestion, reliable streaming, scalability, and seamless integration with the broader data processing ecosystem.

Embrace the power of MQTT and Kafka to drive innovation, transform your data processing workflows, and propel your organization into the realm of real-time data management. Get started and test out your use case.

Ready to elevate your MQTT experience? Try HiveMQ Cloud Starter for 15 days absolutely free and get dedicated resources, 99.95% uptime, and round-the-clock support.


Magi Erber

Magi Erber is Senior Product Manager at HiveMQ. She loves creating software that delights customers and helps them realize innovative IoT solutions.

