Connect HiveMQ to Azure Event Hubs

by Matthias Hofschen
15 min read

This blog post focuses on how to forward MQTT messages from your HiveMQ cluster to the Azure Event Hubs ingestion service. As an Azure cloud customer, this allows you to use a 100% MQTT-compliant message broker and leverage Azure data services for streaming, analysis, and storage of messages.

Introduction

IoT architectures regularly require complex setups to accomplish their goals and deliver the intended business benefits. Industrial Internet of Things (IIoT) use cases that gather device and sensor data from geographically distributed factories are a good example. One critical component is an MQTT-compliant message broker like HiveMQ that can be installed in the factory as well as in regional data centers and the cloud. Connected devices and sensors publish (and receive) messages via HiveMQ. HiveMQ can forward these messages to a message queue such as Azure Event Hubs. Stream processors then act as consumers of the message queue, storing and analyzing messages either for real-time applications or business intelligence (BI) analysis. IIoT use cases and applications often center around:

  • production visibility and monitoring

  • automated data collection

  • increased machine utilization

  • shorter improvement cycle times

  • facility management

  • optimization of the supply chain

You can find more information on MQTT and IIoT here.

Azure Event Hubs

Azure Event Hubs is offered as a fully managed service and comes in different pricing tiers. Azure customers can use the Event Hubs service to integrate complex analyses and storage of MQTT messages into their Azure Cloud architectures. In this case, the Event Hubs service acts as a message queue for big data streaming and event ingestion. Conceptually, Azure Event Hubs takes a similar role as Apache Kafka. Both technologies are distributed partitioned commit logs that buffer messages for a limited time and make them available through consumer APIs for downstream applications. For example, applications such as stream processors consume messages from the queue and provide them as input to machine learning models for predictive maintenance. The queue becomes a central element in the architecture from which messages are distributed to various backend applications.

Azure Event Hubs (Standard pricing tier and above) provides an endpoint that is compatible with the Apache Kafka producer and consumer APIs (version 1.0 and above). An Azure Event Hubs namespace is comparable to a Kafka cluster. Within a namespace, a limited number of Event Hub queues can be created. One Event Hub queue is comparable to a Kafka topic. Each Event Hub queue is further divided into partitions with messages ordered per partition.
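Because the endpoint speaks the Kafka protocol, any standard Kafka client can read from an Event Hub queue. As a rough sketch (the namespace, queue name, and connection string below are placeholders, and a local Apache Kafka distribution is assumed for the console consumer), a consumer can be pointed at the Kafka endpoint of Event Hubs like this:

# client.properties for the Event Hubs Kafka endpoint (placeholder values)
cat > client.properties <<'EOF'
bootstrap.servers=EVENT-HUB-NAMESPACE.servicebus.windows.net:9093
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="<your connection string-primary key>";
EOF

# Consume from the Event Hub queue with the stock Kafka console consumer
bin/kafka-console-consumer.sh \
  --bootstrap-server EVENT-HUB-NAMESPACE.servicebus.windows.net:9093 \
  --topic EVENT-HUB-QUEUE-NAME \
  --consumer.config client.properties \
  --from-beginning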

Depending on the pricing tier, different quotas and limits apply to the usage and throughput for messages. Here are some of the quotas and limitations that may be important for your use cases and applications:

  • Message size is limited to 1 MB

  • Number of Event Hub queues per namespace is limited to 10 (Standard tier)

  • Maximum message retention period is 7 days (Standard tier)

  • The total throughput is limited by the number of configured throughput units (TUs): one TU allows roughly 1 MB or 1,000 messages per second of ingress, so the maximum of 40 TUs corresponds to 40,000 messages or 40 MB per second. Higher pricing tiers allow more throughput.

Check the current quotas and limits of Azure Event Hubs for more information.

HiveMQ and the Enterprise Extension for Kafka

HiveMQ is a 100% MQTT specification-compatible message broker. Through the unique and powerful enterprise extension system, HiveMQ offers modular integrations with other services such as Apache Kafka clusters. The Kafka integration is accomplished through the HiveMQ Enterprise Extension for Kafka. The extension can be configured for bidirectional message communication with Kafka queues and Kafka-compatible queues. Azure Event Hubs provides a compatibility layer for Kafka consumers and producers.

Let’s explore how to integrate a HiveMQ cluster with the Azure Event Hubs service using the HiveMQ Enterprise Extension for Kafka.

Prerequisites

  • Microsoft Azure account

  • HiveMQ license (optional). If you don’t have a valid HiveMQ license, the HiveMQ cluster you deploy automatically uses a trial version that is restricted to non-production use and 25 concurrent connections. To test and evaluate HiveMQ with more connections, contact HiveMQ.

  • HiveMQ Enterprise Extension for Kafka license (optional). The HiveMQ cluster includes the Extension for Kafka. If you don’t have a valid license, the extension can be used as a trial version for up to 5 hours, after which a full restart is required to start a new 5-hour period.

HiveMQ Installation

In this blog post, we download and install the HiveMQ platform locally.

  • Download the HiveMQ platform and unzip the downloaded file

  • Start the HiveMQ broker with the appropriate startup script for your operating system in the <HIVEMQ_HOME>/bin directory, for example:

bin/run.sh
  • The complete instructions can be found here.
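In condensed form, the steps above look roughly like this; the download URL and version are placeholders, so use the link from the HiveMQ downloads page:

# Download and unzip the HiveMQ platform (replace with the current release URL and version)
curl -LO https://www.hivemq.com/releases/hivemq-<version>.zip
unzip hivemq-<version>.zip
cd hivemq-<version>

# Start the broker (Linux/macOS startup script)
bin/run.sh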

Naturally, you can install HiveMQ on a wide variety of platforms and infrastructures. Check the HiveMQ blog posts or the HiveMQ documentation for platform-specific installation instructions.

HiveMQ Kafka Extension Configuration

By default, the HiveMQ platform download contains the HiveMQ enterprise extensions and several tools. You can enable and test the trial version of the HiveMQ Enterprise Extension for Kafka for 5 hours without a restart.

  • To enable the extension, remove the DISABLED file in the <HIVEMQ_HOME>/extensions/hivemq-kafka-extension directory

  • Rename the kafka-configuration-example.xml file to kafka-configuration.xml

  • Edit the kafka-configuration.xml file to match the following example. Take note of the three uppercase placeholders. In the next section, we will replace these placeholders with variables obtained during your Azure Event Hubs installation.

<?xml version="1.0" encoding="UTF-8" ?>
<kafka-configuration xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
                     xsi:noNamespaceSchemaLocation="kafka-extension.xsd">
    <kafka-clusters>
        <kafka-cluster>
            <id>event-hubs-service</id>
            <bootstrap-servers>
                EVENT-HUB-NAMESPACE.servicebus.windows.net:9093
            </bootstrap-servers>
            <authentication>
                <plain>
                    <username>$ConnectionString</username>
                    <password>SAS_CONNECTION_STRING-PRIMARY_KEY</password>
                </plain>
            </authentication>
            <tls>
                <enabled>true</enabled>
            </tls>
        </kafka-cluster>
    </kafka-clusters>
    <!-- Forward published messages that match the 'test/blogpost/#' topic filter to Event Hubs -->
    <mqtt-to-kafka-mappings>
        <mqtt-to-kafka-mapping>
            <id>to-event-hub-queue</id>
            <cluster-id>event-hubs-service</cluster-id>
            <mqtt-topic-filters>
                <mqtt-topic-filter>test/blogpost/#</mqtt-topic-filter>
            </mqtt-topic-filters>
            <kafka-topic>EVENT-HUB-QUEUE-NAME</kafka-topic>
        </mqtt-to-kafka-mapping>
    </mqtt-to-kafka-mappings>
</kafka-configuration>

Configure Azure Event Hubs

In this section, we set up Azure Event Hubs in the portal and collect the values that replace the placeholders in the Kafka configuration file (an Azure CLI alternative is sketched after these steps).

  • Log into your Azure account and create a resource group for your project

  • Within the resource group, click Create and then search for and select Event Hubs

  • Click Create Event Hubs. On the next page, enter a name for your Event Hubs namespace and select the Standard pricing tier

  • After your namespace is successfully created, save the namespace name for later use in the Kafka Extension configuration file. Now, select your new Event Hubs namespace

  • Next, select Event Hubs and create a new Event Hub queue by clicking the [+] button at the top of the screen

  • Enter a name for your Event Hub queue and define the number of partitions as well as the message retention time that your use-case requires

  • Save the name of your Event Hub queue for later use in your Kafka Extension configuration file

  • Return to the Event Hubs namespace overview and select Shared access policies

  • Create a new shared access policy for your HiveMQ cluster. Select the permission Manage, which also includes send and listen permissions

  • Select your newly-created policy and copy the parameter for Connection string-primary key

  • Save the Connection string-primary key value for later use in your Kafka Extension configuration file as the password
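If you prefer the command line over the portal, the same resources can be created with the Azure CLI. The resource names below are examples, and some flag names vary between CLI versions, so treat this as a sketch rather than a definitive script:

# Create a resource group, Event Hubs namespace, Event Hub queue, and a SAS policy (example names)
az group create --name hivemq-demo-rg --location westeurope
az eventhubs namespace create --resource-group hivemq-demo-rg --name my-hivemq-namespace --sku Standard
az eventhubs eventhub create --resource-group hivemq-demo-rg --namespace-name my-hivemq-namespace \
  --name my-event-hub --partition-count 4 --message-retention 1
az eventhubs namespace authorization-rule create --resource-group hivemq-demo-rg \
  --namespace-name my-hivemq-namespace --name hivemq-policy --rights Manage Send Listen

# Print the connection string-primary key for the new policy
az eventhubs namespace authorization-rule keys list --resource-group hivemq-demo-rg \
  --namespace-name my-hivemq-namespace --name hivemq-policy \
  --query primaryConnectionString --output tsv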

Finish the Installation

To complete the Kafka extension configuration, substitute the three uppercase variables in your kafka-configuration.xml file with the collected values (a filled-in example follows the list):

  • Replace the EVENT-HUB-NAMESPACE with your Event Hub namespace name.

  • Replace the SAS_CONNECTION_STRING-PRIMARY_KEY with your Connection string-primary key.

  • Replace the EVENT-HUB-QUEUE-NAME with your Event Hub queue name.
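With example values filled in (the namespace my-hivemq-namespace, policy hivemq-policy, and queue my-event-hub are hypothetical names matching the earlier steps), the relevant parts of kafka-configuration.xml would look like this. Note that the username remains the literal string $ConnectionString; only the password and names change:

<bootstrap-servers>my-hivemq-namespace.servicebus.windows.net:9093</bootstrap-servers>

<username>$ConnectionString</username>
<password>Endpoint=sb://my-hivemq-namespace.servicebus.windows.net/;SharedAccessKeyName=hivemq-policy;SharedAccessKey=...</password>

<kafka-topic>my-event-hub</kafka-topic>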

Start Your HiveMQ Cluster

  • Start your HiveMQ broker, for example with:

bin/run.sh
  • Check the <HIVEMQ_HOME>/log/hivemq.log file for error messages from the Kafka extension and make sure that the extension loads successfully.

  • Browse to the HiveMQ Control Center at http://localhost:8080 (default credentials: user admin, password hivemq) and check that the Kafka extension dashboard displays a green checkmark.
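A quick way to perform the log check from the steps above is to follow the log file and filter for the extension; the exact wording of the log messages may differ between HiveMQ versions:

# Watch the HiveMQ log for Kafka extension start-up and error messages
tail -f log/hivemq.log | grep -i kafka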

Test Your Installation

Do a quick functional test with the open-source MQTT CLI tool that is included in your HiveMQ platform download. The directory <HIVEMQ_HOME>/tools/mqtt-cli contains the command-line tool that can be used as an MQTT client to test your HiveMQ broker and the connection to Azure Event Hubs. For more information, see the MQTT CLI documentation.

  • Start the MQTT CLI

bin/mqtt sh 
  • Connect to the local HiveMQ broker via the standard MQTT port 1883

con -i my-publisher 
  • Publish messages using the configured MQTT topic filter that forwards messages to your Event Hub queue

pub -t test/blogpost/test-publish -m 'Hello from HiveMQ'

  • Check the Azure Event Hubs overview page and observe that messages are received from HiveMQ.

You can also check the status and success of transferring messages to Event Hubs in the HiveMQ Control Center.
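The MQTT CLI can also publish non-interactively, which makes it easy to script a small burst of test messages. A minimal sketch, run from <HIVEMQ_HOME>/tools/mqtt-cli, using example topic names that match the configured filter:

# Publish ten test messages without the interactive shell
for i in $(seq 1 10); do
  bin/mqtt pub -h localhost -p 1883 -t test/blogpost/device-$i -m "test message $i"
done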

Considerations

Keep in mind that all Azure resources you create will be charged to your Azure account. Don’t forget to clean them up. The easiest way to delete all created resources is to delete the Azure resource group that you created in the first step. This should remove all associated resources.

Your message throughput depends on your Azure Event Hubs configuration and pricing tier, the number of HiveMQ cluster nodes you deploy, and the number of partitions you create for each of the Event Hub queues.

Summary

In this post, we forwarded MQTT messages from a HiveMQ broker to Azure Event Hubs by configuring the HiveMQ Enterprise Extension for Kafka against the Kafka-compatible endpoint of Event Hubs, and verified the setup with the MQTT CLI. We would love to hear from you. Reach out to us if we can help!

Matthias Hofschen

Matthias Hofschen is Engineering Manager at HiveMQ.

  • Contact Matthias Hofschen via e-mail
