
Transforming Factory Operations with Digital Twins Powered by MQTT and UNS

by Kelly Watt
27 min read

Digital transformation in the industrial world is often described with technical terms like "digital factory," "Industry 4.0," "industrial digital twins," "MQTT brokers/data streaming," and "unified namespace (UNS)" to represent complex concepts. However, those speaking about digital transformation often fail to define these concepts in simple terms, explain how they differ, and demonstrate how they are used in various industrial settings. As a digital twin consultant who has helped many companies transform, I would like to briefly explain these terms and their relationship to real-time data, which enables high-quality decision-making when digital transformation is executed well.

The Digital Factory

The concept of a digital factory represents a transformative evolution in manufacturing processes. At its essence, the digital factory eliminates the dependency on paper-based systems, enabling machines and systems to communicate seamlessly with one another as well as with business applications and processes. This shift facilitates real-time decision-making, ensuring that responses to rapidly changing production needs are both timely and accurate. By integrating historical data with current metrics, the digital factory can anticipate future trends, address challenges proactively, and optimize operations in unprecedented ways. Wherever paper and manual processes persist, they undermine the promise of the digital factory.

Industry 4.0 technologies, such as automation, smart sensors, and advanced analytics, are at the core of the digital factory. They use real-time information to make decisions, enhance predictive capabilities, and enable autonomous operations. This information is typically gathered from sensor readings and control logic. These technologies empower businesses to stay ahead of market trends, create new revenue streams, and revolutionize processes by integrating and utilizing real-time system data. It's important to note that Industry 4.0 implementations don't depend on spatial contextualization the way a digital twin does; they are not digital factories themselves but rather a component of both.

Figure: The Digital Factory and Industry 4.0

Industrial Digital Twins: The Core of Modern Manufacturing

Much like a digital factory, the industrial digital twin depends heavily on converting paper-based systems into digital formats and integrating different business applications and processes. It also uses Industry 4.0 technologies, real-time and historical data, and automation intelligence and control logic. Industrial Digital Twins represent the current state of an asset, process, or organizational/operational system, such as production lines, HVAC systems, data centers, or physical equipment. Sensors placed on these physical assets gather data on performance, energy output, and system health, which is then processed and applied to a digital counterpart, the twin. What makes the industrial digital twin unique is that 3D and 2D visualization provide spatial context and real-world relationships to unified data, which enhances the decision-making process.

Figure: Industrial Digital Twins (Image source: https://www.digitaltwinsim.com/)

Likewise, the digital twin seeks to improve various business processes, which often rely on institutional knowledge or are documented on paper. As employees holding that institutional knowledge retire, digital processes must encapsulate their expertise, offering methods that allow for quick data analysis and decision-making. The loss of this knowledge can weaken an organization's understanding of past performance and undermine efforts to innovate in the future.

Overcoming Interoperability Challenges with Unified Namespace (UNS)

For the digital factory or digital twin to be effective, it is essential to integrate spatial, live, historical, and static data into a comprehensive virtual model. This is where a common namespace or data model becomes valuable. A common data model, such as a Unified Namespace (UNS), ensures that data is synchronized, providing snapshots in time and historical context. 

UNS is a data framework that helps overcome the challenges faced with traditional ISA-95 model architectures by providing a single source of truth for all data and information from manufacturing operations. The UNS provides:

  • The place where the current state of the industrial operations lives

  • The hub through which the smart assets in the company’s business communicate

  • The architectural foundation of the company’s Industry 4.0 and digital twins use cases

A UNS takes manufacturing operational technology (OT) data, typically organized in the ISA-95 hierarchy of Enterprise -> Site -> Area -> Line -> Cell, into a more open architecture where the OT data is combined with information technology (IT) data. The result is a common data layer for the whole enterprise, contextualized for advanced use cases like digital twins.
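To make the topic hierarchy concrete, below is a minimal sketch of publishing one sensor reading into a UNS-style MQTT namespace, using the Python paho-mqtt client (version 2.x). The broker address and every segment of the topic are hypothetical; they simply show the ISA-95 path encoded in the topic itself.

```python
# Minimal sketch, assuming paho-mqtt 2.x and a reachable broker.
# All names (broker, enterprise, site, area, line, cell) are hypothetical.
import json
import time

import paho.mqtt.client as mqtt

# ISA-95-style path encoded as an MQTT topic:
# enterprise / site / area / line / cell / metric
TOPIC = "acme/dallas-plant/packaging/line-4/filler-cell/temperature"

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.connect("broker.example.com", 1883)
client.loop_start()

# Published retained at QoS 1 so late subscribers immediately receive the
# current state (the "place where the current state lives" in UNS terms).
payload = json.dumps({"value": 71.3, "unit": "degF", "timestamp": time.time()})
client.publish(TOPIC, payload, qos=1, retain=True)

client.loop_stop()
client.disconnect()
```

Because the hierarchy is carried in the topic itself, any consumer can subscribe to exactly the slice of the enterprise it needs, from a single cell to an entire site.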

This unified approach enables a deeper understanding of how different elements within the factory impact each other.

Consider the manufacturing execution systems (MES) used for tracking and documenting the transformation of raw materials into finished products in real time. Additionally, consider the enterprise resource planning (ERP) software that contains critical data for measuring overall production, management, and operational productivity. Now consider the complexity that arises when these separate systems are not synchronized in real time. How can one quickly obtain a snapshot in time that clearly and accurately represents what is happening across factory operations?

Without a common data model or UNS, organizing and quickly finding information becomes nearly impossible, especially when making critical maintenance decisions, including planning, resourcing, and scheduling.

The same is true for applying classification standards to the product lifecycle management (PLM) model or 3D BIM model, which may include design elements, asset information, system relationships, labels, resources, costs, warranties, supporting documents, procedures, specifications, parts lists, and performance and maintenance requirements, to name a few. Although these are static data sources, wasting time searching for them, or operating without assimilating them, can significantly undermine good decision-making.

Utilizing Ontologies and Semantic Models for Data Orchestration 

Establishing ontologies and semantic models creates standard definitions of instances or entities (objects/assets), classes, attributes, and relationships representing systems and events. In simpler terms, these ontologies and semantic models provide the standard relationship of components in a system and how they operate interdependently. These ontologies and semantic models form the glue for intelligent operations, especially when running machine learning models, artificial intelligence, or simulations.

Regardless of your data source (static, live, historical, or operational), the data must be contextualized to drive additional value. Take, for instance, a single temperature sensor: the reading is valuable on its own, but it becomes significantly more powerful when you understand the key relationships around that single data point, including:

  • The operational environment

  • The historical performance (e.g., a progressively worsening issue)

  • The operating picture (MES) showing in real-time upstream or downstream inputs/outputs

  • Recent work completed on the monitored system documented in the Computerized Maintenance Management System (CMMS)

  • The resources required to address the issue long-term (ERP), such as capital investment, planning, and prioritization

In the typical factory, these data points are all in different business systems, and assembling the context is difficult and time-consuming. Ontologies and semantic models contextualize data points and their critical relationships, enabling more comprehensive understanding, problem-solving, and decision-making.
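As a toy illustration of how a semantic model stitches these relationships together, here is a minimal Python sketch. The classes, assets, and work orders are hypothetical; a production ontology would live in a dedicated modeling tool or graph database rather than application code.

```python
# Minimal sketch of a semantic model: hypothetical classes and relationships
# that contextualize a single temperature reading.
from dataclasses import dataclass, field


@dataclass
class WorkOrder:
    """A recent maintenance record, as a CMMS would track it."""
    order_id: str
    summary: str


@dataclass
class Asset:
    """A monitored system with its environment and relationships."""
    name: str
    area: str                                   # operational environment
    upstream: list["Asset"] = field(default_factory=list)
    work_orders: list[WorkOrder] = field(default_factory=list)


@dataclass
class TemperatureReading:
    asset: Asset
    value_deg_f: float

    def context(self) -> dict:
        """Assemble the relationships that turn a raw value into insight."""
        return {
            "value_deg_f": self.value_deg_f,
            "asset": self.asset.name,
            "environment": self.asset.area,
            "upstream": [a.name for a in self.asset.upstream],
            "recent_work": [w.summary for w in self.asset.work_orders],
        }


chiller = Asset("Chiller-3", area="Central Plant")
ahu = Asset("AHU-7", area="Terminal B", upstream=[chiller],
            work_orders=[WorkOrder("WO-1042", "Replaced supply fan belt")])
print(TemperatureReading(ahu, 78.4).context())
```

Once the relationships are modeled explicitly, assembling context becomes a lookup rather than a cross-system investigation.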

Tailoring Data for Consumption in a Digital Twin

One of the critical aspects of a successful industrial digital twin, digital factory, or Industry 4.0 implementation is ensuring that the right data is delivered to the right people in the right format. This tailored approach supports informed decision-making and ensures that the data provided is actionable and impactful. Delivering too much data to the wrong use case or stakeholder simply creates noise. The data should be available if needed, but curated and built purposefully for the intended use case and stakeholder.

Different stakeholders have different data needs. Therefore, it's crucial to present curated data to each user, allowing them to make intelligent, well-informed decisions, complete with clear recommendations. The surfaced data will differ significantly for an executive concerned with plant operations and performance versus a control room operator, a line worker, an asset manager, or a maintenance team. A similar approach can be effective for managers of large city operations or complex airport environments but with different business systems, stakeholders, and use cases.

The Digital Twin Maturity Model: A Structured Approach

The process of implementing a digital twin begins with obtaining buy-in from the leadership and acknowledging the evolving landscape. It is essential to analyze the use cases and uncover any dependencies, prioritizing those that offer the most value with the least effort or time to implement. Justification for each business use case is crucial to stakeholder buy-in, prioritization, adoption, and utilization of the digital twin.

The starting point in the Digital Twin Maturity Model is the System of Systems Model, where machines speak to machines in real time and data is interoperable, based on classification standards, ontologies, and semantic relationships, enabling prediction, real-time analysis, simulation, and augmented operations. This requires data to be integrated using a common data model such as a UNS (shown below as a Common Data Environment), which normalizes and brings data together.

The Digital Twin Maturity Model below outlines this journey well using levels 1-5:

  1. Descriptive: Understand the physical environment and gather static data.

  2. Diagnostic: Begin to automate and build dashboards with operational insights.

  3. Predictive: Consolidate and contextualize data to predict outcomes using machine learning models.

  4. Prescriptive: Use real-time data to recommend optimizations.

  5. Autonomous: Achieve continuous, AI-driven decision-making with minimal human intervention.

Figure: The Digital Twin Maturity Model

Best Practices for Setting Up and Maintaining Digital Twins

A structured approach, such as Digital Twin Consulting’s Digital Twin Assessment Process (DTAP), is critical for setting up and maintaining digital twins, from defining objectives to institutionalizing the process, before tools and interfaces are added or removed. Successful digital twin programs focus on strategic master planning and project roadmaps with agile feedback loops to stakeholders, ensuring that use case deployments succeed and project goals are met in short sprints. An effective implementation should realize tangible value quickly, although dependencies and gaps may need to be closed before a use case is fully realized.

Figure: Digital Twin Assessment Process™

The DTAP process involves three key phases:

  1. Strategic Planning: Gather data, assess quality, and identify use cases. This step is all about understanding leadership goals, identifying business needs, gathering data about business systems, assessing data quality, identifying use cases that drive enterprise value, understanding data and system dependencies, assessing required investments, uncovering critical gaps, and aligning with potential financial outcomes.

  2. Tactical Roadmap Development: Build scalable roadmaps that align with leadership goals. This step involves going through a gated process to develop a tactical project roadmap for each digital twin use case implementation, including project planning, estimating costs, scheduling tasks, ensuring that available technologies scale to the use cases, and gathering the data points needed to support each roadmap.

  3. Implementation: Oversee projects and manage program implementation. This step involves oversight of each use case implementation project and program management of digital twin testing, training, launch, and adoption.


Data Management Best Practices for Digital Twins

Traditional and aging data infrastructures, where data communication and management are challenging, are perfect use cases for the adoption of additional technologies, such as MQTT, for real-time communication.

Most existing industrial systems follow the traditional pyramidal network-and-system architecture (the ISA-95 functional model). As depicted in the figure below, this architectural approach is characterized by a technology stack with factory-floor components at the bottom and enterprise/cloud components at the top. Within this stack, each layer communicates only with the layers directly above or below it, so data moves up or down one layer at a time over point-to-point connections.

Figure: Traditional architecture

The traditional architecture worked well when manufacturing systems operated in isolation and didn't need to communicate with each other or combine data. However, in the modern world of Industry 4.0, it is unsuitable for digital transformation use cases like digital twins, where data from every layer is necessary for effective implementation. The traditional architecture cannot meet modern scalability requirements because of the point-to-point integrations required to move data from the shop floor to the top floor. It hinders innovation, creates technical debt, and leads to isolated data. The traditional Industrial Internet of Things (IIoT) client-server architecture designed to address data silos is also limiting because it requires multiple dedicated connections, resulting in a tangled architecture and strain on network bandwidth.

Figure: Point-to-point spaghetti architecture in IIoT

MQTT was created as an IIoT messaging protocol to overcome these challenges. It is a lightweight, publish/subscribe-based open standard messaging protocol used in IIoT for data communication, designed for efficient communication between industrial data sources and smart factory solutions over low-bandwidth, high-latency, or unreliable networks. Instead of setting up direct connections as in client-server architectures, in MQTT architectures both publishing and subscribing clients communicate through a centralized server known as a message broker. This enables multiple enterprise systems to subscribe to and ingest manufacturing data via MQTT topics: devices publish data to the broker, and applications subscribe to that data. MQTT solutions like the HiveMQ Enterprise MQTT Platform can help address traditional IIoT data challenges while supporting digital twin use cases, facilitating real-time data exchange with high reliability, scalability, and security through a UNS implementation. An MQTT broker, with its publish/subscribe architecture, is ideal for implementing a UNS, breaking down data silos, and enabling comprehensive digital twin use cases.
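As a minimal sketch of that decoupling (again using the Python paho-mqtt 2.x client, with a hypothetical broker and topic names matching the earlier example), an enterprise application can consume an entire plant's UNS subtree through the broker with one wildcard subscription and no direct connection to any device:

```python
# Minimal sketch, assuming paho-mqtt 2.x; broker and topics are hypothetical.
import paho.mqtt.client as mqtt


def on_message(client, userdata, msg):
    # Each message carries its ISA-95 path in the topic itself.
    print(f"{msg.topic}: {msg.payload.decode()}")


client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.on_message = on_message
client.connect("broker.example.com", 1883)

# '#' matches every level below the site: all areas, lines, cells, metrics.
client.subscribe("acme/dallas-plant/#", qos=1)
client.loop_forever()
```

Adding another consumer (an MES, a historian, a digital twin) means adding another subscription, not another point-to-point integration.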

HiveMQ Data Management Enables Digital Twins

HiveMQ’s proven enterprise MQTT platform is reliable under real-world stress, is built for flexibility, security, and scale, and supports smart manufacturing digital twin data use cases by implementing a UNS data architecture.

HiveMQ provides the HiveMQ Edge gateway, which can translate device sensor data points from formats such as OPC UA and Modbus into MQTT; store and forward the data if the connection is lost; transform, normalize, and contextualize the data through Data Hub so that it can be interpreted easily; and bridge to an enterprise broker. Data can be further consolidated, transformed, and contextualized in the enterprise MQTT broker using the Data Hub feature. This enables creation of the UNS at different levels to meet the data needs of digital twin use cases.
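As a generic illustration of what such a normalization step does (this is not the Data Hub API; the scaling factor, unit, and field names are hypothetical), the goal is to turn a raw device value into a self-describing payload before it enters the UNS:

```python
# Generic normalization sketch; not HiveMQ-specific. A raw Modbus register
# value is scaled to engineering units and wrapped with context.
import json
import time


def normalize(raw_register: int, scale: float = 0.1, unit: str = "degC") -> str:
    """Convert a raw device value into a contextualized, self-describing payload."""
    return json.dumps({
        "value": round(raw_register * scale, 2),
        "unit": unit,
        "timestamp": time.time(),
        "source": "modbus/holding-register-40001",  # provenance for the twin
    })


print(normalize(713))  # {"value": 71.3, "unit": "degC", ...}
```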

Digital Twin Case Study: DFW Airport

At DFW Airport, HiveMQ’s MQTT platform supports digital twin use cases for smart city and logistics needs. The platform enables real-time fault detection and diagnosis, reduces energy usage, improves workforce efficiency, and enhances operational oversight, demonstrating the power of digital twins in a complex, real-world environment.

Some of the challenges included the need for reliable real-time communication across five different control systems, legacy communication protocols, and the need for communication with remote end devices.

HiveMQ provided the enterprise MQTT platform solution, which standardized IoT data with Sparkplug and streamed data between the digital twin, remote devices, and controls. This enabled real-time fault detection and diagnosis (FDD) for the digital twins, helping reduce energy use by 20% and improve operational workforce efficiency by 25%. It also enabled district water metering, reducing unaccounted water loss from 25% to 10% by shortening flow-reporting intervals from 6 hours to 15 minutes, while adding new real-time sensing capabilities and remotely actuated valve controls.

Other benefits included integration of flight data, fault detection, operational oversight for passenger boarding bridge "perfect turn" operations, asset condition monitoring, and progress toward jet-fuel burn reduction goals.

Conclusion: Harnessing Digital Twins for Smart Manufacturing

Digital transformation for industrial manufacturing, digital plants, and complex infrastructure like cities and international airports requires starting with a common definition of the language we use, along with a proven strategic assessment process for master planning, use case prioritization, and technical project roadmaps centered on business value. The core approach must be strategically aligned and centered on a common data model, such as a UNS, that includes real-time data, with the larger digital twin relating the spatial environment to all related data. By leveraging real-time insights and predictive analytics, the digital factory or digital twin empowers organizations to make informed decisions that drive efficiency and innovation.

Critical to the success of digital factories and industrial digital twins is real-time, event-based data feeding a system-of-systems architecture centered on a common data model such as a UNS, enabled by investing in a robust data management solution. MQTT and UNS frameworks provide the necessary infrastructure to support advanced digital twin use cases, proven time and again by industry examples to drive value in digital transformation efforts, digital plant and Industry 4.0 initiatives, and industrial digital twins.

To learn more about HiveMQ, download and try out the software for free. To get started on your digital twins and get your DTAP assessment, contact Digital Twin Consulting.

Watch our webinar, Smart Data to Smart Decisions: The Power of Digital Twins, where we discuss the data foundation required for Digital Twins, like MQTT and UNS, and a case study on how HiveMQ is enabling Digital Twins for a major US airport.

Chapters
  • 00:00 - Introduction
  • 02:02 - What is a Digital Factory?
  • 03:20 - Industry 4.0: Enabling the Digital Factory
  • 05:25 - Digital Twin Core Components, Spatial Context and Visualization
  • 09:45 - Industrial Use Cases for Digital Twins
  • 12:52 - Interoperable Digital Twins
  • 14:58 - A Digital Twin Maturity Model
  • 16:34 - Why Digital Initiatives Fail?
  • 17:46 - Digital Twin Assessment Process (DTAP)
  • 19:00 - Computer Integrated Manufacturing
  • 21:49 - The Role of MQTT for Digital Twins and How HiveMQ Can Help
  • 25:27 - The Role of Unified Namespace in Powering Digital Twins
  • 33:06 - Customer Success Story: Enabling Water System
  • 36:44 - Demo
  • 42:00 - Q&A

Kelly Watt

Kelly Watt is the Senior Digital Twin Consultant at Digital Twin Consulting, boasting extensive expertise in reality capture and digital twins. Currently, Kelly focuses on 3D visualization for critical infrastructure and leads digital twin implementations across various sectors including oil and gas, power, and transportation. His major projects include multi-year international airport contracts at DFW and San Felipe MX.

  • Kelly Watt on LinkedIn