AI in Operational Technology: Unlocking Value Through Industrial Data
Artificial Intelligence (AI) is everywhere. Well, almost. AI is pervasive today, yet there are still some significant frontiers. It showed up first in the obvious places: consumer applications of convenience such as writing assistants, agents for daily tasks, voice control, research, and image, video, and music generation. There is likely no industry or segment untouched or untouchable by AI.
However, AI's role in operational technology (OT) may be less publicized, yet it is more important than most realize. This article explores how manufacturers can unlock real AI value in operational technology by building an AI-ready data foundation with a Unified Namespace, aligning people, process, and security, and using clear metrics to scale from pilots to autonomy.
Adoption of AI in OT is currently slow and cautious, with many organizations reporting that they are in the initial stages of exploration. Recent surveys and research show that the gains from AI enablement can be significant for companies that take a serious interest in AI. To realize these positive, transformative results, however, OT initiatives must be grounded in a sound, effective mastery of data, people, process, and technology. Success metrics for AI initiatives are essential for measuring progress and directing future AI OT efforts.
What we've seen so far, impressive as it is, is only the tip of the iceberg. There is vast, untapped potential for AI's transformative use in manufacturing settings. While Machine Learning (ML) is firmly entrenched in select niches such as sensor processing and vision systems, broad application of AI is still largely uncharted, and there is still a hype factor to overcome in certain areas. AI can deliver great value to organizations, but AI initiatives must be carefully planned, mapped, executed, and continuously tested for the business to achieve realizable, sustained gains. AI adoption is not a one-off project; it needs to be treated as a strategic transformation. Successful AI initiatives must align efforts across IT and OT with the core dimensions of people, process, and technology, underpinned by solid data and architecture and an unwavering focus on security, trust, and safety.
First, let's start with a definition of terms:
| Term/Acronym | Definition |
|---|---|
| AI (Artificial Intelligence) | Systems that perform tasks, generate content (GenAI), or generate insights (Analytical AI) using data and modeling. |
| OT (Operational Technology) | The technology realm traditionally focused on the production side, especially in manufacturing, with systems often having a long useful service life (5-20 years). |
| IT (Information Technology) | The realm traditionally focused on overall supply chain, data processing, analysis, and internal end-user applications, which adapts and adopts new technologies quickly. |
| UNS (Unified Namespace) | A powerful architectural pattern and single, virtual source of truth that unifies, contextualizes, and governs OT/IT data, enabling scalable and AI-ready data foundations. |
| GenAI (Generative AI) | A core category of Industrial AI that generates new content and is applied in industrial settings for use cases like operator chat bots and active, responsive guidance. |
| ML (Machine Learning) | An established subset of AI firmly entrenched in specific OT areas like sensor processing and vision systems. |
| OEE (Overall Equipment Effectiveness) | A classic measure of efficiency used to optimize processes and drive value in AI initiatives. |
| PM (Predictive Maintenance) | A critical use case for AI in OT that uses machine learning to detect equipment failures before they occur, reducing downtime and maintenance costs. |
| QC (Quality Control) | A critical use case for AI in OT, often leveraging computer vision for faster defect detection. |
| IIoT (Industrial Internet of Things) | The interconnected network of sensors, instruments, and other devices networked together with industrial applications of computer systems, including manufacturing and energy management. |
| ICS (Industrial Control Systems) | Systems, often coupled with PLCs, used to monitor and control industrial processes and physical equipment on the shop floor. |
| PLCs (Programmable Logic Controllers) | Specialized industrial computers used in OT environments to automate and control mechanical processes, such as assembly lines or robotic devices. |
| CV (Computer Vision) | A field of AI that enables machines to gain a high-level understanding from digital images or videos, typically used in OT for quality control and defect detection. |
The IIoT AI Data Pyramid
AI is widely seen as a prime enabler of innovation, so let's briefly consider what effective AI projects and applications require. Data is the bedrock of AI. Data for AI must be high quality, clean, integrated, well governed, and standardized using a common model. Effective data acquisition and data quality are therefore foundational for AI use in OT. Building on that foundation, OT data must be normalized, integrated, and contextualized before it can be used broadly within AI applications. Next, the prepared data feeds AI modeling, tuning, and fine-tuning. Finally, it powers AI applications and serves as a foundation for further innovation.
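To make the pyramid concrete, here is a minimal, hypothetical Python sketch that walks a single reading up the first layers: acquire a raw value, clean and validate it, then contextualize it into a UNS-style record that modeling and AI applications could consume. The tag names, site/area/line labels, and plausibility limits are illustrative assumptions, not a prescribed model.

```python
import json
from datetime import datetime, timezone

# Layer 1: acquisition - a raw reading as it might arrive from a PLC or gateway.
raw_reading = {"tag": "TT-101", "value": "72.4", "unit": "degF"}

# Layer 2: data quality - validate, convert types/units, and reject bad values.
def clean(reading):
    value = float(reading["value"])
    if reading["unit"] == "degF":                  # normalize to a common unit
        value = (value - 32.0) * 5.0 / 9.0
    if not (-50.0 <= value <= 500.0):              # basic plausibility check
        raise ValueError(f"implausible temperature: {value}")
    return {"tag": reading["tag"], "value_c": round(value, 2)}

# Layer 3: contextualization - attach the plant-model context a UNS would carry.
def contextualize(clean_reading, site="siteA", area="packaging", line="line2"):
    return {
        "topic": f"enterprise/{site}/{area}/{line}/temperature/{clean_reading['tag']}",
        "payload": {
            "value_c": clean_reading["value_c"],
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "quality": "GOOD",
        },
    }

# Layer 4: the contextualized record is what models and AI applications consume.
record = contextualize(clean(raw_reading))
print(json.dumps(record, indent=2))
```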

AI Adoption and Integration - The Challenge
Adopting AI within Operational Technology (OT) realms requires a better understanding of AI and its requirements.
AI in IT
Operational technology has traditionally focused on the production side, especially in manufacturing, whereas IT has traditionally focused on the overall supply chain, data processing, analysis, and internal end-user applications. IT is quick to adapt and adopt new technologies: refresh cycles and useful system lifetimes tend to stay in the single-digit, 3-5 year range, and IT systems can, and often do, absorb a high rate of change.
AI in OT
The same is not true for OT systems. A major challenge in OT environments is the slow rate at which new technologies are adopted. At the physical level, OT equipment and systems are often in use for 5-10 years, and it is not uncommon for useful lifetimes to stretch to 10-20 years of service. The long life of OT systems has contributed to a low rate of change within these environments. This is changing with the focus on energy efficiency, reducing cycle time, and increasing Overall Equipment Effectiveness (OEE).
Motivators
The promise of AI value in an OT setting is realized when critical data from the shop floor and other OT systems is used to gain insights into process, quality, efficiency, and product improvements. Throughout most industries, there is great interest in AI. That interest needs to be focused on specific objectives or outcomes to structure an approach and realize value. These more narrowly focused projects can then work together to drive larger value-creation cycles.
A recent article from Automation World states:
“The greatest impact occurs when AI technologies work together across multiple systems, combining quality data with energy data, predictive maintenance insights and supply chain information to enable multilevel optimization and autonomous decision-making that goes beyond what isolated AI applications can achieve.”
If this holds true, the burden on project definition, execution, and measurement is elevated. For AI to be successful, an organization must run an AI program rather than merely an AI project. AI aims must be strategic, not merely tactical or operational, to provide the greatest value to organizations. Therefore, successful AI initiatives in the OT space require defining clear business value, proper engagement of the project team, command of all aspects relating to data, and well-planned supportive technology.
Business value can be realized by linking existing, long-life physical systems with the latest AI-enabled OT—a critical step in maximizing existing investments and using leading-edge AI technologies to get the most out of OT systems in the larger manufacturing setting.
Next, we’ll examine specific drivers of business value to better articulate the case.
Key Drivers of Business Value
The core drivers of business value in AI systems are unchanged from those in IT systems. AI, though, is a powerful tool for realizing business value. A reported 72% of manufacturers are using AI for cost reduction and operational efficiency. However, adoption in operational technology remains more cautious than in enterprise IT.
For AI, these drivers are interrelated and include:
Realizable cost reduction - in practice, actually achieving the goals of reducing costs, optimizing processes, eliminating waste, increasing repeatability, and automating work
Improve operational efficiency - reduce cycle time, increase OEE (a worked OEE example follows this list)
Machine and OT Refresh - Selective and targeted refresh of physical manufacturing systems coupled with strategic and tactical upgrades of peripheral equipment like industrial control systems (ICS), PLCs, and other shop floor/edge OT.
Generate new insights and product improvements - this is a frontier where AI is revolutionizing the dynamics of product development.
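The OEE driver above has a well-known, concrete formula: OEE = Availability × Performance × Quality. The short sketch below works through one illustrative shift (the production numbers are made up) to show how the metric is computed and why small losses in each factor compound.

```python
# Overall Equipment Effectiveness (OEE) = Availability x Performance x Quality.
# The shift numbers below are illustrative, not from any real plant.

planned_time_min = 480          # planned production time for the shift
downtime_min = 45               # unplanned stops
ideal_cycle_time_s = 2.0        # ideal seconds per part
total_parts = 11_000
good_parts = 10_650

run_time_min = planned_time_min - downtime_min
availability = run_time_min / planned_time_min
performance = (ideal_cycle_time_s * total_parts) / (run_time_min * 60)
quality = good_parts / total_parts

oee = availability * performance * quality
print(f"Availability={availability:.1%} Performance={performance:.1%} "
      f"Quality={quality:.1%} OEE={oee:.1%}")
```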
Critical Applications of AI in OT
Established Applications of AI in OT
Established Machine Learning (ML) and AI applications in OT settings such as manufacturing include (Automation World):
OEE and Predictive Maintenance (PM)
Quality Control and Safety - vision systems, defect detection (a simple defect-detection sketch follows this list)
Robotics
Energy Management Systems
AI-ready data streaming, predictive maintenance, and real-time production optimization
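To make the quality-control item above concrete, the sketch below shows the simplest possible form of vision-based defect detection: comparing an inspected frame against a known-good "golden" image. The images, thresholds, and helper name `find_defects` are all illustrative assumptions; production systems generally rely on trained vision models rather than raw pixel differencing.

```python
import numpy as np

# A minimal sketch of vision-based defect detection: compare each inspected
# frame to a "golden" reference image and flag frames that deviate too much.
# Real QC systems typically use trained models; this only illustrates the idea.

def find_defects(golden: np.ndarray, sample: np.ndarray,
                 diff_threshold: int = 40, min_defect_pixels: int = 25) -> bool:
    """Return True if the sample deviates from the golden image enough to flag."""
    diff = np.abs(sample.astype(np.int16) - golden.astype(np.int16))
    defect_pixels = int((diff > diff_threshold).sum())
    return defect_pixels >= min_defect_pixels

# Synthetic 64x64 grayscale images: the sample has a dark scratch drawn on it.
golden = np.full((64, 64), 200, dtype=np.uint8)
sample = golden.copy()
sample[30:34, 10:50] = 90     # simulated defect

print("defect detected:", find_defects(golden, sample))
```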
Some recent industry surveys (nam.org) have suggested that the main cross-cutting aspects of AI use in the OT space are:
| Aspect | Application and Use |
|---|---|
| Efficiency | Classic overall equipment effectiveness (OEE), predictive maintenance (PM) |
| Safety | AI technologies are deployed to proactively identify and mitigate potential safety hazards, ensuring a secure working environment for employees. |
| Product Development and Design | Innovation; shortened product design cycles; radically enhanced customization abilities |
| Training | AI-assisted guidance, training modules, simulations; improved employee knowledge and performance |
| Supply Chain | Optimized supply chains - more cost effectiveness, less waste, just-in-time (JIT) supply optimization, more resilient supply chain flows |
The recent Industrial AI Market Report 2025-2030 (August 2025) classifies three (3) core categories for Industrial AI:
| Category | Examples |
|---|---|
| Analytical AI | Identify patterns, generate insights, support decision making using supervised and unsupervised methods - classic ML applications |
| Autonomy-enabling AI | Systems that perform tasks and make decisions without human intervention |
| Generative AI (GenAI) | Operator chat bots and active, responsive guidance |
Still, with all of the current burgeoning applications of artificial intelligence in operational technology and IIoT, the use of AI in manufacturing remains relatively low, particularly for advanced use cases. Up to two-thirds of survey respondents report exploring AI in manufacturing settings, yet only a small percentage report full implementations. This suggests latent, untapped potential and more insights to be gained; we need better visibility into how AI is actually being used in manufacturing.
Emerging Use of AI in OT
According to a recent Automation World report, four (4) critical use cases are emerging as AI game-changers:
Predictive Maintenance (PM)
Quality Control (QC) and computer vision (CV)
Advanced Robotics and Cobotics (humans and robots working collaboratively)
Energy Management systems
The industry has shown steady but slow progress with PM, QC, and CV applications. These gains are foundational and offer a basis for continued development and innovation.
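Predictive maintenance ranges from simple statistical alarms to sophisticated models. As a flavor of the simplest end, here is a hedged Python sketch (synthetic vibration data, illustrative baseline window and threshold, nothing from a real asset) that flags readings drifting away from a healthy baseline so an inspection can be scheduled before failure.

```python
import numpy as np

# A minimal predictive-maintenance sketch: flag vibration readings that drift
# far from a recent "known good" baseline. Production PM systems use richer
# models (frequency-domain features, survival models, etc.); data is synthetic.

rng = np.random.default_rng(seed=7)
healthy = rng.normal(loc=2.0, scale=0.15, size=500)     # mm/s RMS vibration
degrading = rng.normal(loc=2.9, scale=0.25, size=20)    # bearing wear begins
readings = np.concatenate([healthy, degrading])

window = 100          # baseline window of known-good readings
z_alarm = 4.0         # standard deviations considered anomalous

baseline = readings[:window]
mean, std = baseline.mean(), baseline.std()

for i, value in enumerate(readings[window:], start=window):
    z = abs(value - mean) / std
    if z > z_alarm:
        print(f"reading {i}: vibration {value:.2f} mm/s, z={z:.1f} -> schedule inspection")
        break
```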
As we consider emerging uses, it is wise to temper expectations around ROI and to ground them in solid metrics and a well-thought-out approach.
Some areas that are being explored for further AI use are:
Purpose-built analysis of sensor data, building on advanced ML and AI techniques
Data acquisition at the Edge
Supporting quality
Expanded depth of use in OEE and PM
Expanded focus on Energy efficiency - optimizing delivery and consumption
AI at the Edge - limited agentic AI for approved uses focused on learning and safety
Keys to Success
Several factors are key determinants of the success of AI initiatives in Operational technology. These determinants are divided into human factors (people and process) and technology factors (data foundations, UNS, and digital transformation).
Process
Incremental Deployment and Scalability
Stable, capable machine learning operations (MLOps) practices for OT environments (a small drift-monitoring sketch follows this list)
Clear ROI and Business Ownership
Named business owner (plant manager, reliability leader)
Clear cost-benefit tracking
Benefits realized in operations, not just dashboards
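As one small, concrete slice of the MLOps point above, the sketch below monitors input drift: if live sensor data strays too far from what a deployed model was trained on, the model is flagged for retraining and review. The data, threshold, and helper name `drift_score` are illustrative assumptions, not a prescribed method.

```python
import numpy as np

# One small piece of an OT MLOps practice: monitor input drift so the team
# knows when a deployed model's assumptions no longer hold and retraining
# (and revalidation) should be scheduled. Data and thresholds are illustrative.

def drift_score(train_sample: np.ndarray, live_sample: np.ndarray) -> float:
    """Shift of the live feature distribution, in units of training std devs."""
    return abs(live_sample.mean() - train_sample.mean()) / train_sample.std()

rng = np.random.default_rng(seed=3)
training_temps = rng.normal(78.0, 1.5, size=5_000)   # deg C seen during training
live_temps = rng.normal(81.5, 1.7, size=1_000)       # deg C from the last 24h

score = drift_score(training_temps, live_temps)
if score > 1.0:
    print(f"drift score {score:.2f}: flag model for retraining and review")
else:
    print(f"drift score {score:.2f}: within expected range")
```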
Human Factors
The first keys to success for AI in OT applications are the human factors. These include:
Strong sponsorship and oversight high in the organization
Well-defined projects
Clear objectives
Enabled and effective project team
Human-Centered Design & Trust
Strong OT-IT-Data Science Collaboration
Project-to-System Berthing
Change Management & Skills Enablement
Measured results - starting with clear, high-value use cases
Executive Sponsorship and Support
Projects with strong executive support tend to do better: they achieve more of their objectives, serve a broader audience within the organization, and conform better to budgets and timelines than projects with more limited backing. Owing to the complexities of AI, the need for executive support is heightened. AI is most effective when it is applied in a well-conceived, well-coordinated way that crosses departmental boundaries, spanning data domains from data production and acquisition, through data integration, all the way to the end users of AI-enabled applications.
Clear Objectives, Defined Outcomes
Clear definition of business value and defined outcomes are essential for confidence and success in AI-centric projects. According to a survey and analysis conducted by HiveMQ, “Lack of budget and uncertain ROI is a key challenge for 28% of respondents.”
… 80% of IoT projects fail to scale due to the complexity of integration and the inability to support scaling systems. Organizations need a strong data foundation that addresses these challenges on which to build anything from basic to advanced IIoT use cases - from predictive maintenance to AI.
Enabled and Effective Project Team
A strong, effective project team is key to any project. Effective AI requires data uniformity and accessibility throughout the whole organization, which places more stringent demands on project teams. Everyone must be on the same page, and projects have to progress in concert so that OT data is usable in IT settings. Project teams need to share a common baseline of AI knowledge. Not everyone needs to be an ML expert or an AI architect, but everyone must be AI literate.
Transition to Operations
As a project is berthed and enters first-phase rollout, change management becomes increasingly important, and OT staff skills enablement emerges as an ongoing part of the AI system lifecycle. The project may be declared a success in dry-dock, but its value is proven only in day-to-day operation. Operational metrics therefore become an integral part of the AI system: owners and staff must continue to collect success metrics, measure them against goals, and adapt as needed.
Workflow Adaptability
The way work is done affects an organization's success. The ability to embrace change and to update, or wholesale replace, workflows is central to adopting and reaping AI's many benefits. AI applications bring new efficiencies, new insights, and new ways of working even when only modestly successful. To keep this momentum and sustain increasing productivity, old ways of working necessarily give way to new workflows; in some cases, whole tasks or functions are minimized or eliminated. These changes in workflows and tasks let people work in more effective ways and upskill them, which is an important benefit of AI.
Technical Factors
The second key factor grouping is technical: data, technology, and infrastructure. This includes:
OT Architecture from Device, Edge, to Cloud
Data
UNS
Cybersecurity and Safety
OT Architecture
The Operational Technology (OT) architecture must be well-defined and specifically tuned to support Artificial Intelligence (AI). This tuning encompasses conformed configuration, the adoption of a robust data architecture like the Unified Namespace (UNS), and comprehensive security and governance protocols. A fundamental requirement is to define clear boundaries of function and control, distinguishing deterministic control systems from areas where AI is deployed, such as operational monitoring and automated process control. This OT architecture should have multiple zones, starting at the Edge with functions like data collection, protocol translation, cleansing, contextualization into UNS, and local machine learning inferencing. The Plant Level then collects this standardized data from both direct MQTT-enabled clients and Edge gateways, enabling plant-level data flows and bi-directional bridging to higher-level systems for broader analytics.
This foundational structure supports scalable AI Inferencing for critical business objectives, including data quality decisions, filtering, and storage for use cases like Predictive Maintenance (PM), Overall Equipment Effectiveness (OEE), energy management, and supply chain coordination.
The Centralized Broker acts as the secure, high-capacity backbone, facilitating essential plant-to-HQ data flows and advanced supervisory control. Ultimately, the architecture’s goal is to break down silos and integrate manufacturing data with the rest of the business, ensuring seamless, often bi-directional, data exchange with enterprise IT Systems such as ERP, MES, Data Lakes, Finance, and Sales.
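A minimal sketch of the edge-to-plant flow described above, assuming paho-mqtt (version 2.0 or later) as the client library: an edge gateway reads a value from a local device, wraps it in a contextualized payload, and publishes it to an illustrative plant-level broker under a UNS-style topic. The broker hostname, topic path, and `read_sensor` stand-in are assumptions for illustration only, not a prescribed deployment.

```python
import json
import time
from datetime import datetime, timezone

import paho.mqtt.client as mqtt   # assumes paho-mqtt >= 2.0 is installed

BROKER_HOST = "plant-broker.example.local"   # illustrative plant-level broker
UNS_TOPIC = "enterprise/siteA/packaging/line2/filler/temperature"

def read_sensor() -> float:
    """Stand-in for a real device driver (OPC UA, Modbus, vendor SDK, etc.)."""
    return 71.3

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.connect(BROKER_HOST, 1883)
client.loop_start()

# A real edge gateway would loop indefinitely; three iterations keep this short.
for _ in range(3):
    payload = {
        "value_c": read_sensor(),
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "quality": "GOOD",
    }
    # Retained message so late-joining plant/HQ consumers see the latest value.
    client.publish(UNS_TOPIC, json.dumps(payload), qos=1, retain=True)
    time.sleep(5)

client.loop_stop()
client.disconnect()
```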
Data - A Solid Foundation
The first steps in laying a data foundation are often mechanical. These steps involve solving integration challenges and taming complexity. Collecting all data, when and where it is produced, is typically the first frontier.
Greenfield and brownfield projects require different approaches. In greenfield projects, data collection can often be defined at the time of conception, and there is more freedom in choosing data sources, equipment types, and protocols like MQTT. The opposite is often true in brownfield projects, which may have existed for years or decades and where varied, disparate equipment types, and therefore data transmission protocols, coexist.
UNS
A whole, integrated view of an organization's data is required for AI applications. On top of the solid data foundation, a UNS, a single source of truth, is required to make AI effective. Data must be accessible in a single "virtual" source of truth. The UNS can be thought of as an approach for integrating data and defining truth at all levels, everywhere in the organization. It yields high-quality, contextualized data that is interchangeable between IT and OT. This data need not be collected centrally in its entirety, and sometimes it cannot be, owing to the sheer volume and velocity of the data. Rather, it needs to be integrated, contextualized, and accessible where it is needed. In the process of moving to a UNS, data silos are removed.
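One way to picture a UNS is as a consistent, ISA-95-inspired naming convention applied everywhere data is produced. The sketch below is a hypothetical example of such a convention; the level names and example sites are assumptions, since each organization defines its own hierarchy and information model.

```python
# A minimal sketch of an ISA-95-inspired UNS naming convention. The levels and
# names are illustrative; each organization defines its own hierarchy and model.

UNS_LEVELS = ("enterprise", "site", "area", "line", "cell", "measurement")

def uns_topic(**levels: str) -> str:
    """Build a UNS topic path and insist every level is present and lowercase."""
    missing = [lvl for lvl in UNS_LEVELS if lvl not in levels]
    if missing:
        raise ValueError(f"missing UNS levels: {missing}")
    return "/".join(levels[lvl].lower() for lvl in UNS_LEVELS)

# The same convention applied across two plants keeps data discoverable and
# interchangeable between OT and IT consumers without central collection.
print(uns_topic(enterprise="acme", site="berlin", area="assembly",
                line="line1", cell="press3", measurement="oee"))
print(uns_topic(enterprise="acme", site="austin", area="packaging",
                line="line4", cell="filler2", measurement="temperature"))
```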
Cybersecurity and Safety
For AI initiatives in Operational Technology (OT) to succeed, they must be built upon a robust foundation of Security and Trust. The security pillar requires a pervasive, end-to-end strategy, starting with strong device-level security and security in transit. This includes implementing a comprehensive zero trust model that extends from the edge device through the broker, the cloud, and all consuming systems. Core to this is encryption everywhere, ensuring data is protected at rest, in-flight, and even in-processing through the use of technologies like secure enclaves and trusted compute.
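As one concrete illustration of "encryption everywhere" and device-level identity, the sketch below configures an MQTT client for mutual TLS, again assuming paho-mqtt 2.0 or later; the certificate paths, broker hostname, and port are placeholders for an organization's own PKI and broker deployment, not prescribed values.

```python
import paho.mqtt.client as mqtt   # assumes paho-mqtt >= 2.0 is installed

# One concrete slice of a zero-trust posture: encrypt data in transit and have
# the edge client authenticate with its own certificate (mutual TLS).
# The paths, hostname, and port are placeholders for your own PKI and broker.

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.tls_set(
    ca_certs="/etc/pki/ot/ca.pem",      # CA that signed the broker certificate
    certfile="/etc/pki/ot/edge01.crt",  # this device's client certificate
    keyfile="/etc/pki/ot/edge01.key",   # this device's private key
)
client.tls_insecure_set(False)          # always verify the broker's hostname

client.connect("plant-broker.example.local", 8883)  # 8883 = MQTT over TLS
client.loop_start()
# ... publish/subscribe exactly as over plain MQTT, now encrypted in transit ...
client.loop_stop()
client.disconnect()
```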
The pillar of Trust and Safety is addressed by incorporating critical human and technical factors. This involves establishing clear AI governance and AI guardrails, and maintaining human-in-the-loop oversight, especially in process control. A key technical challenge is managing determinism: clearly defining which systems can tolerate probabilistic or stochastic outputs and where absolute determinism is required (e.g., in safety systems). This also necessitates careful selection between Machine Learning (ML) and deep learning systems, the strategic application of multi-domain AI (like vision systems for quality control), and effectively managing probabilistic systems through AI tuning to ensure narrow, reliable outputs.
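The sketch below illustrates one simple way to manage determinism and keep a human in the loop: a probabilistic recommendation is applied autonomously only when model confidence is high and the proposed value sits inside hard, deterministic safety limits; anything else is escalated to an operator or rejected outright. The class, limits, thresholds, and example values are illustrative assumptions.

```python
from dataclasses import dataclass

# A sketch of AI guardrails around a probabilistic recommendation: autonomous
# action is allowed only when confidence is high AND the proposed setpoint is
# inside hard, deterministic limits; anything else goes to a human operator.
# The limits, thresholds, and recommendation values below are illustrative.

@dataclass
class Recommendation:
    setpoint_c: float      # proposed new process temperature
    confidence: float      # model confidence in [0, 1]

HARD_MIN_C, HARD_MAX_C = 60.0, 85.0     # deterministic safety envelope
AUTONOMY_CONFIDENCE = 0.90              # below this, a human must decide

def apply_guardrails(rec: Recommendation) -> str:
    if not (HARD_MIN_C <= rec.setpoint_c <= HARD_MAX_C):
        return "REJECT: outside deterministic safety limits"
    if rec.confidence < AUTONOMY_CONFIDENCE:
        return "ESCALATE: send to operator for human-in-the-loop approval"
    return "APPLY: within limits and high confidence, act autonomously"

for rec in (Recommendation(72.5, 0.96),
            Recommendation(79.0, 0.71),
            Recommendation(91.0, 0.99)):
    print(rec, "->", apply_guardrails(rec))
```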
Automation to Autonomy
The degree to which processes and workflows can be transformed and benefit from AI is determined by the level of automation AI can bring. That automation ultimately leads to greater levels of autonomy at many levels within the system and subsystems in OT. Increased automation may lead to welcome, incremental efficiencies in certain steps of a process whether technical or human.
According to a recent Automation World article: “AI may be the dominant force in enterprise IT deployments, but the technology is more cautiously applied to industrial digital transformation as part of a slow but steady evolution from automation to autonomy.”
As a Force-Multiplier
Autonomy brings order-of-magnitude gains in efficiency when automation is applied cross-system in a holistic way. Autonomy makes it possible to reimagine the way work is done, and this reimagination of work is where powerful innovation occurs.
For example, we can easily see that automating certain steps of an error-prone, highly time-intensive process will improve overall product quality and shorten cycle times. Naturally, this leads to improvements in cost efficiency, perhaps in the single- to double-digit percentage range. There is a limit to simple automation, though, and that limit is defined by the process itself.
As An Innovator
On the other hand, true autonomy enables a rethinking of how the process itself is defined. The outputs and goals remain the same: decrease time, decrease cost, and improve product quality, production output, and profit margin. In one sense, autonomy in AI systems can be thought of as automation carried out to a very high degree.
Human-in-the-loop (HITL)
However, autonomy means that sub-systems and components are inherently more intelligent. The components use data and information with advanced AI analytics to detect conditions, adjust, and refine sub-process flows dynamically, thereby lessening the need for human involvement at low levels. In this way, the humans in the loop can perform higher-level, more valuable tasks and focus on optimizing the overall flow.
Increased autonomy makes it possible to exchange and compose process flows at a more rapid rate while retaining and even improving process quality and throughput. Autonomous systems can adapt more rapidly to changing conditions and flows with less human intervention.
AI in OT Outlook
Artificial Intelligence is already showing early wins in OT, yet a large, untapped growth potential remains. For organizations that embrace AI strategically, the potential returns on investment are substantial. Building on the foundational aspects of data acquisition, data quality, and usability is key to success with expanded OT AI adoption. On this foundation, organizations can continue to innovate, often in unique and unknowable ways. A clear start is on the horizon in product innovation, quality, novel uses of AI in cobotics, and collaboration. Radical, AI-driven innovation with humans in the loop is where the true benefit of AI begins.
Bill Sommers
Bill Sommers is a Technical Account Manager at HiveMQ, where he champions customer success by bridging technical expertise with IoT innovation. With a strong background in capacity planning, Kubernetes, cloud-native integration, and microservices, Bill brings extensive experience across diverse domains, including healthcare, financial services, academia, and the public sector. At HiveMQ, he guides customers in leveraging MQTT, HiveMQ, UNS, and Sparkplug to drive digital transformation and Industry 4.0 initiatives. A skilled advocate for customer needs, he ensures seamless technical support, fosters satisfaction, and contributes to the MQTT community through technical insights and code contributions.
