Why contextualization is the key to solving manufacturing’s data fragmentation problem
Introduction
Walk into any mid-sized factory, and you’ll find a complex web of software systems running behind the scenes. A typical site with 500 employees might rely on 200 or more distinct programs, ranging from production planning tools and manufacturing execution systems (MES) to SCADA controls and specialized machine software. Add warehouse management, inventory tracking, and maintenance systems into the mix, and you’ve got data flowing from dozens of disparate sources, each built for a specific task but rarely designed to work together.
Manufacturers have never had more data to work with than they do today. Modern machines generate a steady stream of data, thanks to better sensors, smarter controllers, and automated tracking systems. Meanwhile, advances in IT infrastructure give companies more capacity and flexibility than ever to store, process, and analyze that data. In theory, this should be a golden age for manufacturing, driving improvements in cost, speed, and quality.
But in reality, most manufacturers still struggle to get a detailed, real-time picture of what’s happening on the factory floor. “One of the biggest pain points is simply accessing the data,” says Rafael Lorenz, an operations expert and industrial engineer who has worked with manufacturers across industries and teaches AI applications in manufacturing at ETH Zurich. “In pharma, for example, you might want to understand lead times for a batch, but the data is spread across a dozen systems, named with different conventions, and formatted in different ways. Putting it all together is extremely difficult.”
The result is a familiar frustration. IBM estimates that up to 90% of sensor data from industrial IoT devices never gets used, not because manufacturers don’t see the value, but because they can’t access or interpret it in context. With key metrics siloed across different machines, lines, and systems, today’s factories are data-rich but information-poor.
Data Without Context Is Useless
Imagine being handed a pile of Scrabble tiles—only there’s no board, you have no idea what letters have been played, and you don’t even know which language you’re supposed to be using. That’s what working with raw manufacturing data feels like. The pieces may be there, but without knowing what machine the data came from, what product is being made, when it was collected, or what the surrounding conditions were, there’s no way to make sense of it.
Francesco Marzoni, a data leader who has spent more than two decades leading global data and analytics strategies at IKEA, Nestlé, Bayer, and Procter & Gamble, argues for a proactive approach to data readiness. “You need to treat data in an ‘always-on’ way,” he says. “It needs to be integrated, maintained, and ready to activate in real time when the business needs it. Because if you wait to clean or harmonize your data until the moment you need it, it’s already too late to realize the value you’ve identified.”
That lack of readiness makes it especially hard to solve real-world problems. Lorenz points to root cause analysis as a common sticking point. “If you don’t have the right variables connected, you might end up chasing the wrong explanation for a breakdown or a drop in throughput,” he says. “There may be a factor that explains the issue perfectly, but if it’s not linked to the rest of your data, you’ll never find it.”
This is the gap that contextualization is meant to solve. It’s the process of linking raw data points to their physical, temporal, and functional surroundings: What part of the line was it from? What was the machine doing at the time? What product was being run? Without that context, it’s impossible to interpret a temperature reading or pressure spike in any meaningful way. Even the most sophisticated analytics tools, or the most promising AI models, end up flying blind.
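To make the idea concrete, here is a minimal sketch, in Python with pandas, of what that linking step can look like: raw sensor readings carry only a machine ID and a timestamp, and context is attached by joining them against production-run and plant-layout records. The table names, columns, and values are invented for illustration; a real pipeline would pull from a historian, MES, or ERP.

```python
import pandas as pd

# Raw sensor stream: timestamped readings with only a machine ID attached.
# (Illustrative data; a real stream would come from a historian or SCADA.)
readings = pd.DataFrame({
    "timestamp": pd.to_datetime(["2024-05-01 08:02", "2024-05-01 08:17", "2024-05-01 09:05"]),
    "machine_id": ["mixer-07", "mixer-07", "mixer-07"],
    "temperature_c": [41.2, 55.8, 43.1],
})

# Functional context: which production run (and thus which product) was
# active on each machine, and when it started.
runs = pd.DataFrame({
    "start_time": pd.to_datetime(["2024-05-01 07:30", "2024-05-01 09:00"]),
    "machine_id": ["mixer-07", "mixer-07"],
    "product": ["recipe-A", "recipe-B"],
})

# Physical context: where the machine sits in the plant.
machines = pd.DataFrame({
    "machine_id": ["mixer-07"],
    "line": ["line-3"],
    "area": ["mixing"],
})

# Temporal join: attach the most recent production run that started
# at or before each reading, per machine.
contextualized = pd.merge_asof(
    readings.sort_values("timestamp"),
    runs.sort_values("start_time"),
    left_on="timestamp",
    right_on="start_time",
    by="machine_id",
    direction="backward",
)

# Static join: attach the line and plant area.
contextualized = contextualized.merge(machines, on="machine_id")

print(contextualized[["timestamp", "machine_id", "line", "product", "temperature_c"]])
```

Only after these joins does a reading become interpretable: the same 55.8 °C means something entirely different depending on which recipe was running on which line.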
Many AI deployments in manufacturing falter for exactly this reason. AI thrives on data, but it can’t do much with isolated metrics. “You can’t just feed random temperatures to an AI model and expect it to generate insights,” says Torbjørn Netland, professor at ETH Zurich and co-founder of EthonAI. “You need something to optimize against, whether that’s yield, throughput time, quality, or safety.” Context tells the model what the data means and how it relates to the real-world problem you’re trying to solve.
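A toy experiment makes the point tangible. In the sketch below (entirely synthetic data, invented for illustration), two recipes run on the same mixer at different temperature setpoints, and yield depends on the deviation from the recipe-specific setpoint. A simple model fitted on raw temperature alone finds almost no signal; the same model, given the recipe as context, explains the yield almost perfectly.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic example: two recipes run on the same mixer with different
# temperature setpoints. Yield depends on the deviation from the
# recipe-specific setpoint, not on temperature in isolation.
n = 400
recipe = rng.integers(0, 2, n)                 # 0 = recipe-A, 1 = recipe-B
setpoint = np.where(recipe == 0, 45.0, 60.0)   # degrees C
temperature = setpoint + rng.normal(0, 3, n)
yield_pct = 98 - 0.8 * (temperature - setpoint) + rng.normal(0, 0.5, n)

# Model 1: raw temperature only, no context about what was being produced.
X_raw = temperature.reshape(-1, 1)
r2_raw = LinearRegression().fit(X_raw, yield_pct).score(X_raw, yield_pct)

# Model 2: the same reading, contextualized with the recipe.
X_ctx = np.column_stack([temperature, recipe])
r2_ctx = LinearRegression().fit(X_ctx, yield_pct).score(X_ctx, yield_pct)

print(f"R^2, temperature alone:       {r2_raw:.2f}")  # roughly 0.1
print(f"R^2, temperature plus recipe: {r2_ctx:.2f}")  # roughly 0.95
```

The model isn’t smarter in the second case; it simply knows what the temperature refers to.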
That lack of context is also why AI adoption in manufacturing has been slower than many expected. One factory might have high-frequency sensor data from a mixing machine but no record of what product was being produced. Another might know the shift and operator but not the equipment settings or maintenance history. Without a system to tie it all together—“one ring to bind them all,” as Netland puts it—insights remain elusive.
Contextualization Is Hard…
Manufacturers have always used data to run their operations. A century ago, that might have meant handwritten logs tracking machine uptime, punch cards used for scheduling, or inventory counts scratched on clipboards. As factories digitized, spreadsheets replaced ledgers, and ERP systems took over orders and production schedules. What’s new today isn’t the idea of using data, but rather the sheer volume, variety, and complexity of it.
While contextualization may sound straightforward in theory, in practice, it’s enormously challenging. “We have more and more machines, controlled by more and more computers, with even more business software on top,” Netland says. “And yet, the integration of IT and OT has historically been non-existent and even today remains extremely difficult.”
Lorenz adds, “It’s not like you’re connecting one system to another. You’re trying to bridge hundreds of systems, each with their own logic, structure, and data definitions, and managed by people who come from fundamentally different contexts and backgrounds.”
Part of the problem is that manufacturers naturally want the best tool for each job. Specialized point solutions excel at narrow tasks, but they struggle to capture the interdependencies that drive performance across a production line. As Netland points out, it’s a familiar trade-off: “The Swiss Army Knife is a wonderful product that solves many problems. But if you want a good knife, you don’t choose a Swiss Army knife. If you want a good screwdriver, you don’t choose a Swiss Army knife.” That mindset favors precision, but over time, the accumulation of highly specialized tools leads to silos. They work well in isolation but fall short when problems span systems. And in high-stakes situations—when the line is down or quality is slipping—you don’t need another specialized tool. You need the manufacturing equivalent of an A-team multitool: something that cuts across systems, highlights what matters, and helps you act fast.
There have been plenty of attempts to bridge IT and OT, under banners like computer-integrated manufacturing (CIM), cyber-physical systems (CPS), and Industry 4.0. But in practice, these visions have fallen short. Some newer protocols like OPC UA show promise as a kind of “Esperanto for industrial data,” Netland says, but legacy equipment still dominates most factory floors. A typical site might run a mix of new and decades-old machines, each using different formats and communication protocols. The result is fragmentation not just across departments and systems, but across entire generations of machines and data.
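For readers curious what that “Esperanto” looks like in practice, here is a minimal sketch of reading a single machine value over OPC UA, using the open-source asyncua Python library. The endpoint URL and node identifier are invented placeholders; real values depend on each vendor’s server configuration. The appeal is that the same client code works against any compliant server, old or new.

```python
import asyncio
from asyncua import Client

async def main():
    # Placeholder endpoint and node ID; real values depend on the
    # machine vendor's OPC UA server configuration.
    url = "opc.tcp://192.168.0.10:4840"
    async with Client(url=url) as client:
        # Address a sensor by its node ID and read its current value.
        node = client.get_node("ns=2;s=Mixer07.Temperature")
        value = await node.read_value()
        print(f"Current value: {value}")

asyncio.run(main())
```

Note what the protocol alone does not give you: the value arrives without any record of which product was running or how the reading relates to the rest of the line. Standardized access solves transport, not context.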
But the Payoff Is Worth It
When manufacturers can gather, understand, and act on their data in real time, the potential upside is huge. But doing so consistently, and at scale, takes more than just better sensors and faster computers. It requires a new kind of system that connects the dots between data collection and decision-making.
That’s where Industrial AI Platforms come in. These systems are designed to integrate data across machines, production lines, and facilities, enriching raw signals with context about where, when, and how they were generated. Rather than trying to replace existing systems like MES or SCADA, an Industrial AI Platform sits on top of them, bringing together operational and business data to create a complete picture of the factory in motion.
Take Lindt & Sprüngli, for example. At several of its sites in Europe and the U.S., Lindt uses EthonAI’s Industrial AI Platform to oversee production of its famous Lindor truffles. The process, from molding to filling to packaging, generates vast amounts of process and quality data, much of it affected by seasonal conditions like humidity and temperature. The platform contextualizes that data in real time, helping operators pinpoint the root causes of quality deviations and proactively fine-tune machine settings before issues arise.
And the benefits go beyond a single plant. Because EthonAI’s Industrial AI Platform connects data across locations, it allows companies to compare performance, share best practices, and drive continuous improvement at scale. At Lindt, that means tighter process control, fewer quality issues, and more confident decision-making across its global operations.
Lorenz sees tremendous promise in Industrial AI Platforms. “The impact can be extremely large, especially in terms of bringing different definitions and data sources together,” he says. “When everyone understands what a KPI means, knows where the data comes from, and can trust it, it becomes much easier to make decisions and solve problems together.”
Marzoni agrees, highlighting the power of contextualized data to help everyone on the factory floor. “Manufacturing analytics is not a question of choosing between technology or people,” he says. “It’s about empowering people with the right tools so they can perform at 10x what they do today, while continuously feeding the right knowledge and context back into the system.”
In the end, an Industrial AI Platform gives manufacturers something they’ve long been missing: a clear, unified view of their operations. It bridges the gap between IT and OT, between legacy systems and new AI-driven tools. Instead of trying to tame the complexity of modern manufacturing, it leans directly into it—using contextualization to drive operational and business impact.
Arman Pour Tak Dost is a Go-To-Market Manager at EthonAI. He is particularly focused on the practical applications of Industrial AI across short- and long-term horizons to create lasting competitive advantage.