February 15, 2021
Though the digital twin is still in its early developmental stages, the concept generates great interest, promising unprecedented visibility into the state and performance of products and processes throughout their lifecycles. While the technology applies across a full spectrum of industries, one variant, the closed-loop digital twin, seems to hold particular promise for the manufacturing sector (Fig. 1).
The technology raises the prospect of dramatic changes in how machine makers and manufacturers hone their products and optimize equipment and processes. A manufacturing closed-loop digital twin aims to use analytics and data from a broad spectrum of business and operational sources to minimize costs, avoid unplanned downtime and optimize quality, yield and efficiency.
Industry watchers see closed-loop digital twins tightening the connection between the machine builder, line planner and manufacturing organizations. The goal is to help manufacturers learn to use their lines better—to enhance speeds and quality—and promote more efficient supply chains.
“By having a closed loop, one can ensure that insights get into the hands of plant personnel in real time, enabling running of the day-to-day operations most effectively,” says Ranbir Saini, senior director, digital product manager, at GE Digital.
Integrating Manufacturing
The idea behind these closed-loop digital twins is to achieve real-time integrated manufacturing.
The main vehicle for this integration is the virtual model, which promises to enable automation engineers to benchmark production metrics in real time. Ideally, these models include all of the factors that affect efficiency and profitability of production, including data on machines, processes, labor, incoming material quality and order flow.
This level of closed-loop control has become practical as a result of advancements in hardware, software, sensors and systems technologies. These enhancements provide engineers with a more complete data set, including real-time information captured by a broad spectrum of sensors, which play an increasingly important role in the creation and maintenance of closed-loop digital twins.
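One widely used production metric such a model might benchmark in real time is overall equipment effectiveness (OEE), which combines availability, performance and quality into a single score. A minimal sketch, with purely illustrative figures rather than data from any specific plant:

```python
# Minimal sketch: computing overall equipment effectiveness (OEE),
# a common production benchmark a digital twin might track in real time.
# All figures below are illustrative, not from any specific plant.

def oee(planned_minutes, downtime_minutes, ideal_rate_per_min,
        units_produced, units_good):
    """Return (availability, performance, quality, oee)."""
    run_time = planned_minutes - downtime_minutes
    availability = run_time / planned_minutes
    performance = units_produced / (run_time * ideal_rate_per_min)
    quality = units_good / units_produced
    return availability, performance, quality, availability * performance * quality

a, p, q, score = oee(planned_minutes=480, downtime_minutes=48,
                     ideal_rate_per_min=10, units_produced=3888,
                     units_good=3694)
print(f"Availability {a:.0%}, Performance {p:.0%}, Quality {q:.0%}, OEE {score:.1%}")
```

In a closed-loop twin, the inputs to a calculation like this would stream in live from machine and quality data rather than being entered by hand.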
Leveraging these new capabilities, proponents contend, data modeling will be able to provide a single source of truth, as well as the kinds of data required to support analytics and real-time decision-making.
How to Develop a Closed-Loop Digital Twin
Manufacturers can use closed-loop digital twins on many scales, including individual pieces of machinery, entire production lines or plantwide operations.
To learn how an engineering team creates a closed-loop digital twin for individual manufacturing machines, it’s best to look at the development process as it moves through the various phases of the asset’s lifecycle, building the closed loop layer by layer.
“We tend to describe the machine digital twin lifecycles in three phases—design, manufacturing and production,” says Bill Davis, solution director of industrial machinery and heavy equipment industry at Siemens Digital Industries Software.
“It is important to have a digitalization strategy that covers all these phases because information is often created in one phase and leveraged in another. This is a holistic view of the closed loop, but it’s just as important to recognize that each phase has information created, managed and leveraged within it as well.”
A first step in building a closed-loop digital twin in the design phase is the identification of the key elements required to support the digital twin. Here, the engineering team builds a model or a set of models that describe the behavior and capabilities of the machine.
A 3D representation of the machine often meets this requirement, but the reality is that all engineering domains involved produce their own specific models, such as electrical and pneumatics schematics or programmable logic controller/human machine interface (PLC/HMI) programs.
To accommodate this, the engineering team needs a single simulation model that represents all engineering domains, along with the expected machine operations that connect intended use cases to expected outcomes. This virtual machine model combines automation, electrical and kinematic software simulation models (Fig. 2).
The combination of the three software models is referred to as “software in the loop.” This means that engineers can simulate the machine’s behavior as closely as possible before anything is physically built.
In this case, the code running in the virtual PLC and HMI triggers the simulations in the physical or kinematic models, so engineers can graphically see the movement of machine components, as well as product moving through the machine.
This allows the engineers to verify the sequence of operations and error handling. The goal is to prove out the automation code virtually before it is loaded into the physical PLC controlling the machine.
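The flavor of this software-in-the-loop setup can be sketched in a few lines: a stand-in "virtual PLC" repeatedly scans a toy kinematic model and stops the drive when the virtual end sensor trips. All class names, scan timing and machine parameters here are hypothetical illustrations, not any vendor's API:

```python
# Hypothetical software-in-the-loop sketch: a stand-in "virtual PLC"
# steps a toy kinematic model of a conveyor so the sequence of
# operations can be verified before any hardware exists.

class ConveyorModel:
    """Toy kinematic model: position advances while the drive runs."""
    def __init__(self, length_m=2.0, speed_m_s=0.5):
        self.length_m, self.speed_m_s = length_m, speed_m_s
        self.position_m = 0.0
        self.running = False

    def step(self, dt_s):
        if self.running:
            self.position_m = min(self.position_m + self.speed_m_s * dt_s,
                                  self.length_m)

class VirtualPLC:
    """Stand-in PLC logic: run the drive until the end sensor trips."""
    def scan(self, model):
        at_end = model.position_m >= model.length_m
        model.running = not at_end
        return "DONE" if at_end else "RUNNING"

model, plc = ConveyorModel(), VirtualPLC()
state = "RUNNING"
for _ in range(100):                 # up to 100 scan cycles of 0.1 s each
    state = plc.scan(model)
    if state == "DONE":
        break
    model.step(dt_s=0.1)
print(state, round(model.position_m, 2))
```

A real implementation would swap the toy model for the physics or kinematic simulation and the stand-in class for the actual PLC code running on a software PLC, but the verification loop has the same shape.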
Complementing the digital side of the simulation, the model should also incorporate sensor readings to approximate the physical machine’s performance and confirm that the virtual model is consistent with the engineer’s design. The engineer can then use physical data from the real machine to establish a performance baseline.
Once the real machine is built, the engineers have the first opportunity to use the closed-loop digital twin by connecting the physical or kinematic model to the PLC in the machine.
“Here, data from the running machine is triggering the simulations, and the model mirrors the real machine behavior,” says Davis. “We can then verify if the physical or kinematic models are correct and ultimately fine-tune the running of the machine.”
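That verification step can be pictured as a comparison between the cycle times the model predicts and those the running machine reports, flagging any stage that drifts beyond a tolerance. The stage names, timings and tolerance below are illustrative assumptions:

```python
# Hypothetical sketch of the verification step: compare cycle times
# predicted by the virtual model with ones measured on the real
# machine, and flag stages whose deviation exceeds a tolerance so the
# model (or the machine tuning) can be corrected.

def model_drift(simulated, measured, tolerance=0.05):
    """Return stages whose relative cycle-time error exceeds tolerance."""
    flagged = {}
    for stage, sim_t in simulated.items():
        error = abs(measured[stage] - sim_t) / sim_t
        if error > tolerance:
            flagged[stage] = round(error, 3)
    return flagged

simulated = {"index": 1.20, "clamp": 0.40, "press": 2.10}  # seconds
measured  = {"index": 1.22, "clamp": 0.55, "press": 2.12}
print(model_drift(simulated, measured))
```

Here the clamp stage is flagged (37.5% slower than modeled), telling the engineer exactly where the physical or kinematic model, or the machine itself, needs attention.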
The Larger Picture
When developing closed-loop digital twins on larger scales—such as for production lines or plantwide operations—it’s important to mind the distinctions between implementations for individual machines and those for broader applications. Each case has very different practical considerations.
For example, with broader implementations, the information flow should be directed at the productive output of the line or plant as a whole.
“We can collect data over a period of time,” says Davis. “For example, what was the data from last night’s 8-hour shift? This data can then be used for bottleneck analysis and as the basis for a more accurate simulation of how the next shift will run when combined with the new orders for that day.”
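A minimal sketch of the bottleneck analysis Davis describes: given per-station output logged over an 8-hour shift, the station with the lowest effective rate limits the line. Station names and counts are illustrative assumptions:

```python
# Minimal sketch of shift-level bottleneck analysis: the station with
# the lowest effective rate over the shift constrains the whole line.
# Station names and unit counts are illustrative only.

def find_bottleneck(units_per_station, shift_hours=8):
    """Return (station, units/hour) for the slowest station."""
    rates = {s: units / shift_hours for s, units in units_per_station.items()}
    station = min(rates, key=rates.get)
    return station, rates[station]

shift_log = {"filler": 4160, "capper": 3520, "labeler": 4000}
station, rate = find_bottleneck(shift_log)
print(f"Bottleneck: {station} at {rate:.0f} units/hour")
```

In practice this result would seed the next shift's simulation, combined with that day's new orders, rather than stand alone.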
Many factors can complicate large-scale closed loops. For instance, a production line often consists of equipment from multiple machine manufacturers, with varying degrees of complexity and maturity, especially in the case of Internet of Things technology and controls. Simple machines tend to have less control sophistication, and the same is true of older machines.
Further, many manufacturers deploy proprietary manufacturing processes, closely guard the underlying technology and choose to design their own machines. In these cases, organizations must create benchmarks and analytics from scratch, tailor data to the asset’s unique features and set aside some general rules of thumb.
Windows on the Physical World
Sensors are a key underlying component of a closed-loop digital twin; they provide physical data on the asset’s condition throughout its operating life. These bridges to the real world enable engineers and plant managers to remotely monitor, diagnose and resolve service issues, often in real time.
“With the power of sensors, service managers have unprecedented capabilities when it comes to taking proactive maintenance steps in real time,” says Steve Dertien, executive vice president and chief technology officer at PTC.
“By implementing sensors within an existing IIoT [Industrial Internet of Things] system, maintenance engineers have an instant connection to assets across the manufacturing floor, enabling their digital twins to automatically detect anomalous readings and forcing decisive action, thereby reducing downtime and optimizing maintenance planning/execution.”
In one of the first steps in closed-loop digital twin development, engineering teams leverage sensors that are already installed in machines and production lines. This requires an architecture that supports legacy data protocols that existing sensor systems use.
“Here, a case can be made for choosing popular data protocols, such as MQTT [message queuing telemetry transport] or OPC-UA [open platform communications unified architecture], for compatibility with modern analytics software,” says Saini.
“This becomes especially important for near-real-time systems because data protocol translation is a big factor when measuring latency. The system used to collect data should provide modern containerized architecture to provide customers with the ability to vertically scale to provide support for new data protocols on an on-demand basis.”
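The protocol-translation step Saini mentions can be sketched as decoding a legacy register dump into the kind of JSON payload an MQTT client would publish. The register addresses, scale factors, machine ID and topic layout below are all illustrative assumptions, not any standard mapping:

```python
import json

# Hypothetical protocol-translation sketch: a legacy register dump
# (address -> raw integer) is decoded and repackaged as a JSON payload
# of the sort an MQTT client would publish to the analytics layer.

REGISTER_MAP = {
    40001: ("spindle_rpm", 1.0),      # raw value already in rpm
    40002: ("motor_temp_c", 0.1),     # raw value in tenths of a degree
    40003: ("vibration_mm_s", 0.01),  # raw value in hundredths of mm/s
}

def translate(raw_registers, machine_id="press_07"):
    """Decode legacy registers into an MQTT-style topic and JSON payload."""
    fields = {}
    for addr, raw in raw_registers.items():
        name, scale = REGISTER_MAP[addr]
        fields[name] = raw * scale
    topic = f"plant/{machine_id}/telemetry"
    return topic, json.dumps(fields)

topic, payload = translate({40001: 1800, 40002: 615, 40003: 42})
print(topic, payload)
```

Keeping this translation thin and fast matters precisely because, as Saini notes, it sits on the latency-critical path of a near-real-time system.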
If the developers want to go one step further and add new sensors to the mix, the process becomes more complicated.
“In some brownfield situations, it may not be possible to add new sensing devices to existing control systems,” says Colm Gavin, portfolio development manager at Siemens Digital Industries Software.
“So, the simplest way to start would be to connect any new sensors to a small PLC, which can be used to feed the data up to a higher-level system, such as a SCADA [supervisory control and data acquisition] system,” he says.
Once an architecture is in place to ensure adequate access to sensor data, development teams must contend with noise and relevance issues in that data, which arise because sensors and data acquisition tools often generate far more data than is needed to support a digital twin.
“Imagine having sub-second time series data for a collection from 100-plus different sensors on a product,” says Jonathan Scott, a digital transformation evangelist at Razorleaf. “Some of those data points are useful in characterizing the digital twin, but most are just noise. Tools like Hadoop for summarizing big data are useful for this challenge.”
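At small scale, the summarization Scott describes amounts to collapsing sub-second samples into one aggregate per time window, keeping only the statistics the twin needs. This standard-library sketch (window size and synthetic readings are illustrative) shows the idea that big-data tools apply at much larger volumes:

```python
from statistics import mean

# Minimal sketch of the summarization step: collapse sub-second sensor
# samples into one aggregate per fixed time window, keeping mean and
# peak and discarding the rest as noise. Values are illustrative.

def summarize(samples, window_s=10):
    """Group (time, value) samples into windows; return (mean, max) per window."""
    buckets = {}
    for t, value in samples:
        buckets.setdefault(int(t // window_s), []).append(value)
    return {w: (round(mean(vs), 2), max(vs)) for w, vs in sorted(buckets.items())}

# 0.5-second samples from a single (hypothetical) vibration sensor
samples = [(t * 0.5, 1.0 + 0.1 * (t % 4)) for t in range(40)]
print(summarize(samples))
```

Forty raw samples reduce to two windowed aggregates; at 100-plus sensors reporting sub-second data, that reduction is what keeps the twin's data volume tractable.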
Engineers must also normalize the data. This means expressing complex environmental inputs in terms that analytical models can use. These issues arise because of the heterogeneous nature of the data streams that the sensors and machines on production lines generate.
“It is not simple to feed sensor data into simulation tools until some normalization takes place,” says Scott. “Evolving standards and multidiscipline analysis and optimization tools should help in this area.”
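One simple form of the normalization step is min-max scaling: readings in different units are rescaled to a common 0-1 range using each channel's known operating limits, so a downstream model sees comparable inputs. The channel names and ranges below are illustrative assumptions:

```python
# Minimal sketch of normalization: heterogeneous readings (different
# units and scales) are min-max scaled into [0, 1] using each
# channel's operating range, so a simulation or analytics model sees
# comparable inputs. Channel names and ranges are illustrative.

OPERATING_RANGE = {
    "temp_c":       (10.0, 90.0),
    "pressure_kpa": (100.0, 600.0),
    "flow_l_min":   (0.0, 40.0),
}

def normalize(reading):
    """Scale each channel into [0, 1] using its operating range."""
    scaled = {}
    for channel, value in reading.items():
        lo, hi = OPERATING_RANGE[channel]
        scaled[channel] = (value - lo) / (hi - lo)
    return scaled

print(normalize({"temp_c": 50.0, "pressure_kpa": 350.0, "flow_l_min": 10.0}))
```

Real deployments layer unit conversion, outlier handling and timestamp alignment on top of this, which is where the standards and multidiscipline tools Scott mentions come in.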
A Place for Edge Devices
A technology closely linked to sensor use in closed-loop digital twins is the edge device. These local systems can capture data from a broad range of sensors and pass it on to higher-level systems, such as enterprise and cloud applications. When coupled with edge devices, sensors gain access to previously unavailable local data processing resources, and provide greater local control than simple store-and-forward devices. Edge devices also give sensors efficient entry points to manufacturers’ operational technology and information technology networks.
In addition, edge devices provide good entry points to any cloud setup (Fig. 3).
“The best tools available for connecting sensor data to the cloud are edge gateways and devices,” says Davis. “Gateways offer a sense of security in that they aggregate and condition the data for cloud analytics. If immediacy is needed, edge devices that can communicate either directly to the machine or through an intermediate PLC are ways to minimize the data leakage and enhance local response when sensors indicate fast response is necessary.”
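The two gateway roles Davis describes, conditioning data for the cloud and responding locally when immediacy is needed, can be sketched in miniature. The class, threshold and action names here are hypothetical:

```python
# Hypothetical sketch of an edge gateway's two roles: aggregate and
# condition raw samples into one summary for the cloud, and react
# locally (no cloud round trip) when a reading demands fast response.
# Threshold, field and action names are illustrative assumptions.

class EdgeGateway:
    def __init__(self, alarm_threshold=95.0):
        self.alarm_threshold = alarm_threshold
        self.buffer = []
        self.local_actions = []

    def ingest(self, value):
        self.buffer.append(value)
        if value >= self.alarm_threshold:          # local response path
            self.local_actions.append(f"slow_machine(value={value})")

    def flush(self):
        """Condition buffered samples into one summary for the cloud."""
        if not self.buffer:
            return None
        summary = {"count": len(self.buffer),
                   "mean": round(sum(self.buffer) / len(self.buffer), 2),
                   "max": max(self.buffer)}
        self.buffer = []
        return summary

gw = EdgeGateway()
for v in [90.0, 91.5, 96.0, 92.0]:
    gw.ingest(v)
print(gw.flush(), gw.local_actions)
```

The local action fires the moment the out-of-range sample arrives, while the cloud only ever sees the conditioned summary, which is the tradeoff between immediacy and bandwidth that Davis outlines.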
All these features make edge devices extremely helpful for plant engineers integrating existing equipment into the broader plantwide closed-loop process.
Challenges, Tradeoffs and Solutions
Given that closed-loop digital twin technology is still new, it should be no surprise that implementing one entails significant challenges and tradeoffs. Some of these stem from conditions on the manufacturing floor. Other concerns involve how time-bound the feedback loop must be.
Many challenges that arise from the manufacturing environment are a result of the age of technologies in play.
“Accessing, standardizing and analyzing the physical and digital data may not be as easy as anticipated, considering that many machines on a factory floor have been there for 20-plus years and their digital definition—including CAD files—may not exist. And on the physical experience end, the sensor data can be corrupted by incomplete information,” says David Immerman, senior research analyst at PTC.
Other challenges stem from the diversity of technologies that go into the digital twin. For example, developers establishing a closed-loop digital twin of a machine in a production line must contend with a daunting mix of hardware platforms that the machine must interact with and support.
Complicating matters further, most hardware manufacturers offer their own proprietary code and authoring tools. As a result, often machine integrators and line builders find themselves writing PLC, HMI and even SCADA integration code from scratch.
Two technologies offering the best prospects for overcoming these hurdles are standards and open tools and architectures.
“The heterogeneity of platforms and interfaces remains the single biggest challenge to overcome,” says Sameer Kher, senior director of R&D at Ansys. “Relying on standards and encouraging open ecosystems is essential to address this challenge.”