The global adoption of the core technologies of the Fourth Industrial Revolution (Industry 4.0) accelerated remarkably from the early days of the COVID-19 pandemic and is now moving ahead faster than expected. It brings with it the promise of "smart everything": a technical paradigm in which the machines we use are interconnected and able to publish and subscribe to information from other machines through the "Internet of Things", or simply IoT.
The IoT, which has already developed into a multi-billion-dollar market with no slowdown in the foreseeable future, has brought modern automation to classic manufacturing environments by connecting Operational Technology to the internet. It has exposed industrial sensing and control equipment to the internet and the world of virtualization. The data provided by these smart machines is collected and ingested by a new kind of master data management system: the Digital Twin.
A Digital Twin is a digital (virtual) representation of a physical system or object in real time. The concept's origins are often attributed to decades of science fiction, but the first person known to have suggested its application in manufacturing is Michael Grieves, at the University of Michigan in 2002.
NASA then pioneered the first practical definition of Digital Twins in 2010, creating full-scale digital simulations of space capsules to replicate and analyze issues the capsules could run into, or were already dealing with.
Over the course of the past decade, Digital Twins have expanded to physical manufacturing, urban planning, construction, healthcare, automotive, and virtually anything else that can be imagined.
Market and technology trend researchers such as Gartner, Forbes, and McKinsey have recognized Digital Twins as a top strategic technology trend since 2017, predicting that during the first half of the 2020s Digital Twins would virtualize billions of things and that their market would grow across all sectors for the foreseeable future.
A Digital Twin is a software model that ingests real-time, real-world data about a physical system or object and produces predictions or simulations of the future state of that system or object based on the data.
The Digital Twin of a physical system or object is based on a Digital Thread, the lowest-level design and specification for a Digital Twin and the key to maintaining the model's accuracy. The Digital Thread is also used to trace the Digital Twin's simulated behavior and characteristics back to its initial (and later continuously revised and optimized) requirements and components.
An Engineering Change Order (ECO) is used to process changes to a product's design; it results in a new version of the item's Digital Thread, which in turn leads to a new version of the Digital Twin.
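One way to picture this thread-to-twin versioning relationship is as an immutable design record that an ECO never mutates, only supersedes. The sketch below is purely illustrative; the class and field names (`DigitalThread`, `apply_eco`, the pump product) are hypothetical and not taken from any PLM product.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class DigitalThread:
    """Immutable design/specification record a Digital Twin is built from."""
    product_id: str
    version: int
    spec: dict  # design parameters: dimensions, materials, tolerances, ...

@dataclass
class DigitalTwin:
    thread: DigitalThread

def apply_eco(twin: DigitalTwin, changes: dict) -> DigitalTwin:
    """An Engineering Change Order yields a new Digital Thread version,
    and therefore a new version of the Digital Twin."""
    new_thread = replace(
        twin.thread,
        version=twin.thread.version + 1,
        spec={**twin.thread.spec, **changes},
    )
    return DigitalTwin(thread=new_thread)

# Hypothetical product: an ECO enlarges the impeller of a pump design.
t0 = DigitalTwin(DigitalThread("pump-7", 1, {"impeller_mm": 120}))
t1 = apply_eco(t0, {"impeller_mm": 125})
```

Keeping the thread immutable means every twin version remains traceable to the exact requirements it was built against, which is the traceability role the Digital Thread plays in the text above.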
The Digital Twin Prototype (DTP) contains the designs, analyses, and processes needed to replicate a real-world system or product, and is created before the physical system or product is implemented (or manufactured).
The Digital Twin Instance (DTI) is the Digital Twin of each individual instance of the system or product once it is implemented (or manufactured).
The Digital Twin Aggregate (DTA) is the synchronized aggregation of Digital Twin Instances, whose data and functions can be used in implementing the physical system or product, predicting its behavior, and simulating its future states.
Based on the integration level and the degree of information and data flow between a digital copy and its physical counterpart, Digital Twins are classified into three subcategories: Digital Model (DM), Digital Shadow (DS), and Digital Twin (DT).
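In the commonly cited version of this classification, the three levels differ in which direction data flows automatically: a Digital Model exchanges data only manually, a Digital Shadow receives automatic updates from the physical object, and a full Digital Twin synchronizes automatically in both directions. A minimal sketch of that decision rule (the function and enum names are illustrative):

```python
from enum import Enum

class SyncLevel(Enum):
    """Integration levels, distinguished by which data flows are automatic."""
    DIGITAL_MODEL = "manual exchange in both directions"
    DIGITAL_SHADOW = "automatic physical-to-digital flow only"
    DIGITAL_TWIN = "automatic flow in both directions"

def classify(auto_physical_to_digital: bool,
             auto_digital_to_physical: bool) -> SyncLevel:
    """Classify a digital copy by the automation of its data links."""
    if auto_physical_to_digital and auto_digital_to_physical:
        return SyncLevel.DIGITAL_TWIN
    if auto_physical_to_digital:
        return SyncLevel.DIGITAL_SHADOW
    return SyncLevel.DIGITAL_MODEL
```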
The Digital Twin is essentially a logical construct, meaning the actual data it needs may live in other applications. The context and construct of the information contained in Digital Twins depend on their use cases and are strongly industry-oriented. In the workplace, Digital Twins are often part of Robotic Process Automation (RPA) and the broader hyper-automation trend.
The life cycle of a Digital Twin starts in Data Science and Machine Learning pipelines, where a mathematical model is built from research into the physical characteristics and behavioral factors that govern the system or object being replicated, so that the real world can be simulated in a digital (virtual) workspace.
To maintain its accuracy and relevancy, a Digital Twin is implemented to receive data from sensors on its real-world counterpart. This makes IoT a key partnering component and a supporting factor in the model's quality and recency: it lets the Digital Twin simulate the real-world system or physical object in real time and generate insights into its performance and potential problems, supporting continuous improvement and optimization of the physical twin.
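The core of that feedback loop can be sketched in a few lines: a twin instance ingests each sensor reading, re-estimates a state parameter, and flags anomalies. The example below simulates the sensor stream with random numbers standing in for an IoT feed (e.g. over MQTT); the class name, the bearing-temperature scenario, and the 80° threshold are all hypothetical.

```python
import random
import statistics

class TwinInstance:
    """Minimal twin state that re-estimates a parameter (here, mean
    bearing temperature in Celsius) from incoming sensor readings."""

    def __init__(self):
        self.readings = []

    def ingest(self, reading: float) -> None:
        """Called for every message arriving from the physical twin."""
        self.readings.append(reading)

    def estimate(self) -> float:
        return statistics.fmean(self.readings)

    def anomalous(self, threshold: float = 80.0) -> bool:
        """Flag the physical asset when the estimate drifts past a limit."""
        return self.estimate() > threshold

# Simulated sensor stream standing in for a live IoT data feed.
random.seed(0)
twin = TwinInstance()
for _ in range(100):
    twin.ingest(random.gauss(mu=65.0, sigma=2.0))
```

In a production deployment the `ingest` call would be wired to a streaming platform rather than a loop, but the shape of the loop - receive, update state, check for anomalies - is the same.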
Most IoT solutions focus on turning machine-generated big data into analytical insight. Engineers can then create dashboards and reporting solutions to gain business intelligence around these machines and how they operate. Digital Twins provide a real-time line of sight into what is happening with physical objects, which can significantly reduce maintenance costs.
Research shows that nearly 90% of Digital Twin deployments run simulations and over half of them are or will be integrated with other Digital Twins, creating comprehensive “compound” Digital Twins of an organization’s entire operations.
It is important to note that Digital Twins incorporate transactional data such as maintenance records, as well as real-time information, process simulation data, geospatial data, time-series data, and analytics.
One caveat is that Digital Twins are not silver bullets: they can create unnecessary complexity and become technology overkill for some business problems. Building Digital Twins is also complex, and there is no standardized platform for doing so. In contrast with many emerging technologies, commercial Digital Twin offerings may come from some of the largest companies in the field, which are experts in their own domains but may not be as strong in the supporting technologies Digital Twins need to function optimally.
Sometimes the key impediment to implementing Digital Twins is a gap in the "digitalization" process itself. Not all industries have been mandated by regulatory bodies, or have traditionally had economic incentives, to implement real-time remote monitoring, and for them it would now be a serious burden to convert the manually managed spreadsheets, drawings, and ad hoc analytical tools they use into a comprehensive digital model that can be monitored, referenced, and revised remotely.
Another main issue is that some classic manufacturing organizations suffer from a misconception that their traditional monitoring and control mechanisms (e.g., a Distributed Control System (DCS) or a Supervisory Control and Data Acquisition (SCADA) platform) contain all the data they need.
They then fall into the trap of assuming that by connecting these systems they can establish remote data access for visualization and analytics, when in reality only part of their operational data flows through those platforms, and not all of it is remotely accessible. Other organizations are even further from having all the data they need digitized for remote access or real-time operational insight.
IBM has been presenting Digital Twins as part of its IoT offering over the past few years. Microsoft has recently extended its own Digital Twins as part of its Azure cloud services. General Electric has been developing Digital Twins internally as part of its jet-engine manufacturing process and has started offering that technology and expertise to clients. Its "digital wind farm" created new ways to improve productivity for its aerodynamic products, and it also uses Digital Twins to experiment with the configuration of each wind turbine prior to construction. Its goal is a 20% gain in efficiency from analyzing the data each turbine feeds to its virtual equivalent.
Ganesh Bell, Chief Digital Officer at General Electric, says: "For every physical asset in the world, we have a virtual copy running in the cloud that gets richer with every second of operational data".
Several other technology organizations, at various stages of progress, are also trying to establish platforms or end-to-end pipelines for Digital Twins.
Digital Twin Use Cases
Digital Twin business applications are found in a number of sectors, including (but not limited to) manufacturing, healthcare, automotive, and government.
The arrival of Digital Twins has disrupted the entire product lifecycle management (PLM) discipline and has enabled companies to create a digital footprint of their products from early design, through manufacturing, and into servicing and operations. This has significantly raised the efficiency of PLM at every stage of the process and reduced costs at many points.
A large array of sensors is placed throughout the physical manufacturing process, collecting and sending data on different aspects such as environmental conditions, behavioral characteristics of the machines, and the quality of the work being performed. All this data is continuously communicated to and ingested by the Digital Twin, enabling it to serve as a replica of live occurrences in the factory.
The exponential growth of the IoT market has driven a significant drop in sensor costs, lowering the barrier to experimenting and innovating with them and driving the future digital evolution of the manufacturing industry. Digital Twins have also promoted autonomy in manufacturing, which in turn allows production systems to respond to unexpected events efficiently and intelligently.
As in manufacturing, Digital Twins in the healthcare industry were originally used for product and equipment prognostics, and they have enabled a more data-driven approach to healthcare. Rising computational power and falling costs have enabled healthcare organizations to build personalized models for patients, which dynamically adjust and recalibrate based on tracked health and lifestyle parameters.
Digital Twins have created "virtual patients", with live, realistic, detailed, and relevant data on the health status of individual patients, allowing healthcare researchers and regulatory bodies to compare, rank, and analyze individual records and their aggregate classifications in search of patterns and trends for optimization.
Digital twins have enabled healthcare service providers to customize their services based on observed and predicted responses of individual patients, leading to better and more efficient care.
When it comes to using Digital Twins in healthcare, certain regulatory and fairness caveats need to be considered as well, since the technology's cost may put it out of reach for some patients. Also, using Machine Learning models for service customization may introduce discrimination issues hidden in the models' training data.
As a special case of manufacturing, Digital Twins in the automotive industry facilitate process optimization and reduce marginal costs during manufacturing, and continue to do so through predictive maintenance once the product hits the road and is in service.
As the automotive industry shifts toward "computer-on-wheels" designs, in which a vehicle is essentially a powerful computing machine with the latest edge technology and a large battery to power it, Digital Twins become even more sophisticated: with the addition of Artificial Intelligence, they perform computational work on the go, in continuous connection with other vehicles through the Internet of Things and 5G-integrated cloud services.
A growing number of governments have started using Digital Twins to create digital representations of different aspects of their services. The British government is using Digital Twins to replicate a live virtual model of the country's economy.
This has also extended to several cities, which are virtualizing public services and internal operations. Digital Twins help urban planners better understand and improve the efficiency of resource distribution channels (such as water and electricity), along with many other applications that can improve life for citizens now and in the future.
Digital Twins and Remote Working Paradigm
One of the key benefits of using Digital Twins, which has seen massive, accelerated growth due to the pandemic, is their use in training staff remotely and away from the physical location of an organization.
Digital Twins facilitate the live interaction that staff need to go through onboarding, and even as they ramp up into their roles, letting them contribute to the organization without having to be onsite.
That also helps organizations expand the geographic dispersion of their teams and hire from wider areas, since commuting is no longer a key factor and many staff can work and continue to serve remotely.
Digital Twins and Artificial Intelligence
Digital Twins need real-time analytical processing of big data streamed from thousands of IoT sensors collecting live data from many points in parallel. The computation needed to process this amount of complexity in real time goes above and beyond traditional pre-programmed computing.
That is why organizations use Machine Learning algorithms and cloud-based big data streaming technologies to allow multiple entry channels of live data and real-time processing of it. ML models are built from observed behavior and historical data rather than design information alone.
ML models also allow engineers to quickly evaluate possible design alternatives and try a range of possible best fits based on the results of the algorithms. Digital Twins provide manufacturers with the data they require to make real-time production optimization decisions.
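As a toy illustration of modeling observed behavior rather than design data, the snippet below fits a hand-rolled least-squares line to a (hypothetical, invented) history of vibration readings and projects when the signal will cross a maintenance limit - the kind of real-time optimization decision the text describes, reduced to its simplest form.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b, written out by hand."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

# Hypothetical operational history: vibration amplitude (mm/s)
# drifting upward with accumulated run hours.
hours = [0, 100, 200, 300, 400]
vibration = [1.0, 1.2, 1.4, 1.6, 1.8]
a, b = fit_line(hours, vibration)

def hours_until(limit: float) -> float:
    """Predict the run hours at which vibration crosses a maintenance limit."""
    return (limit - b) / a

predicted = hours_until(2.6)  # schedule maintenance before this point
```

Production systems would of course use richer models (and far messier data), but the principle - learn a trend from the physical twin's history, then query the model instead of waiting for the failure - is the same.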
How is it done?
To offer a simplified view of how a Digital Twin is created in a manufacturing environment, we can set the starting point as an existing 3D simulation model, created using a product lifecycle management (PLM) platform or CAD (Computer-Aided Design) software. These models typically describe what the physical product will look like and provide dimensions, weight, and descriptions of the materials to be used in construction.
We then begin to create a Digital Twin of the physical product by overlaying the existing 3D simulation model with real-time data from associated sensors, deployment details, and operational conditions.
The problem is that regular software tools would need several hours, if not several days, to complete simulations with the data obtained from operational and sensory inputs. This can render design optimization, design-space exploration, and what-if analyses practically impossible.
That is why we use AI-enhanced "Reduced Order Modeling", also called "Surrogate Modeling", to replicate the behavior of the complex simulation model with as much practical accuracy as possible, in a much faster way. We then apply reduction methods such as truncation, response-surface methodologies, and ensemble-based heuristics. This approach allows engineers to experiment with many architectural implementations and verify the results in the time a single original simulation run would have taken.
Using the real-world data with surrogate models lets us optimize design parameters and predict behavior changes in real circumstances, like wear and tear, and update the Digital Twin with the information.
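A minimal sketch of the surrogate idea: run the expensive simulation offline at a handful of design points, then answer the many online queries from a cheap interpolating surrogate instead. Everything here is illustrative - the "expensive" function is a stand-in for a slow solver, and a real surrogate would use richer reduction methods than piecewise-linear interpolation.

```python
import bisect
import math

def expensive_simulation(x: float) -> float:
    """Stand-in for a slow physics solver (imagine hours of CFD per call)."""
    return math.sin(x) * math.exp(-0.1 * x)

# Offline phase: run the costly model at a coarse grid of design points.
xs = [i * 0.5 for i in range(13)]          # design parameter from 0.0 to 6.0
ys = [expensive_simulation(x) for x in xs]

def surrogate(x: float) -> float:
    """Cheap piecewise-linear surrogate queried in place of the solver."""
    i = min(max(bisect.bisect_right(xs, x) - 1, 0), len(xs) - 2)
    t = (x - xs[i]) / (xs[i + 1] - xs[i])
    return ys[i] + t * (ys[i + 1] - ys[i])

# Online phase: sweep hundreds of candidate designs almost for free,
# something that would be infeasible against the original simulation.
best = max((k * 0.01 for k in range(601)), key=surrogate)
```

The design choice being illustrated is the split itself: a fixed, bounded number of expensive runs up front buys unlimited cheap what-if queries afterward, which is what makes design-space exploration tractable.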
We then integrate the surrogate models, algorithms, and real-world data into a product model. The model needs to be executed periodically to stay up to date and relevant, and to be able to update the Digital Twin's operational model. Unfortunately, the integration part is easier said than done: the lack of standards severely hampers teams' ability to integrate the models from all the tools in the Digital Twin flow, and so far no international or national standards institution has tried to remedy that shortcoming.
Another hurdle in some cases is the lack of large amounts of good-quality data for training the models. Digital Twins bring real-world physical experience to the simulation, but ingesting that data can be a challenge when the model to be simulated is not yet a field-tested product.
Sometimes the model complexity needed to mimic the real world results in high complexity in the AI framework and dramatically raises the need for relevant, high-quality training data, which can be quite a problem in several sectors. The good news is that new approaches have recently emerged that can train on smaller amounts of data.
Digital Twins depend on advanced technology to operate effectively, but digitalization does not need to be a one-shot, complex, and expensive process. It should make the staff's job easier, deliver faster results, and provide an easy-to-learn framework.
Starting small with what we have, then taking incremental steps in the digital transformation by investing in technologies like AI, IoT, and advanced analytics tools, allows for earlier victories, helps us learn from failures, and builds the case for funding larger digitalization steps.
We are now effectively witnessing a second wave of Digital Twin technologies with revolutionary features: Digital Twins no longer just predict overall product performance based on their operators' preferences, but also adapt to and predict the performance and state of key individual components of the systems, leading to user-specific, individualized performance enhancements.
Article written by Arman Kamran, CTO of Prima Recon, Professor of Transformative Technologies, and Enterprise Transition Expert in Scaled Agile Digital Transformation