The Transforma Insights team has been actively examining the opportunities for Digital Twin for many years. Just this week we published our Digital Twin Key Topic Insight Report, and at the end of May we joined the Digital Twin Consortium.
In a previous blog, 'An abundance of Digital Twins', we discussed the many Digital Twins that are likely to emerge to support different use cases (and different service providers) associated with the same piece of equipment. The graphic below illustrates how different pieces of equipment on a production line can be associated with different digital twins for different purposes: in this example there is one twin for managing the equipment, another for managing the process, and another for managing the building that contains the production line.
There may be other associated digital twins too. For example, a twin used by a specialist service and maintenance company, or one used by an electricity (smart grid) supplier that needs information to manage demand response. The list is almost endless.
In all cases where data is shared between parties on the basis of custom integrations, there will be issues to manage when one party changes its process or issues a software upgrade: solution partners will expect information to be shared in the same formats (and with the same timeliness, accuracy, and so on) that were envisaged when the custom integration was built. And, of course, every such custom integration would have to be built from the ground up in the first place.
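As a minimal sketch of this brittleness, consider a hypothetical consumer that parses a partner's data feed assuming the format that was agreed when the integration was built. The function name, payload shapes, and units here are all illustrative assumptions, not taken from any real system:

```python
# Hypothetical sketch: a custom integration hard-codes the payload format
# agreed at build time. A later "upgrade" by the partner silently breaks it.
def parse_power_reading_v1(payload: dict) -> float:
    # Assumes the originally agreed format: {"power_kw": <float>}
    return payload["power_kw"] * 1000.0  # convert to watts

# Works against the format envisaged when the integration was built.
ok = parse_power_reading_v1({"power_kw": 1.2})  # 1200.0 W

# After the partner restructures its payload, the integration fails
# until it is rebuilt by hand.
try:
    parse_power_reading_v1({"power": {"value": 1.2, "unit": "kW"}})
    broke = False
except KeyError:
    broke = True
```

Every pair of partners sharing data this way carries the same maintenance burden, which is why the number of fragile point-to-point integrations grows quickly as twins multiply.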
With such a diversity of potential digital twins for different purposes, and in such a dynamic environment, it's clear that without some kind of standards (de jure or de facto) in place, the overall space could become hugely complex, potentially with different digital twins built from the ground up for specific use cases, specific equipment types, specific manufacturers, and so on. That would be hugely inefficient, and would create significant friction for adoption across multiple Digital Transformation domains.
Standardisation has the potential to eliminate much of this complexity. But how might standardisation in the digital twins space work?
In our view, the most suitable approach is likely to be the standardisation of APIs for accessing defined information types (for example, 'real-time power consumption'). This has the advantage of ensuring long-term, stable access to those information types, while not significantly inhibiting innovation by any participant in an already-deployed digital twin solution. The list of standardised APIs would evolve over time, both within and across industries.
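The idea of a standardised API for a defined information type can be sketched as follows. The class names, the 'real_time_power_consumption' identifier, and the internal sensor representation are all hypothetical assumptions used to illustrate the separation between a stable published contract and vendor-internal implementation detail:

```python
from dataclasses import dataclass

# Hypothetical standardised information type: a stable, versioned contract
# for one defined quantity, independent of any vendor's internals.
@dataclass(frozen=True)
class InformationType:
    name: str     # e.g. "real_time_power_consumption"
    unit: str     # e.g. "W"
    version: str  # contract version; vendors can innovate behind it

class EquipmentTwin:
    """Illustrative vendor twin exposing data only via standardised types."""

    SUPPORTED = {
        InformationType("real_time_power_consumption", "W", "1.0"),
    }

    def __init__(self) -> None:
        # Vendor-internal representation (kilowatts); free to change at
        # any time without breaking consumers of the standardised API.
        self._raw_sensor_value_kw = 1.2

    def read(self, info: InformationType) -> float:
        if info not in self.SUPPORTED:
            raise KeyError(f"unsupported information type: {info.name}")
        # Only the published contract (watts) is fixed.
        return self._raw_sensor_value_kw * 1000.0

twin = EquipmentTwin()
power_w = twin.read(InformationType("real_time_power_consumption", "W", "1.0"))
```

The design point is that consumers depend on the information type, not on any vendor's data model, so a software upgrade behind the API does not break already-deployed twins.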
Closely related to this, there’s also the consideration of who ‘owns’ digital twin information.
At this point, it is helpful to introduce two new definitions:
If the market evolves in the way that we are suggesting (with creeping standardisation of APIs), then there’s an obvious answer to who should own digital twin information: the Primary Digital Twin for any asset should be owned by the owner of that asset. The owner of the asset would then authorise a range of third parties (including any original equipment manufacturer) to access relevant information (via APIs) to populate the different digital twins that are needed to support associated use cases.
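This ownership model can be sketched in a few lines. The class, party names, and grant mechanism below are illustrative assumptions, intended only to show the shape of owner-controlled authorisation of third-party API access:

```python
# Hypothetical sketch: the asset owner holds the Primary Digital Twin and
# grants third parties (e.g. an OEM or grid operator) access to specific
# information types via APIs.
class PrimaryTwin:
    def __init__(self, owner: str) -> None:
        self.owner = owner
        self._data = {"real_time_power_consumption": 1200.0}  # watts
        self._grants: dict[str, set[str]] = {}  # party -> allowed info types

    def grant(self, party: str, info_type: str) -> None:
        """Owner authorises a third party to read one information type."""
        self._grants.setdefault(party, set()).add(info_type)

    def read(self, party: str, info_type: str) -> float:
        if info_type not in self._grants.get(party, set()):
            raise PermissionError(f"{party} not authorised for {info_type}")
        return self._data[info_type]

plant = PrimaryTwin(owner="FactoryCo")
plant.grant("GridOperator", "real_time_power_consumption")
```

Here the asset owner remains the gatekeeper: each third-party twin is populated only with the information types the owner has explicitly granted, while ungranted parties are refused.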