IoB Insiders Could digital twins hold the key to insurance companies building a better picture of customer assets? asks Andy Yeoman, CEO of Concirrus.
With the rise in connected assets and sensors, and the ever-expanding capacity of predictive analytics and machine learning, the possibilities for insurers to know and understand more of the world around them are huge.
We’ve spoken in the past about how this will enable insurers to change their business models, moving from a model of insurance (that is, what they’ve been doing for the past few hundred years) to one of assurance (where risks are reduced or prevented, and where the insurer actively helps the customer to protect their assets).
We’ve also spoken about how ‘the law of unintended consequences’ could mean that hyper-connectivity will bring about new, unforeseen developments. On a macro-level, this may mean that the insurance market develops in unexpected ways – individual players (insurers, customers, original equipment manufacturers, brokers and so on) may rise or fall or be sidelined entirely as each develops new technological capabilities.
But it may also have consequences on a micro level, with our understanding of device and asset behaviour becoming much more detailed. One technology affecting our understanding of devices and assets is ‘digital twin’ technology.
Read more: With connected policies, who owns IoT data?
What is a digital twin, anyway?
Digital twin technology was rated by IT market research company Gartner as one of the top ten strategic technologies of 2017. In effect, it allows companies to create a digital version of a device, machine or system, that can then be used for simulation purposes. This may include diagnosing faults, preventing downtime or generating predictive models that allow greater understanding of its physical, ‘real world’ counterpart.
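At its simplest, the idea can be pictured as a software object that mirrors a physical asset's reported state and can be queried or simulated in its place. The sketch below is purely illustrative: the class, sensor names, and operating thresholds are invented for the example, not drawn from any vendor's product.

```python
# Hypothetical sketch of a minimal digital twin: a software mirror of a
# physical pump that ingests sensor readings and flags likely faults.

class PumpTwin:
    # Assumed nominal operating limits -- illustrative values only.
    MAX_TEMP_C = 90.0
    MAX_VIBRATION_MM_S = 7.1

    def __init__(self, asset_id):
        self.asset_id = asset_id
        self.state = {"temp_c": 0.0, "vibration_mm_s": 0.0}

    def sync(self, telemetry):
        """Update the twin from the latest real-world sensor readings."""
        self.state.update(telemetry)

    def diagnose(self):
        """Return a list of suspected faults based on the mirrored state."""
        faults = []
        if self.state["temp_c"] > self.MAX_TEMP_C:
            faults.append("overheating")
        if self.state["vibration_mm_s"] > self.MAX_VIBRATION_MM_S:
            faults.append("bearing wear")
        return faults

twin = PumpTwin("pump-042")
twin.sync({"temp_c": 95.2, "vibration_mm_s": 3.4})
print(twin.diagnose())
```

The point of the pattern is that the diagnosis runs against the digital mirror, not the machine itself, which is what allowed engineers to reason about remote hardware in the NASA example above.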
An early application of digital twin technology was pioneered by NASA for space exploration, allowing engineers to understand how to diagnose and fix machine faults remotely.
As Bernard Marr of UK-based think tank and consulting organization, the Advanced Performance Institute, writes in Forbes, when disaster struck the Apollo 13 mission, “it was the innovation of mirrored systems still on earth that allowed engineers and astronauts to determine how they could rescue the mission.”
Indeed, the importance of this is difficult to overstate. As Thomas Kaiser, senior vice president of IoT at software giant SAP, states, “Digital twins are becoming a business imperative, covering the entire lifecycle of an asset or process and forming the foundation for connected products and services. Companies that fail to respond will be left behind.”
Read more: What data does a digital twin run on?
So what are the implications for insurance? We’ve gone on record many times to say that insurers that fail to adopt connected technologies will become extinct in the near future. But deciding to adopt connected technologies is only part of the answer – those technologies need to be baked into the company’s business model.
Part of doing this means utilizing information generated by networks of smart devices. If you’re a marine insurer and all of your customers have vessels packed full of engine diagnostic sensors and connected technology, you could, theoretically speaking, create predictive models that tell you how likely each vessel is to have engine problems (and how likely the owner is to make a claim).
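A toy version of such a model might score each vessel's risk of engine trouble from a handful of diagnostic readings. Everything below – the features, weights, bias, and fleet data – is invented for illustration; a real marine insurer would fit a model of this kind to historical sensor and claims data.

```python
import math

# Toy predictive scoring sketch (illustrative weights, not fitted to real data):
# estimate the probability of an engine problem from vessel sensor features.

# Hypothetical features: engine hours since last service, average coolant
# temperature (C), and fault codes logged in the last 30 days.
WEIGHTS = {"hours_since_service": 0.004, "avg_coolant_temp": 0.05, "fault_codes_30d": 0.6}
BIAS = -8.0

def engine_problem_probability(vessel):
    """Logistic model: squash a weighted feature sum into a 0-1 probability."""
    score = BIAS + sum(WEIGHTS[k] * vessel[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-score))

fleet = [
    {"id": "MV Alpha", "hours_since_service": 200, "avg_coolant_temp": 80, "fault_codes_30d": 0},
    {"id": "MV Bravo", "hours_since_service": 1800, "avg_coolant_temp": 95, "fault_codes_30d": 4},
]
for v in fleet:
    print(v["id"], round(engine_problem_probability(v), 3))
```

A recently serviced vessel with no fault codes scores low; one long overdue for service and logging repeated faults scores high – which is the kind of per-asset signal an insurer could fold into pricing or proactive outreach.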
Executives at power systems company Rolls-Royce are already talking about this, applying predictive analytics to maintain efficiency and prevent downtime in the company’s engines. They are also talking about autonomous vessels by 2020, something that would result in huge amounts of digital data being made available for analysis.
In a similar vein, industrial giant GE is applying this technology to its renewable energy business, with predictive maintenance modelling for wind farms.
Read more: GE powers up data insights
What happens next?
So what happens to insurers when this kind of data is produced by customers? How do they ensure that they aren’t sidelined, as other industry players (brokers, for example) become powerful, data-fuelled entities with massive insight into market risks?
The answer lies in an insurer’s ability to translate raw data into behavioural insight. With greater insight into the likely behaviour of (and risk associated with) individual assets, which may in part be generated through the use of digital twins, we can start to see how they might manage risk more effectively and even work with customers to mitigate it. Closer working relationships with corporate customers’ risk management departments are likely to be key here.
This data will also help insurers to model and plan their overall risk portfolio, envisaging likely scenarios and ensuring that an optimal balance of risk is maintained. This ‘digital portfolio’ might help remove some elements of uncertainty within the business, helping it to plan more effective reinsurance policies, for example.
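One simple way to picture such a ‘digital portfolio’ is a Monte Carlo simulation over the book of risks. The per-asset claim probabilities, severities, and reinsurance attachment point below are all assumed figures for the sketch, not real market data.

```python
import random

# Illustrative Monte Carlo sketch of a risk portfolio: simulate annual claims
# to estimate expected losses and how often a reinsurance layer is breached.

random.seed(42)

# Hypothetical portfolio: (annual claim probability, claim severity in GBP).
portfolio = [(0.05, 2_000_000), (0.10, 500_000), (0.02, 8_000_000), (0.08, 1_000_000)]
REINSURANCE_ATTACHMENT = 5_000_000  # losses above this are ceded (assumed figure)

def simulate_year():
    """One simulated year: sum the claims that happen to occur."""
    return sum(sev for prob, sev in portfolio if random.random() < prob)

TRIALS = 100_000
losses = [simulate_year() for _ in range(TRIALS)]
expected_loss = sum(losses) / TRIALS
breach_rate = sum(l > REINSURANCE_ATTACHMENT for l in losses) / TRIALS
print(f"expected annual loss: {expected_loss:,.0f}")
print(f"years breaching reinsurance layer: {breach_rate:.1%}")
```

Simulations of this shape let an insurer see not just the expected loss but the tail scenarios – exactly the information needed to decide where a reinsurance layer should attach.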
But in order to leverage this data, insurers need ways of bringing it all together and making sense of it. This is currently beyond the abilities of many, which means that third-party technology providers have a huge role to play. These companies will act as the intermediary between insurers and data providers (such as customers and manufacturers, as well as industry datasets). The race to achieve this is now on, and the winners will enjoy a period of substantial competitive advantage over the rest of the market.