Big data can play a key role in insurance by enabling insurers to analyse the data they hold and help avoid large-scale claims. The technology is becoming increasingly relevant to underwriting, pricing risk, and incentivising risk reduction, alongside related technologies such as image recognition and fraud detection.

Listed below are the key technology trends impacting the big data in insurance theme, as identified by GlobalData.

Cloud

Cloud providers offer a range of services for big data analysis. As computing moves from in-house corporate data centres to third-party cloud data centres, corporations need to buy less of their own networking gear. Moreover, the big cloud data centres run by the likes of Amazon and Microsoft increasingly use custom networking kit from smaller suppliers such as Arista Networks rather than purchasing, for example, an end-to-end Cisco network.

Insurers are generally reliant on old legacy systems, which have held back innovation. Cloud technology offers insurers the chance to move faster and improve storage and management of the vast amounts of customer data they hold.

Blockchain

It is still early days, but security is one of the factors driving blockchain adoption in the big data context. The critical flaw of traditional credit reporting agencies is that all the information sits in one place. Blockchain is distributed and decentralised, so no single loss would be catastrophic. But, like many new technologies, blockchain raises new security risks even as it addresses existing ones.

While the protocol underpinning Bitcoin, the most well-known implementation of blockchain, has proven robust, the exchanges and wallets built around it have been hacked on numerous occasions, including in December 2020, when EXMO revealed that cryptocurrencies had been stolen from its wallets. The total lost amounted to around 5% of EXMO’s assets, or roughly $10.5m.

Artificial intelligence (AI) for data quality and data readiness

One of the benefits of using AI is that it can significantly improve data quality. Such improvement is needed within any analytics-driven organisation, where the proliferation of personal, public, cloud, and on-premises data has made it nearly impossible for information technology (IT) to keep up with user demand.

Companies want to improve data quality by taking advanced design and visualisation concepts typically reserved for the final product of a business intelligence solution, namely dashboards and reports, and putting them to work at the very beginning of the analytics lifecycle. AI-based data visualisation tools, such as Qlik’s Sense platform and Google Data Studio, are enabling enterprises to identify critical data sets that need attention for business decision-making, reducing human workloads.
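The internal workings of tools such as Qlik Sense and Google Data Studio are proprietary, but the underlying idea of automatically surfacing records that need attention can be sketched. Below is a minimal, hypothetical Python illustration using scikit-learn's IsolationForest; the column names and anomaly share are assumptions made for illustration rather than features of any vendor's product.

```python
# Minimal sketch of automated data-quality flagging for insurance records.
# The column names ("premium", "claim_amount", "customer_age") and the 2%
# anomaly share are illustrative assumptions, not any vendor's defaults.
import pandas as pd
from sklearn.ensemble import IsolationForest

def flag_suspect_records(policies: pd.DataFrame) -> pd.DataFrame:
    """Return policy rows whose values are missing or look anomalous."""
    feature_cols = ["premium", "claim_amount", "customer_age"]

    # Rows with missing values are the simplest data-quality failures:
    # flag them directly rather than feeding them to the model.
    missing = policies[policies[feature_cols].isna().any(axis=1)]
    clean = policies.dropna(subset=feature_cols)

    # Isolation Forest scores each remaining row by how easily it can be
    # isolated; the most isolated ~2% are flagged for human review.
    model = IsolationForest(contamination=0.02, random_state=0)
    labels = model.fit_predict(clean[feature_cols])
    outliers = clean[labels == -1]

    return pd.concat([missing, outliers])
```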

Data centre interconnect (DCI)

As more data centres come online around the world, the need to transfer data between them at increasingly high speeds grows. As a result, the DCI market faces huge demand for ever-faster optical links and transceivers. These data transfer speeds are especially important because around 70% of all data centre traffic is east-west (server-to-server) traffic, so link speed has a marked effect on how quickly a data centre operates overall. The primary use case of DCI remains connecting data centres, but it has recently expanded into areas such as capacity boosting.

Given how interconnected data centres are, this optical upgrade is likely to happen globally within a relatively short time. Over the next two to three years, a shortage looms of the optical interconnect devices that shuttle data at very high speeds within and between data centres and are vital to avoiding internet bottlenecks. The main beneficiaries could be the leading manufacturers of lasers and optical and photonic products, including Applied Optoelectronics, Ciena, Corning, Finisar, Lumentum, Infinera, Oclaro, NeoPhotonics, and VIAVI Solutions.

Container software

The most important development on the open-source software front is the arrival of operating system container software. Containers package a software application together with its dependencies and can be run on most Linux or Windows servers. They enable applications to be easily moved between different IT infrastructures: on-premises, public cloud, private cloud, or bare metal.

A container’s appeal lies in the fact that you can build it once and run it anywhere. Containers are revolutionising the way applications are developed and redefining the concept of cloud-native applications. The key benefits of container technology for app developers are significant cost savings compared to running virtual machines, reduced time to deployment, better scalability, and flexibility to port to other infrastructures.
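To make the "build once, run anywhere" idea concrete, the sketch below uses Docker's official Python SDK to build an image from a local Dockerfile and run the resulting container; the image tag and build path are placeholders rather than references to a real project.

```python
# Illustrative sketch: build a container image and run it with Docker's
# Python SDK (pip install docker). The tag "pricing-service:latest" and the
# build path "./app" are placeholder assumptions.
import docker

client = docker.from_env()  # connects to the local Docker daemon

# Build an image from ./app/Dockerfile, packaging the application
# together with its dependencies.
image, _ = client.images.build(path="./app", tag="pricing-service:latest")

# The same image can then run anywhere a container runtime is available:
# on-premises, public cloud, private cloud, or bare metal.
output = client.containers.run("pricing-service:latest", remove=True)
print(output.decode())
```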

5G

The full-scale mainstream adoption of 5G, which is still a few years away, has the potential to increase data consumption globally. 5G can become essential for the insurance industry and enable connected insurance, whereby all of a person’s insurance products can be linked up. It can also drive live data insights that improve preventative insurance; for example, by flagging drivers as tired if they are blinking more than usual, or warning of a potential collision ahead.
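As a toy illustration of the kind of live-data rule such a system might apply, the sketch below flags a driver as potentially fatigued when the blink count in a rolling window rises well above a personal baseline; the window length and threshold multiplier are invented for illustration only.

```python
# Toy sketch of a live telematics check: flag possible fatigue when the
# driver's recent blink rate is well above their personal baseline.
# The 60-second window and 1.5x multiplier are illustrative assumptions.
from collections import deque

class FatigueMonitor:
    def __init__(self, baseline_blinks_per_min: float, window_s: int = 60):
        self.baseline = baseline_blinks_per_min
        self.window_s = window_s
        self.blink_times = deque()

    def record_blink(self, timestamp: float) -> None:
        """Record a blink event (timestamp in seconds)."""
        self.blink_times.append(timestamp)
        # Keep only blinks that fall inside the rolling window.
        while self.blink_times and timestamp - self.blink_times[0] > self.window_s:
            self.blink_times.popleft()

    def is_fatigued(self) -> bool:
        """True when the rolling blink rate exceeds 1.5x the baseline."""
        blinks_per_min = len(self.blink_times) * 60 / self.window_s
        return blinks_per_min > 1.5 * self.baseline
```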

Data as a service (DaaS)

Analytics and business intelligence tools typically require data from a single, high-performance relational database, but most organisations, particularly legacy retail banks and insurers, hold data across multiple solutions, formats, and sources. This fragmentation leads to numerous challenges, including increased infrastructure costs, low architectural flexibility, greater data complexity, complex data governance, and longer times to move data between systems.

DaaS, a cloud service that provides users with on-demand access to data, helps enterprises address these challenges. It is typically deployed alongside data lakes, which reduce data storage and management costs and enhance data quality.
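DaaS providers each define their own interfaces, but the basic pattern of pulling curated data on demand can be sketched as follows; the endpoint, dataset, and credentials are hypothetical placeholders, not a real provider's API.

```python
# Hypothetical sketch of consuming a DaaS endpoint on demand. The URL,
# dataset name, and token are placeholders; real providers define their
# own APIs, authentication, and query parameters.
import pandas as pd
import requests

DAAS_URL = "https://daas.example.com/v1/datasets/claims-history"

def fetch_claims(region: str, api_key: str) -> pd.DataFrame:
    """Pull a curated claims dataset for one region, on demand."""
    response = requests.get(
        DAAS_URL,
        params={"region": region, "format": "json"},
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=30,
    )
    response.raise_for_status()
    # The provider returns governed, analysis-ready records, so the consumer
    # avoids maintaining its own copies of the source systems.
    return pd.DataFrame(response.json()["records"])
```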

This is an edited extract from the Big Data in Insurance – Thematic Research report produced by GlobalData Thematic Research.