
Improving Insurance Profitability by Reducing Losses and Increasing Pricing Accuracy

Worldwide Industry Marketing Manager for Insurance, IBM

Insurance companies are looking to accelerate the speed and increase the precision of catastrophe modeling, the process through which companies determine the exposure of current policies and the probable maximum loss (PML) from a catastrophic event. Catastrophe modeling is vital for setting policy prices and ensuring there are sufficient reserves available to pay future claims.
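To make the PML concept concrete, here is a minimal Python sketch that estimates PML from a simulated year-loss table, taking PML at a return period as the loss level exceeded with the corresponding annual probability. The loss distribution, return periods and dollar figures are illustrative assumptions, not output from any actual catastrophe model.

```python
# Minimal sketch: estimating probable maximum loss (PML) from a simulated
# year-loss table. Losses and return periods are illustrative only.
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical simulated annual losses (in $M) across 10,000 model years.
simulated_annual_losses = rng.lognormal(mean=2.0, sigma=1.2, size=10_000)

def pml(annual_losses, return_period_years):
    """PML at a return period = the loss quantile exceeded with
    probability 1 / return_period_years in any given year."""
    exceedance_prob = 1.0 / return_period_years
    return np.quantile(annual_losses, 1.0 - exceedance_prob)

for rp in (100, 250, 500):
    print(f"1-in-{rp}-year PML: ${pml(simulated_annual_losses, rp):,.1f}M")
```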

“IBM is now helping the banking and insurance industries use advanced disaster modeling algorithms to more accurately plan for disasters -- showing, yet again, how IBM’s heavy investment in analytics and risk management is helping IBM customers compete more effectively while better managing risks.”
Joe Clabby, Clabby Analytics

In the past, actuaries were constrained by modeling environments too slow to run frequently. With help from big data solutions, companies can integrate historical event data, policy conditions, exposure data and reinsurance information. They can analyze trillions of rows of data and rapidly deliver results that underwriters can use to price policies by individual address, proximity to fire stations or other granular parameters, rather than simply by city or state (see the sketch below). Pricing models can be updated more often than a few times a year, and a specific risk analysis can be completed in minutes instead of hours.
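As a rough illustration of address-level pricing, the sketch below adjusts an assumed base rate using per-address attributes such as distance to the nearest fire station. The rating factors, thresholds and field names are hypothetical, not actual underwriting rules.

```python
# Minimal sketch: pricing at the individual-address level rather than by
# city or state. All factors and thresholds below are assumptions.
from dataclasses import dataclass

@dataclass
class Address:
    insured_value: float          # total insured value ($)
    km_to_fire_station: float     # proximity to nearest fire station
    flood_zone: bool              # per-address hazard flag

BASE_RATE = 0.002  # assumed base premium rate per $ of insured value

def address_level_premium(addr: Address) -> float:
    rate = BASE_RATE
    # Granular adjustments that a city- or state-level rate would miss.
    if addr.km_to_fire_station > 8:
        rate *= 1.25
    elif addr.km_to_fire_station > 3:
        rate *= 1.10
    if addr.flood_zone:
        rate *= 1.40
    return addr.insured_value * rate

print(address_level_premium(Address(350_000, 9.5, flood_zone=True)))
```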

[Photo: tornado]

With the ability to examine streaming data, companies can capture and analyze real-time weather and disaster data that might affect losses. Companies can anticipate losses before disasters strike or as events unfold. Insurers gain the ability to model losses at the policy level, calculating the effect of a new policy on the portfolio while it is being quoted.
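The sketch below illustrates this idea in simplified form: as a streamed event report arrives, losses are re-estimated for policies inside the event footprint, and the marginal effect of a policy being quoted is computed against the current portfolio. The footprint radius, damage ratio and policy records are illustrative assumptions.

```python
# Minimal sketch: re-estimating losses as storm reports stream in, and
# gauging the marginal effect of a policy while it is being quoted.
import math

portfolio = [
    {"id": "P1", "lat": 35.22, "lon": -97.44, "insured_value": 400_000},
    {"id": "P2", "lat": 35.60, "lon": -97.50, "insured_value": 250_000},
]

def km_between(lat1, lon1, lat2, lon2):
    # Rough equirectangular distance; adequate for a toy footprint check.
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return 6371 * math.hypot(x, y)

def estimated_loss(policies, event, damage_ratio=0.3):
    """Sum assumed damage for policies inside the event footprint."""
    return sum(
        p["insured_value"] * damage_ratio
        for p in policies
        if km_between(p["lat"], p["lon"], event["lat"], event["lon"]) <= event["radius_km"]
    )

def marginal_impact(policies, candidate, event):
    """Change in estimated loss if the quoted policy were added."""
    return estimated_loss(policies + [candidate], event) - estimated_loss(policies, event)

# Streamed tornado report (illustrative) arrives; losses are re-estimated.
report = {"lat": 35.25, "lon": -97.45, "radius_km": 10}
print("Current estimated loss:", estimated_loss(portfolio, report))
print("Impact of quoted policy:",
      marginal_impact(portfolio, {"id": "Q1", "lat": 35.23, "lon": -97.46,
                                  "insured_value": 300_000}, report))
```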

By way of example, a large global property casualty insurance company wanted to accelerate catastrophe risk modeling in order to improve underwriting decisions and determine when to cap exposures in its portfolio. However, its modeling environment was too slow to handle the data volumes the company wanted to analyze. While the company wanted to run multiple scenarios and model losses in hours, the existing environment required up to 16 weeks to acquire data, model losses, perform analytics and take action. As a result, the company could conduct analyses only three or four times per year.

IBM demonstrated that the company could improve performance by 100 times, accelerating query execution from three minutes to less than three seconds.

The company can now run multiple catastrophe models every month instead of just three or four times per year. Once data are refreshed, the company can create “what-if” scenarios in hours rather than weeks. With a better and faster understanding of exposures and probable maximum losses, the company is able to take action sooner to change loss reserves and optimize its portfolio.

Photo: FEMA