This article originally appeared here and we at Insurance Nerds think all of our readers would benefit from the insights in it. Thanks to Praedicat for permission to republish.
We recently surveyed some of our customers, and the finding that has stuck with me the most is that 83% of underwriters agree that it is their job to protect their company against the “next asbestos,” or what we call latency catastrophe. The picture that comes to mind is of the brave young Dutch boy with his finger in the dike, trying to stop the flood. I’m inspired by the image, but it’s not the little boy’s job. It is the job of the Dutch flood control authorities. Like the Dutch, the P&C insurer needs a latency catastrophe flood control system: a set of underwriting guidelines developed by management (most effectively with our help, by the way). The underwriter’s job is to write profitable business within those guidelines.
The problem with the idea of the underwriter as the bulwark against the next asbestos is that underwriters operate as expected profit maximizers. They are incentivized to look at an account and, if the premium exceeds the expected loss, write it. The technical underwriter spends time identifying the largest drivers of expected loss and engages in a dialogue with the customer about those risks. They can’t possibly identify and discuss every loss driver, so it is reasonable to focus on the risks with the highest expected losses.
The system falls apart with latency catastrophe. These risks typically have relatively low expected losses for a given account, but potentially catastrophic losses across the portfolio, because they affect many different accounts across multiple underwriters and even multiple policy years. This is not a problem that underwriters are trained to think about, and a different kind of data needs to be made available: data on low-probability latent risks for the customer, and portfolio accumulation data for the insurer.
The more I’ve thought about the underwriter protecting against the next asbestos, though, the more I am inspired by it. If the underwriter has real-time data on the accumulation of specific named risks in the company’s portfolio, she can look for those risks during account underwriting, and then check whether adding the account fits within established accumulation thresholds. If it does, write the account. If it doesn’t, seek an exclusion or some other means of managing the accumulation (more on those other means in a future blog). A bonus is that when the underwriter engages in dialogue with her customers about these large-scale, low-probability risks, she will be working with her customers on the largest risks in society, equipped with the knowledge those customers need and most likely don’t have.
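To make the workflow concrete, here is a minimal sketch of the accumulation check described above. All names here (the `Portfolio` class, its fields, the string outcomes) are illustrative assumptions for this post, not Praedicat’s actual product or API; the point is simply that the decision rule is mechanical once management has set thresholds and accumulation data is in the workflow.

```python
from dataclasses import dataclass, field


@dataclass
class Portfolio:
    """Tracks accumulated exposure per named latent risk (hypothetical model)."""
    thresholds: dict[str, float]                     # management-set limits per named risk
    accumulation: dict[str, float] = field(default_factory=dict)

    def fits(self, account_exposures: dict[str, float]) -> bool:
        """True if adding the account stays within every accumulation threshold."""
        return all(
            self.accumulation.get(risk, 0.0) + exposure
            <= self.thresholds.get(risk, float("inf"))
            for risk, exposure in account_exposures.items()
        )

    def underwrite(self, account_exposures: dict[str, float]) -> str:
        """Write the account if it fits; otherwise flag it for risk management."""
        if self.fits(account_exposures):
            for risk, exposure in account_exposures.items():
                self.accumulation[risk] = self.accumulation.get(risk, 0.0) + exposure
            return "write"
        return "seek exclusion or other accumulation management"
```

For example, with a $100M threshold on a hypothetical named risk, a first $60M account is written, a second $60M account trips the threshold and goes to exclusion, and a $40M account still fits. The underwriter never has to judge the latent risk herself; she only has to apply the threshold.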
This is a different way for underwriting management to operate as well. Management will no longer identify emerging risks one by one and develop underwriting guidelines, including exclusions, for each of them. Instead, management will set broad accumulation thresholds and make portfolio accumulation information available to the underwriters. With these data in the underwriting workflow, the good news is that the underwriters will then simply do their job — they already see it that way — and protect the company against the next asbestos.
There’s a lot of interest these days in the underwriter of the future. It is not just about processing more data more quickly, but also about using different kinds of data and different kinds of underwriting guidelines. Praedicat is working with insurers to build this vision of the future of underwriting today.