
Cause, effect, consequence: Assessing natural hazards
The City of San Francisco recently announced that it was partnering with the Lawrence Berkeley National Laboratory on a project to model Pacific storms. According to the San Francisco Chronicle, this “first in the nation” initiative “… is expected to help city officials decide where to direct millions of dollars in future infrastructure investments, from fortifying San Francisco International Airport to upgrading highway drainage to re-engineering seawalls.”
The Chronicle also reports that “The simulations will narrow the likely impacts of global warming to within as little as 2 miles, meaning city officials will know precisely how weather in San Francisco will differ from other parts of the region.”
Not surprisingly, this level of precision and granularity represents a vast improvement over previous generations of computer models developed to simulate natural systems and quantify the potential repercussions of natural disasters. It’s also a portent of the future.
The hard lessons of Hurricane Andrew
After tearing across southern Florida in August 1992, Hurricane Andrew left behind USD 27.3 billion in total damages and USD 15.5 billion in insured losses (1994 USD), and drove nine insurers out of business. At the time, Andrew was by far the most destructive Atlantic hurricane on record. (It currently ranks as the eighth costliest Atlantic hurricane.)
In addition to the devastation it wrought, Hurricane Andrew, along with several other major disasters around the same time, starkly exposed the limitations of applying an actuarial approach to assessing catastrophe risk.
Fortuitously, this was also a period when computer processing power was advancing rapidly, as was our understanding of the Earth's atmosphere, oceans and geology; the latter driven by several ground-breaking scientific studies initiated in the 1980s and enabled, in part, by new data sources from satellites as well as on land and across the oceans.
Following the hard lessons learned in the aftermath of Andrew, these new capabilities and expanded data sets provided the impetus for the first computer models designed to quantify the possible effects of natural disasters.
A vital tool for re/insurers
Fast forward to today. Several commercial providers now offer models covering a range of perils, including floods, windstorms, excessive heat, wildfires and tsunamis. The geographic reach of these models has also expanded considerably. While there are still some gaps, and some perils have been modeled more extensively than others, most areas of the world where natural disasters represent a substantial threat are covered by one or more commercial models.
These models are primarily used by re/insurers as input for underwriting/pricing decisions and to help manage accumulations in hazard-prone areas. They’re also used by some national governments and global aid organizations for disaster planning and preparedness.
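To make the accumulation-management use case concrete, the following is a minimal sketch, assuming a handful of hypothetical policy records and an invented per-zone limit; the zone names, sums insured and threshold are placeholders rather than real portfolio data, and production accumulation control draws on far richer exposure information and model output.

```python
# Minimal sketch: aggregate sums insured by hazard zone and flag zones that
# exceed a hypothetical per-zone accumulation limit. All figures, zone names
# and the limit are illustrative placeholders, not real portfolio data.
from collections import defaultdict

policies = [
    {"id": "P1", "zone": "FL-Miami-Dade", "sum_insured": 12_000_000},
    {"id": "P2", "zone": "FL-Miami-Dade", "sum_insured": 8_500_000},
    {"id": "P3", "zone": "TX-Harris", "sum_insured": 6_000_000},
]

ZONE_LIMIT = 15_000_000  # illustrative risk-appetite limit per zone

accumulation = defaultdict(float)
for policy in policies:
    accumulation[policy["zone"]] += policy["sum_insured"]

for zone, total in sorted(accumulation.items()):
    status = "over limit" if total > ZONE_LIMIT else "within appetite"
    print(f"{zone}: USD {total:,.0f} ({status})")
```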
Up to now, however, these models and capabilities have not been widely used at the individual client level to help organizations assess their specific exposures to various natural hazards.
Client-specific applications
As San Francisco’s new project indicates, that’s changing. Today, data scientists, natural scientists, actuaries, risk engineers and modeling experts can take advantage of:
- a profusion of data from a vastly greater number and variety of sources, including feeds from new, high-resolution satellites that record conditions with even greater precision and granularity;
- an evolving, albeit still imperfect, understanding of the causes, dynamics and effects of various natural hazards; and
- increasingly powerful computer networks for processing data and simulating different natural systems, plus ever more sophisticated mapping/visualization techniques.
Also, as a global insurer, we indemnify a significant volume of claims linked to natural disasters. Our loss experience data, covering more than 30 countries, is an extremely useful resource on the effects of different kinds of events in diverse parts of the world; it gives us first-hand insight into where, when, why and how things break. For instance, by analyzing clients’ flood losses in different parts of the world, we are better able to estimate the damages clients could expect the next time water levels reach a certain height. And with enhanced modeling capabilities, we can also predict with greater confidence the probable magnitude and extent of flooding in particular watersheds over time.
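As a rough illustration of the flood example above, the sketch below fits a simple depth-damage relationship to hypothetical observations pairing flood depths with damage ratios, then uses it to estimate the loss a client might expect at a given water level; the figures, the quadratic fit and the expected_loss helper are assumptions for illustration only, not our actual claims data or modeling methodology.

```python
# Minimal sketch: fit a simple depth-damage curve from hypothetical historical
# observations (flood depth vs. damage as a share of insured value) and use it
# to estimate the expected loss at a given water level. All numbers are
# illustrative placeholders, not actual loss experience data.
import numpy as np

depth_m = np.array([0.2, 0.5, 0.8, 1.2, 1.8, 2.5])            # water depth above ground (m)
damage_ratio = np.array([0.03, 0.08, 0.15, 0.27, 0.42, 0.60])  # share of sum insured lost

# quadratic fit as a rough stand-in for a full vulnerability curve
curve = np.poly1d(np.polyfit(depth_m, damage_ratio, deg=2))

def expected_loss(depth: float, sum_insured: float) -> float:
    """Estimate the loss if floodwater reaches `depth` metres at this site."""
    ratio = float(np.clip(curve(depth), 0.0, 1.0))  # keep the ratio between 0 and 1
    return ratio * sum_insured

# e.g. estimated loss for a 1.5 m flood on a USD 10 million policy
print(f"{expected_loss(1.5, 10_000_000):,.0f}")
```

In practice, such curves would be estimated separately by occupancy type, building characteristics and region, and combined with hazard-model output describing how likely each water level is, but the basic cause-to-consequence chain is the same.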