Hurricanes strike the US with regularity, but nothing on record resembles Hurricane Harvey’s pummeling of Houston. Understanding the risk of that kind of wind and rainfall recurring is critical if we intend to rebuild infrastructure that will last through its expected lifespan. But freakish storms like Harvey make risk calculations challenging: they have no historic precedent, so we have no idea how often they occur, and the underlying probability of these events is shifting as our planet grows warmer.
An MIT professor named Kerry Emanuel, however, has helped develop a system that analyzes hurricane frequency in a warming world. Using it, he has found that Harvey-sized rainfall could go from being extremely rare to having an 18-percent chance of happening in any given year by the end of this century.
Rainfall varies a lot locally, and sites within a few miles of each other can often see very different totals. To get a clearer picture of a storm’s damage, the research community has settled on a figure called the “area-integrated rainfall.” By that measure, Harvey is the largest storm on record, having dumped 850 millimeters on the Houston area. That’s extreme, but there are other storms of similar magnitude: Texas saw more than 500mm of rain from the remnants of Hurricane Patricia just two years earlier.
Storms of this size are a rarity in the historic record, but that record gets spotty once you go back more than a century. Climate models could provide a greater sense of their likely frequency, but they have poor spatial resolution—you can’t accurately model a hurricane without an excessive amount of computing power. Emanuel worked with other researchers to address this limitation, creating a situation in which a high-resolution model of a hurricane floats like a bubble within the larger context of a global climate model. Emanuel has now used this method to analyze Harvey.
To start with, he ran a set of seven climate models using the conditions that prevailed between 1980 and 2016. Tropical disturbances were seeded in these runs, and the system selected those that developed into hurricanes and struck Texas within 300 kilometers of Houston. Based on this analysis, storms with more than 450 millimeters of rainfall are extremely rare; those with more than 800mm are almost unprecedented. “Harvey’s rainfall in Houston was ‘biblical,’” Emanuel concludes, “in the sense that it likely occurred around once since the Old Testament was written.”
If we expand the analysis out to the Texas coast as a whole, the probabilities naturally go up a bit. The models suggest that a storm capable of dumping that much rain would strike Texas about once a century under the climate conditions that prevailed from 1980 to 2000. Harvey-scale storms remain so rare as to make it impossible to perform a statistical analysis on them.
Rare, but getting less so
That’s seemingly good news. A century is longer than we expect most infrastructure to last, and plenty of areas in Texas don’t have much in the way of infrastructure to start with. But the typical year we’re now experiencing is already warmer than most of the ones from 1980 to 2000, which argues against complacency.
To get a better sense of how climate change is skewing those odds, Emanuel turned to the IPCC’s business-as-usual emissions scenario (termed RCP 8.5) and ran the climate models using it as input, focusing on the last two decades of the present century (in other words, looking 60 to 80 years ahead). The news is not good: “Rainfall in excess of 500 mm, which is around a once-in-2,000-year event in the late 20th century, becomes a once in a 100-year event by the end of this century.”
And that’s just for Houston. For Texas as a whole, this sort of rain goes from being once a century to happening every 5.5 years.
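The return periods quoted here convert directly into annual odds: a once-in-N-year event has roughly a 1-in-N chance of occurring in any given year. A minimal sketch of that conversion:

```python
# Convert a return period (once-in-N-years) to an annual exceedance
# probability: a once-in-N-year event has about a 1/N chance per year.
def annual_probability(return_period_years: float) -> float:
    return 1.0 / return_period_years

# Houston: once-in-2,000-year event vs. once-in-100-year by 2100
print(f"{annual_probability(2000):.4%}")  # 0.0500% per year
print(f"{annual_probability(100):.2%}")   # 1.00% per year
# Texas as a whole: once every 5.5 years
print(f"{annual_probability(5.5):.0%}")   # 18% per year
```

This is the same arithmetic behind the 18-percent figure quoted earlier: once every 5.5 years is 1/5.5, or about an 18 percent chance in any given year.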
Emanuel didn’t directly model present conditions. But if we assume a straight line between the end of the last century and the end of the 21st, the odds of one of these storms are now about once every 16 years. Remember, it had been once a century just 17 years ago.
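That straight-line assumption can be sketched as a linear interpolation of the annual probability between the two modeled periods. The anchor years below (1990 and 2090, roughly the midpoints of the late-20th-century and end-of-century windows) are an assumption for illustration, not values taken from the paper:

```python
# Linearly interpolate the annual probability of a Harvey-scale Texas
# rainfall event between the late-20th-century odds (~1% per year) and
# the projected end-of-century odds (~18% per year).
# Anchor years are assumed, not taken from the paper.
def interpolate_probability(year, y0=1990, p0=0.01, y1=2090, p1=0.18):
    frac = (year - y0) / (y1 - y0)
    return p0 + frac * (p1 - p0)

p_now = interpolate_probability(2017)
print(f"annual probability ~{p_now:.1%}, once every {1 / p_now:.0f} years")
```

With these assumed anchors, the 2017 estimate comes out to roughly 5 to 6 percent per year, or once every 16 to 18 years, in line with the figure above; the exact number depends on which years are chosen as endpoints.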
Normally, this sort of analysis takes time, both to perform and to make its way through peer review. To get his results out quickly after Harvey, Emanuel used his status as a member of the National Academy of Sciences, which let him pick his own peer reviewers, ones likely to be friendly and to finish the review quickly. So his findings probably haven’t faced as rigorous a review as they might have. Still, the system used for his analysis has been through peer review a number of times, and Emanuel is putting his reputation as a scientist on the line here.
And the results make sense. With a warming atmosphere, evaporation increases, and the air is capable of holding more moisture. That is expected to intensify hurricanes as this century goes on. As for Houston, the results suggest that rebuilding efforts should not assume that something like Harvey will never happen again. And the rest of Texas should view Harvey as a warning.
More Info: arstechnica.com