U.S. Hurricane Damages 1900 to 2005 - Why Do the Losses Keep Going Up?
Roger Pielke, Jr. University of Colorado
Joel Gratz ICAT Managers
Chris Landsea (presenting), TPC/NHC, Miami, FL (Chris.Landsea@noaa.gov)
After more than two decades of relatively little Atlantic hurricane activity, the past decade has seen heightened hurricane activity and more than $150 billion in damage in 2004 and 2005 alone. This paper normalizes U.S. hurricane damage from 1900 to 2005 to 2005 values using two methodologies.
A normalization provides an estimate of the damage that would occur if storms from the past made landfall under another year’s societal conditions. Our methods use changes in inflation and wealth at the national level and changes in population and housing units at the coastal county level. Across both normalization methods, there is no remaining trend of increasing absolute damage in the dataset, although 2004 and 2005 are large loss years. The 1970s and 1980s were notable for their extremely low damage compared with other decades. The decade 1996-2005 has the second-most damage of the past 11 decades, with only 1926-1935 surpassing its costs. Over the 106 years of record, the average annual normalized damage in the continental United States is about $10-11 billion. The most damaging single storm is the 1926 Great Miami hurricane, with $140-157 billion of normalized damage; the most damaging years are 1926 and 2005. About 85 percent of the total damage is accounted for by intense hurricanes (Saffir-Simpson categories 3, 4, and 5), yet these comprise only 24 percent of U.S. landfalling tropical cyclones.
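The national-level adjustment described above amounts to multiplying a historical loss by ratios of inflation, wealth, and coastal population between the storm year and the target year. A minimal sketch of that arithmetic, with illustrative function and parameter names (the factor values below are hypothetical, not the authors' data):

```python
def normalize_damage(damage, inflation_ratio, wealth_ratio, population_ratio):
    """Scale a historical storm loss to target-year conditions.

    Assumed (illustrative) inputs:
      inflation_ratio  -- price level, target year / storm year
      wealth_ratio     -- real wealth per capita, target year / storm year
      population_ratio -- affected coastal-county population,
                          target year / storm year
    """
    # Each ratio rescales one societal condition; the product gives the
    # loss the same storm would plausibly cause in the target year.
    return damage * inflation_ratio * wealth_ratio * population_ratio


# Hypothetical example: a $100 million loss in 1950, with made-up
# adjustment ratios, normalized to 2005 conditions.
loss_2005 = normalize_damage(100e6, 7.5, 2.2, 3.0)
print(f"${loss_2005 / 1e9:.2f} billion")  # prints "$4.95 billion"
```

The second methodology in the paper substitutes housing-unit counts for population in the county-level factor; the multiplicative structure is the same.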
Unless action is taken to address the growing concentration of people and property in coastal areas where hurricanes strike (for example, by strengthening buildings to better withstand storms), damage will increase substantially as more, and wealthier, people inhabit these coastal locations.