Wildfires have increased in size and ferocity in the Western United States in recent decades. To date, multiple studies have suggested climate change is one likely culprit lurking behind much of this fiery growth. The climate is getting warmer, these studies have reported, and this is essentially drying our forests out and making them more likely to go up in smoke. Yet it might surprise you to learn that few studies have sought to put a number on how much climate change has contributed to this increase in wildfire activity.
Bucking the trend is a recent paper that attributes around 55 percent of the increase in wildfire danger over the past four decades to human-caused climate change. The study further suggests that, over roughly the same period, anthropogenic climate change nearly doubled the extent of burned area in western US forests.
The study, published in the Proceedings of the National Academy of Sciences, is the work of CIRC researcher John Abatzoglou at the University of Idaho and his colleague A. Park Williams at the Lamont-Doherty Earth Observatory. (John is no stranger to wildfire studies; see also “Large Wildfires Expected Under Climate Change” in the October 2015 CIRCulator.)
Abatzoglou and Williams’ analysis examined the degree to which climate change created the right conditions for trees and other vegetation in Western forests to be—in effect—dry enough to burn.
The exact term of art employed by researchers is “fuel aridity.” (Wildfire research tends to refer to trees and other vegetation as “fuel loads,” which, arguably, is pretty good shorthand for the amount of trees, shrubs, and other botanic whatnot that could turn into flames given the right conditions, which, yeah, would make a truly unruly acronym.)
In total, the researchers considered eight fuel aridity metrics and compared them to the area burned from roughly the middle of the twentieth century to the present. As you might imagine, fuel aridity (in its many metrics) and area burned are pretty closely correlated. Abatzoglou and Williams determined their degree of chumminess, finding a correlation of about 76 percent averaged across the eight metrics. In other words, fuel aridity appears to be the predominant driver of year-to-year variability in the forest area consumed by wildfires.
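For readers curious what that correlation step looks like in practice, here is a minimal sketch (not the authors’ actual code, and using made-up numbers): it correlates a standardized fuel-aridity index against the logarithm of annual area burned, since burned area tends to grow roughly exponentially as fuels dry out.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical yearly values: a standardized fuel-aridity index and
# forest area burned (thousands of hectares). These are illustrative only.
aridity = [-1.2, -0.5, 0.1, 0.4, 0.9, 1.3, 1.8]
area_burned = [120, 180, 260, 310, 520, 700, 1100]

# Correlate aridity with the *log* of area burned, mirroring the study's
# use of a log relationship between dryness and burned area.
log_area = [math.log(a) for a in area_burned]
r = pearson_r(aridity, log_area)
print(round(r, 2))
```

With data like this, the correlation comes out very high, which is the qualitative point: drier fuels track closely with more area burned.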
With that number in hand, the researchers then examined how climate—especially rising temperatures—created the conditions for fuel aridity.
The climate part of the analysis was done essentially in two parts: modeling that included climate change and modeling that did not. This was done because the researchers wanted to compare the two in order to figure out how much climate change led to an increase in aridity and, hence, how much of wildfire growth can be attributed to climate change. To do this, the researchers examined anthropogenic changes in temperature and humidity from climate model simulations relative to the pre-industrial climate and then effectively scrubbed these changes from the observational record to make their comparison.
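That “scrubbing” step can be sketched very simply. Assuming hypothetical numbers (none of these values come from the paper), the idea is to subtract the model-estimated human-caused warming from the observed temperature record, year by year, to build a counterfactual “no climate change” series:

```python
def remove_anthropogenic_signal(observed, anthropogenic_delta):
    """Subtract the model-estimated human-caused warming, year by year,
    to produce a counterfactual series with that signal removed."""
    return [round(obs - delta, 1) for obs, delta in zip(observed, anthropogenic_delta)]

# Hypothetical summer temperatures (deg C) and the warming attributed
# to human influence in each year by a model ensemble (illustrative only).
observed_temps = [20.1, 20.4, 20.9, 21.3, 21.8]
anthro_warming = [0.6, 0.7, 0.9, 1.0, 1.2]

counterfactual_temps = remove_anthropogenic_signal(observed_temps, anthro_warming)
print(counterfactual_temps)
```

Fuel aridity computed from the counterfactual series can then be compared with aridity computed from observations; the gap between the two is the portion attributable to anthropogenic climate change.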
Mixing and weighing the various ingredients, the researchers’ final analysis suggests that 55 percent of the increase in fuel aridity over the years 1979 to 2015 is due to anthropogenic climate change. Combining this number with the close relationship between fuel aridity and area burned, plus the area actually burned over the past three decades, the researchers quantified how much of that burned area was due to climate change’s tendency to create tinder (dry, flammable fuel loads, not the well-known dating app). The answer: over the years 1984 to 2015, the area burned was about double what it would have been without climate change. As with the increase in fuel aridity, the growth in area burned really takes off after the turn of the millennium, a trend that has been observed in multiple other studies.
Some Caveats: The legacy of fire suppression, the US Forest Service’s long-held policy of putting out wildfires, is now widely understood to have increased fuel loads (the amount of trees, shrubs, and other botanic whatnot that could turn into flames given the right conditions) in lower-elevation forests across the Western US. Abatzoglou and Williams’ study considered the modern fire record, during which most fires were suppressed. However, it did not separate the influence of climate change from that associated with the legacy of fire suppression. The authors hypothesize that fire suppression has, ironically, made the landscape more responsive to climate variability and change.
Abatzoglou and Williams end their paper by cautioning that theirs is a first effort and that a large amount of uncertainty exists in their statistical estimate. They suggest that in the future dynamic vegetation models could be used as a way to parse out more complex relationships and processes between climate change and wildfires.
Citation: Abatzoglou, John T., and A. Park Williams. “Impact of anthropogenic climate change on wildfire across western US forests.” Proceedings of the National Academy of Sciences 113, no. 42 (2016). doi:10.1073/pnas.1607171113.
Photo Caption: “Battling the Blaze.” U.S. Marines and fire crew on Marine Corps Base Camp Pendleton, Calif., are responding to wildfires ablaze in southern California May 14, 2014. The Tomahawk fire, in the northeast section of Camp Pendleton has burned more than 6,000 acres forcing evacuations of housing areas on base and various schools both on and off base. Aircraft from 3rd Marine Aircraft Wing and the Camp Pendleton Fire Department worked in coordination with CALFIRE to prevent fires from spreading off base. Marine officials are coordinating with CALFIRE for the further use of military aircraft pending the wildfire status within San Diego County. (Photo Credit: U.S. Marine Corps photo by Lance Cpl. Joshua Murray/Released)
Nathan Gilles is the managing editor of The Climate Circulator, and oversees CIRC’s social media accounts and website. When he’s not writing for CIRC, Nathan works as a freelance science writer.