‘Policies of fire suppression that do not account for this unusual environmental situation are unsustainable.’
By Summit Voice
SUMMIT COUNTY — After carefully comparing historic fire frequency and climate records (based in part on an extensive charcoal database), a team of university researchers concluded that human activity reduced the frequency of wildfires from the 1,000-year maximum to the 1,000-year minimum in less than 100 years, leading to a fire deficit.
In short, fire regimes are out of whack with climate cycles. A combination of drought, a build-up of combustible fuels and increased tree mortality is the recipe for a “perfect storm” of wildfire conditions, the researchers warned in a Feb. 14 paper published in Proceedings of the National Academy of Sciences.
The long-term perspectives gained through these studies demonstrate how strongly climate and people shape the present-day landscapes and forests of the American West, and how those landscapes may change in the future, said University of Oregon geography professor Patrick Bartlein, one of the paper’s authors.
“Policymakers and others need to re-evaluate how we think of the past century to allow us to adjust and prepare for the future,” Bartlein said. “Recent catastrophic wildfires in the West are indicators of a fire deficit between actual levels of burning and that which we should expect given current and coming climate conditions. Policies of fire suppression that do not account for this unusual environmental situation are unsustainable.”
“The last two centuries have seen dramatic changes in wildfire across the American West, with a peak in wildfires in the 1800s giving way to much less burning over the past 100 years,” said lead author Jennifer R. Marlon, now a National Science Foundation Earth Science Postdoctoral Fellow at the University of Wisconsin, Madison. “The decline was mostly caused by the influx of explorers and settlers and by their subsequent suppression of wildfires, both intentionally and accidentally.”
Marlon and colleagues used existing records on charcoal deposits in lakebed sediments to establish a baseline of fire activity for the past 3,000 years. They compared that with independent fire-history data drawn from historical records and fire scars on the landscape.
Their key findings:
- Comparing charcoal records and climate data showed, as expected, that warm, dry intervals, such as the “Medieval Climate Anomaly” between 1,000 and 700 years ago, had more burning, while cool, moist intervals, such as the “Little Ice Age” between 500 and 300 years ago, had fewer fires. Short-term peaks in fires were associated with abrupt climate changes, whether warming or cooling.
- Wildfires during most of the 20th century were almost as infrequent as they were during the Little Ice Age, about 400 years ago. However, only a century ago, fires were as frequent as they were about 800 years ago, during the warm and dry Medieval Climate Anomaly. “In other words, humans caused fires to shift from their 1,000-year maximum to their 1,000-year minimum in less than 100 years,” said co-author Daniel Gavin.
- Climate and humans acted synergistically by the end of the 18th century and into the early 19th century to increase fire events, which were often sparked by agricultural practices, the clearing of forests, logging and railroad activity.
“We can use the relationship between climate and fire to answer the question: What would the natural level of fire be like today if we didn’t work so hard to suppress or eliminate fires?” Marlon said. “The answer is that, because of climate change and the buildup of fuels across the western U.S., levels of burning would be higher than at any time over the past 3,000 years, including the peak in burning during the Medieval Climate Anomaly.”