About a month ago we had a fire in the neighborhood. It was a lightning strike, setting off a small blaze in the woods about two-thirds of the way up the valley wall. We live in a cabin on a ranch in southwest Colorado, and I don't think our landlords, the ranch owners, stopped watching the smoke for a day and a half, waiting for it to either go away or erupt into its doomsday potential.
Given the 50-year drought going on out here, and the fact that you couldn't design a better setup for a fire than the woods around us right now--literal piles of grey, baked deadwood, dead leaves, and brown grass--it's just dumb luck the whole valley didn't go up. Plenty of others have this year, including the one above my old high school.
We should be out of fire season by now, the first week of December. But after another month of sun and warm weather, the situation is probably even worse. This is what used to be winter, when all that firewood gets smothered by snow and cold. That hasn't happened yet, and everyone is looking a bit nervous. A dry, warm winter is what happened last year, and it led to the worst fire season on record in Colorado. This winter, in NOAA's long-term climate forecasts, is supposed to be warmer, with an even chance of "normal" precipitation. But, of course, the catch is that there are no normals anymore. Instead, we have models and projections based on a changing climate.
NASA announced one of those projections today: forest fires are going to get worse. Count on it: "longer and stronger" seasons, more extreme events. That's not really a surprise but, hey, good to know. It's now estimated that fire seasons like 2012's will occur two to four times a decade by 2050. This is based on dryness projections in the Fifth Assessment Report of the United Nations Intergovernmental Panel on Climate Change. The researchers, led by NASA's Doug Morton, did the math on both high and low projections of future CO2 emissions and found that it's going to be bad either way.
Their projections come with the help of satellite data, which allowed a full tabulation of area burned year by year. This method also allows measurements of biomass burned and, following from that, carbon dioxide released by fires. They calculated an increase from 8.8 million tons of CO2 released per year from 1984 to 1995 to an average of 22 million tons per year from 1996 to 2002. That's a trend expected to continue, feeding one of the many feedback loops involved in climate change: more fires, more CO2, intensifying climate change, drier conditions, more fires, more CO2. And so on.
Complementing NASA's work, which was announced this week at the annual meeting of the American Geophysical Union, is a new paper in the journal PNAS examining charcoal deposits in certain lakebeds to reconstruct wildfire patterns in the dry, warm mid-Holocene period, about 6,000 years ago. Basically, it's historical and statistical proof that drier conditions lead to increased wildfires, or at least that the two are correlated. Again, pretty obvious, but maybe it'll be useful in some future lawsuit against industrial civilization.
Top image: stunned elk avoid a fire sweeping through Montana's Bitterroot Valley by standing in a river. (NASA APOD)