Statistical analysis of temperature data affirms climate models
FRISCO — A new statistical analysis of temperature records dating back to 1500 suggests it’s more than 99 percent certain that the past century of global warming is caused by the emission of heat-trapping, industrial-age greenhouse gases. The study was published online April 6 in the journal Climate Dynamics.
In a press release, the McGill University researchers said the study doesn’t use complex computer models to estimate the effects of greenhouse-gas emissions. Instead, it examines historical data to assess the competing hypothesis that warming over the past century is due to natural long-term variations in temperature. The results all but rule out the possibility that global warming in the industrial era is just a natural fluctuation in the earth’s climate.
“This study will be a blow to any remaining climate-change deniers,” said McGill University physics professor Shaun Lovejoy. “Their two most convincing arguments – that the warming is natural in origin, and that the computer models are wrong – are either directly contradicted by this analysis, or simply do not apply to it.”
Lovejoy’s study applies statistical methodology to determine the probability that global warming since 1880 is due to natural variability. His conclusion: The natural-warming hypothesis may be ruled out “with confidence levels greater than 99 percent, and most likely greater than 99.9 percent.”
To assess the natural variability before much human interference, the new study uses “multi-proxy climate reconstructions” developed by scientists in recent years to estimate historical temperatures, as well as fluctuation-analysis techniques from nonlinear geophysics. The climate reconstructions take into account a variety of gauges found in nature, such as tree rings, ice cores, and lake sediments. And the fluctuation-analysis techniques make it possible to understand the temperature variations over wide ranges of time scales.
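As an illustration of the kind of fluctuation analysis the article mentions, the sketch below computes root-mean-square Haar fluctuations of a temperature-like series across several time scales. This is a generic, minimal version in pure Python, not code from the study; the function name, the chosen scales, and the white-noise toy input are all placeholders for illustration.

```python
import random
import statistics

def haar_fluctuations(series, scales):
    """RMS Haar fluctuation at each scale: for every window of length
    `scale`, take (mean of second half) - (mean of first half), then
    root-mean-square those differences over all windows."""
    out = {}
    for scale in scales:
        half = scale // 2
        diffs = [
            statistics.mean(series[i + half:i + scale])
            - statistics.mean(series[i:i + half])
            for i in range(len(series) - scale + 1)
        ]
        out[scale] = statistics.mean(d * d for d in diffs) ** 0.5
    return out

# Toy input: white noise, whose Haar fluctuations shrink as scale grows.
random.seed(0)
noise = [random.gauss(0.0, 1.0) for _ in range(2000)]
fluct = haar_fluctuations(noise, scales=[4, 16, 64, 256])
```

How the fluctuation amplitude changes with time scale is the kind of scale-dependence such techniques are used to measure; real analyses apply them to proxy-reconstructed temperature records rather than synthetic noise.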
For the industrial era, Lovejoy’s analysis uses carbon dioxide from the burning of fossil fuels as a proxy for all man-made climate influences – a simplification justified by the tight relationship between global economic activity and the emission of greenhouse gases and particulate pollution, he says.
“This allows the new approach to implicitly include the cooling effects of particulate pollution that are still poorly quantified in computer models,” he added.
While his new study makes no use of the huge computer models commonly used by scientists to estimate the magnitude of future climate change, Lovejoy’s findings effectively complement those of the Intergovernmental Panel on Climate Change (IPCC), he says. His study predicts, with 95 percent confidence, that a doubling of carbon dioxide levels in the atmosphere would cause the climate to warm by between 2.5 and 4.2 degrees Celsius. That range is more precise than — but in line with — the IPCC’s prediction that temperatures would rise by 1.5 to 4.5 degrees Celsius if CO2 concentrations double.
“We’ve had a fluctuation in average temperature that’s just huge since 1880 – on the order of about 0.9 degrees Celsius,” Lovejoy said. “This study shows that the odds of that being caused by natural fluctuations are less than one in a hundred and are likely to be less than one in a thousand. While the statistical rejection of a hypothesis can’t generally be used to conclude the truth of any specific alternative, in many cases – including this one – the rejection of one greatly enhances the credibility of the other,” he concluded.
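The logic of the significance claim above can be sketched as a Monte Carlo test: simulate a null model of purely natural year-to-year variability many times and count how often it produces a change as large as the observed warming. The toy below is purely illustrative and is not the paper's method: it uses a simple AR(1) noise model, and the parameter values (`sigma`, `rho`, `years`) are invented placeholders, not numbers from the study.

```python
import random

def natural_fluctuation_pvalue(observed_change, n_trials=10_000,
                               years=135, sigma=0.08, rho=0.7, seed=1):
    """Monte Carlo p-value: fraction of simulated 'natural variability'
    runs whose net temperature change is at least as large as the
    observed warming.  The AR(1) noise model and every parameter value
    here are illustrative placeholders, not numbers from the study."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(n_trials):
        x = 0.0
        for _ in range(years):
            x = rho * x + rng.gauss(0.0, sigma)  # year-to-year persistence
        if abs(x) >= observed_change:
            exceed += 1
    return exceed / n_trials

# How often does this toy natural-variability model produce a change as
# large as the ~0.9 degree C warming observed since 1880?
p = natural_fluctuation_pvalue(0.9)
```

A small p-value here only rejects the particular null model being simulated, which is why the choice of natural-variability model (in the actual study, one constrained by multi-proxy reconstructions and scaling analysis) carries the weight of the argument.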