Suitable coral reef habitat could shrink by a factor of 10
By Summit Voice
SUMMIT COUNTY — In some of the world’s oceans, acidity is increasing at more than 100 times the natural rate, according to University of Hawaii scientists, who warned once again that these changes in ocean chemistry may significantly reduce the calcification rate of corals and mollusks.
Combining computer modeling with field observations, an international team of scientists from the International Pacific Research Center at the University of Hawaii at Manoa concluded that anthropogenic CO2 emissions over the last 100 to 200 years have already raised ocean acidity far beyond the range of natural variations. The study is published in the January 22 online issue of Nature Climate Change.
“When Earth started to warm 17,000 years ago, terminating the last glacial period, atmospheric CO2 levels rose from 190 parts per million to 280 ppm over 6,000 years. Marine ecosystems had ample time to adjust,” said lead author Tobias Friedrich. “Now, for a similar rise in CO2 concentration to the present level of 392 ppm, the adjustment time is reduced to only 100 to 200 years.”
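Friedrich’s timescale comparison can be checked with quick arithmetic. A minimal sketch, assuming a 150-year industrial window (a midpoint within the 100-to-200-year range quoted above); the ppm figures come directly from the quote:

```python
# Glacial-to-preindustrial rise: 190 ppm -> 280 ppm over ~6,000 years
glacial_rate = (280 - 190) / 6000   # ppm per year

# Industrial-era rise: 280 ppm -> 392 ppm over ~150 years (assumed midpoint)
modern_rate = (392 - 280) / 150     # ppm per year

print(round(glacial_rate, 3))              # 0.015
print(round(modern_rate, 2))               # 0.75
print(round(modern_rate / glacial_rate))   # 50
```

Note that this ratio describes the CO2 concentration alone; the hundred-fold figure cited elsewhere in the article refers to the rate of change in ocean acidity itself.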
“Our results suggest that severe reductions are likely to occur in coral reef diversity, structural complexity and resilience by the middle of this century,” said co-author Professor Axel Timmermann.
The Earth system models simulated climate and ocean conditions from 21,000 years ago, at the Last Glacial Maximum, forward to the end of the 21st century. The models tracked the saturation level of aragonite (a form of calcium carbonate), a quantity typically used to measure ocean acidification.
As the acidity of seawater rises, the saturation level of aragonite drops. The models successfully captured the current observed seasonal and annual variations in this quantity in several key coral reef regions.
Today’s levels of aragonite saturation in these locations have already dropped by five times the pre-industrial range of natural variability. For example, if the yearly cycle in aragonite saturation historically varied between 4.7 and 4.8, it now varies between 4.2 and 4.3, which, based on another recent study, may translate into a roughly 15 percent decrease in the overall calcification rates of corals and other aragonite-shell-forming organisms.
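The “five times” comparison follows from the example figures above. A minimal sketch of the arithmetic (the 15 percent calcification estimate comes from the separate study cited and is not derived here):

```python
# Pre-industrial yearly cycle: 4.7 to 4.8; today's cycle: 4.2 to 4.3
natural_range = 4.8 - 4.7                  # width of pre-industrial variability
drop = (4.7 + 4.8) / 2 - (4.2 + 4.3) / 2   # shift in the cycle's midpoint: 0.5

print(round(drop / natural_range))         # 5: the drop is five times the natural range
```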
Given the continued human use of fossil fuels, the saturation levels will drop further, potentially reducing the calcification rates of some marine organisms by more than 40 percent relative to their pre-industrial values within the next 90 years.
“Any significant drop below the minimum level of aragonite to which the organisms have been exposed for thousands of years, and to which they have successfully adapted, will very likely stress them and their associated ecosystems,” said Friedrich.
“In some regions, the man-made rate of change in ocean acidity since the Industrial Revolution is a hundred times greater than the natural rate of change between the Last Glacial Maximum and pre-industrial times,” Friedrich said.
On a global scale, coral reefs are currently found in places where open-ocean aragonite saturation reaches levels of 3.5 or higher. Such conditions exist today in about 50 percent of the ocean, mostly in the tropics. By the end of the 21st century this fraction is projected to be less than 5 percent. The Hawaiian Islands, which sit just on the northern edge of the tropics, will be among the first areas to feel the impact.
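The headline’s factor of ten is simply the ratio of these two fractions (a quick arithmetic check using the percentages in this paragraph):

```python
suitable_today = 50   # percent of ocean with aragonite saturation >= 3.5 today
suitable_2100 = 5     # projected percent by century's end ("less than 5 percent")

print(suitable_today / suitable_2100)   # 10.0: suitable habitat shrinks roughly tenfold
```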
The study suggests that some regions, such as the eastern tropical Pacific, will be less stressed than others because greater underlying natural variability of seawater acidity helps to buffer anthropogenic changes. The aragonite saturation in the Caribbean and the western Equatorial Pacific, both biodiversity hotspots, shows very little natural variability, making these regions particularly vulnerable to human-induced ocean acidification.