A new study has found that rising CO2 levels in Earth's atmosphere could produce a climate resembling that of the Triassic period, and this could happen within the next few centuries.
The Triassic is a geologic period that ended roughly 200 million years ago, having lasted for about 51 million years.
The research was carried out by scientists at the University of Southampton, led by Gavin Foster, Professor of Isotope Geochemistry.
More than 1,200 estimates of ancient atmospheric CO2 concentrations were compiled to create a continuous record. The researchers found that atmospheric CO2 could return to levels last seen in the Triassic period by 2250 if humans continue burning fossil fuels.
"We cannot directly measure CO2 concentrations from millions of years ago. Instead we rely on indirect 'proxies' in the rock record. In this study, we compiled all the available published data from several different types of proxy to produce a continuous record of ancient CO2 levels," Foster explained in a statement.
CO2 levels have fluctuated significantly over multi-million-year timescales, ranging from around 200-400 parts per million (ppm) during cold 'icehouse' periods to as much as 3,000 ppm during intervening warm 'greenhouse' periods, the media release stated.
The record already shows that the climate has changed dramatically in the past, yet the current pace of climate change is described as "unusual", even though Earth is presently in one of its colder periods.
Fossil fuel burning over the last 150 years has raised the atmospheric concentration of CO2 from about 280 ppm in the pre-industrial era to nearly 405 ppm in 2016. Although CO2 is a potent greenhouse gas, it does not determine Earth's climate on its own: the amount of sunlight entering the atmosphere and the strength of the greenhouse effect together set the climate, and a change in either factor can trigger climate change.
"Due to nuclear reactions in stars, like our sun, over time they become brighter," adds co-author Dan Lunt, Professor of Climate Science at the University of Bristol.
"This means that, although carbon dioxide concentrations were high hundreds of millions of years ago, the net warming effect of CO2 and sunlight was less. Our new CO2 compilation appears on average to have gradually declined over time by about 3-4 ppm per million years. This may not sound like much, but it is actually just about enough to cancel out the warming effect caused by the sun brightening through time, so in the long-term it appears the net effect of both was pretty much constant on average," Lunt added.
"Up until now it's been a bit of a puzzle as to why, despite the sun's output having increased slowly over time, scant evidence exists for any similar long-term warming of the climate," co-author Professor Dana Royer, from Wesleyan University in the US elucidated.
"Our finding of little change in the net climate forcing offers an explanation for why Earth's climate has remained relatively stable, and within the bounds suitable for life for all this time," Royer explained further.
This long-term record also offers useful insights into future climate change: the rate at which the climate is currently changing far exceeds geological norms.
If humans burn through Earth's entire store of fossil fuels without curbing rising CO2 levels, atmospheric concentrations could reach 2,000 ppm by the year 2250.
"However, because the Sun was dimmer back then, climate change 200 million years ago was lower than we would experience now with such high levels of CO2. So not only will the resultant climate change be faster than anything the Earth has seen for millions of years, but the climate that will exist is likely to have no natural counterpart, as far as we can tell, in at least the last 420 million years."
This study has been published in the journal Nature Communications.