Patrick Brown and Ken Caldeira dropped a bombshell into the recent (7 Dec 2017) issue of Nature, and the repercussions are echoing around the scientific world. (See, for example, the related article in MIT’s Technology Review.) To be crisp: current emissions trajectories, and a new understanding regarding climate sensitivity, suggest Earth is on a path to warm over pre-industrial by end of century by more than the consensus projections indicate.
That emissions have not abated is no secret. It seems to need constant repeating, but recall that what matters for climate (radiative) forcing is the amount of cumulative emissions. That’s because CO2 takes a long time to scrub from the atmosphere, so, if there are any emissions at all, this cumulative amount keeps building up, even if only 30% of total emissions remain in the atmosphere. (The rest are still in the climate system, just in oceans and soils.)
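To see why cumulative emissions dominate, here is a minimal sketch in Python. The flat 10 GtC/yr emissions path and the 30% airborne fraction are illustrative assumptions echoing the paragraph above, not data:

```python
# Sketch: why cumulative emissions are what matter for forcing.
# Illustrative numbers only; the 30% airborne fraction and the flat
# 10 GtC/yr emissions path are assumptions, not observations.

AIRBORNE_FRACTION = 0.30        # share of emitted CO2 that stays in the atmosphere
EMISSIONS_GTC_PER_YEAR = 10.0   # assumed constant (non-growing!) emissions

atmospheric_burden = 0.0        # cumulative GtC added to the atmosphere
for year in range(1, 51):
    atmospheric_burden += AIRBORNE_FRACTION * EMISSIONS_GTC_PER_YEAR
    # Nothing here ever subtracts, because CO2 is scrubbed so slowly:
    # even flat emissions make the burden climb without limit.

print(f"After 50 years: {atmospheric_burden:.0f} GtC added to the atmosphere")
```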
What Brown and Caldeira have contributed is a look at the large set of climate models used for climate forecasting, asking which of the set were most successful at predicting present conditions given conditions in the recent past. There are a large number of these models, between three dozen and five dozen, depending upon how one counts and upon their vintage. Some are better at certain aspects of climate than others. When forecasts are made, the models are run as an ensemble, meaning that every model gets a crack at the global conditions obtained from observations-to-date, and is then run forward to project how things will be, given concentrations of CO2 in the atmosphere, a rate of volcanic eruption, and so on. The UNFCCC forecasts and the U.S. National Climate Assessment (NCA) forecasts are based upon these ensemble runs. The forecasts are obtained as a weighted average of the ensembled outputs.
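As a toy illustration of that last step, here is a sketch of a weighted ensemble average; the projections and weights are made up for illustration, standing in for the dozens of real ensemble members:

```python
import numpy as np

# Hypothetical end-of-century warming projections (degrees C), one per model.
projections = np.array([3.2, 4.1, 3.7, 4.5, 3.9])
weights = np.array([1.0, 1.0, 1.0, 1.0, 1.0])   # unweighted: every model counts equally

print(f"Unweighted ensemble mean: {np.average(projections, weights=weights):.2f} C")

# Up-weighting the models judged more skillful shifts the forecast:
skill_weights = np.array([0.5, 2.0, 1.0, 2.0, 0.5])
print(f"Skill-weighted mean: {np.average(projections, weights=skill_weights):.2f} C")
```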
Recall that these climate models have been much maligned by people and groups who doubt that climate disruption poses a serious risk, or any risk at all, to humanity and its economies. One claim is that the long-run projections of the climate model runs can’t be trusted because the ensemble does not do that great a job of predicting today from the recent history of observations. That much is correct.
However, in a nutshell (see their paper for more, and the figure below, taken from that paper), what Brown and Caldeira did was emphasize the contributions to temperature prediction of the subset of models which are most skillful at predicting the present from recent observations. They used a technique called multivariate partial least squares regression. What they found was that, with that weighting, the predictions ran hotter than those from the entire ensemble without such a technique.
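For the curious, here is a minimal sketch of that kind of skill-weighted calculation, using scikit-learn’s PLSRegression on entirely synthetic stand-in data; the numbers and variable names are invented for illustration and are not from the paper:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Synthetic stand-ins: 36 models, each summarized by 100 "present-day"
# predictor values (think gridded top-of-atmosphere energy fluxes),
# plus that model's projected end-of-century warming. All fabricated;
# see Brown & Caldeira (2017) for the real predictors and targets.
n_models, n_features = 36, 100
warming = rng.normal(4.0, 0.5, n_models)    # each model's projected warming, degrees C
X = rng.normal(0, 1, (n_models, n_features))
X[:, 0] += warming                          # one present-day feature tracks future warming
y = warming + rng.normal(0, 0.1, n_models)

pls = PLSRegression(n_components=3)
pls.fit(X, y)

# Pretend "observations" of the present-day field, then ask the fitted
# relationship what warming the observation-consistent models imply.
obs = rng.normal(0, 1, (1, n_features))
obs[0, 0] += 4.5                            # observations resemble the hotter models
constrained = float(pls.predict(obs).ravel()[0])
print(f"Observationally informed projection: {constrained:.2f} C")
```

The point of the sketch: models whose simulated present looks most like the observed present get, in effect, more say in the projection, and if those models run hot, the constrained projection runs hot too.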
[Figure from Brown and Caldeira (2017), Nature.]
This has two major implications, if this result holds up.
First, it means that the endless arguments about what is a good estimate for climate sensitivity have found a powerful resolution: Sensitivity looks high. If that is not the case, then the otherwise unreasonable skill of certain models at predicting the present needs to be explained.
Second, if we’ve really moved to that kind of warming by 2100, then:
- climate bifurcations are well within possibility, and the projections of University of Exeter Professor Tim Lenton, based upon his analysis of observations, look prescient.
- Carbon dioxide removal (CDR), perhaps using techniques like those pioneered by Professor Klaus Lackner, looks increasingly necessary, despite its outrageous expense and the multi-century timescale over which these operations must endure. The question of moral hazard from these technologies is rapidly being eclipsed by the fact that the world has not done enough to keep us out of serious trouble.
Note there’s No Free Lunch for fossil fuel emitters here, even granted CDR. Because it is so expensive to scrub CO2 from the climate system, CDR only makes economic sense if emissions are zeroed as rapidly as possible. Even with no emissions from combustion or from energy for transport, production, harvesting, and the rest, just feeding people on the planet will release something like 2 GtC per year.
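Some back-of-envelope arithmetic on what scrubbing even that agricultural floor would cost; the $200 per tonne of CO2 is a hypothetical direct-air-capture price chosen only for illustration, since published estimates vary widely:

```python
# Back-of-envelope CDR arithmetic. The $/tonne figure is an assumption;
# the 2 GtC/yr for feeding people is the figure quoted above.
# GtC -> GtCO2 conversion uses the molar mass ratio 44/12.

GTC_TO_GTCO2 = 44.0 / 12.0
AG_EMISSIONS_GTC = 2.0               # just feeding people, per the text
ASSUMED_COST_USD_PER_TCO2 = 200.0    # hypothetical direct-air-capture cost

gtco2_per_year = AG_EMISSIONS_GTC * GTC_TO_GTCO2
annual_cost = gtco2_per_year * 1e9 * ASSUMED_COST_USD_PER_TCO2

print(f"{gtco2_per_year:.1f} GtCO2/yr to scrub from agriculture alone")
print(f"~${annual_cost / 1e12:.1f} trillion per year at the assumed cost")
```

And that is the recurring bill for agriculture alone, forever, before counting any of today’s fossil emissions.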
There is a nice Abstract of the finding by Dr Brown here:
The above is a lecture by Professor Lenton on bifurcations in the climate system, and on the clues which suggest one is approaching, a trajectory which, hopefully, can be reversed.
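One of those clues is critical slowing down: as a system nears a fold bifurcation, its recovery from small shocks slows, and the lag-1 autocorrelation of its fluctuations creeps toward one. Here is a minimal synthetic demonstration, with made-up AR(1) dynamics standing in for real climate observations:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic series whose "memory" phi drifts toward 1, mimicking a
# system approaching a tipping point. Parameters are illustrative.
n = 2000
phi = np.linspace(0.2, 0.98, n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi[t] * x[t - 1] + rng.normal(0, 1)

def lag1_autocorr(window):
    """Lag-1 autocorrelation of a 1-D series segment."""
    return np.corrcoef(window[:-1], window[1:])[0, 1]

# The early-warning indicator rises window by window as the
# bifurcation nears, which is the "clue" Lenton describes.
width = 400
for start in range(0, n - width + 1, width):
    seg = x[start:start + width]
    print(f"t={start:4d}-{start + width:4d}: lag-1 autocorrelation = {lag1_autocorr(seg):.2f}")
```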
These other outlets have covered this story, too, in addition to Technology Review:
In net, it looks like the IPCC may have underestimated future warming due to climate change.