tripleplus ungood: Long-run hot climate models are also the most accurate at reproducing today and the recent past


Patrick Brown and Ken Caldeira dropped a bombshell into the recent (7 Dec 2017) issue of Nature, and the repercussions are echoing around the scientific world. (See, for example, the related article in MIT’s Technology Review.) In brief, given current emissions trajectories and new understanding of climate sensitivity, Earth looks to be on a path to reach +5°C over pre-industrial by end of century.

That emissions have not abated is no secret. It seems to need constant repeating, but what matters for climate (radiative) forcing is the amount of cumulative emissions. That’s because CO2 takes a long time to scrub from the atmosphere, so, if there are any emissions at all, this cumulative amount keeps building up, even if only about 30% of total emissions remain in the atmosphere. (The rest are still in the climate system, but in oceans and soils.)
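The point about cumulative emissions can be made concrete with a toy calculation. All the numbers below (starting emissions, decline rate, the 30% airborne fraction from the paragraph above) are illustrative stand-ins, not measurements: the airborne stock keeps rising as long as annual emissions stay above zero, even when they are falling.

```python
# Toy illustration: the airborne CO2 stock grows as long as emissions > 0,
# even with annual emissions declining. All numbers are hypothetical.
airborne_fraction = 0.3  # share of emitted CO2 staying in the atmosphere (from the post)
emissions = [10.0 * 0.95 ** year for year in range(50)]  # GtC/yr, shrinking 5%/yr (made up)

stock = 0.0
stocks = []
for e in emissions:
    stock += airborne_fraction * e  # stock only ever increases while e > 0
    stocks.append(stock)

# The cumulative airborne stock rises every single year despite falling emissions.
print(all(later > earlier for earlier, later in zip(stocks, stocks[1:])))  # → True
```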

What Brown and Caldeira have contributed is a look at the large set of climate models used for climate forecasting, asking which of the set were most successful at predicting present conditions given conditions in recent times. There are a large number of these models, between three dozen and five dozen, depending upon how one counts and their vintage. Some are better at certain aspects of climate than others. When forecasts are made, the models are run in an ensemble fashion, meaning that all the models get a crack at the global conditions, obtained from observations to date, and then are run forward to project how things will be, given concentrations of CO2 in the atmosphere, a rate of volcanic eruption, and so on. The UNFCCC forecasts and the U.S. National Climate Assessment (NCA) forecasts are based upon these ensemble runs. The forecasts are obtained as a weighted average of the ensembled outputs.
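The weighted-average step is simple to sketch. The projections and weights below are entirely made up for illustration; they are not the paper's values or any real model output.

```python
import numpy as np

# Hypothetical toy: three "model" projections of warming by 2100 (degrees C)
# and skill weights summing to 1. The ensemble forecast is their weighted mean.
projections = np.array([3.2, 4.1, 4.8])  # made-up model outputs
weights = np.array([0.2, 0.3, 0.5])      # made-up skill weights

forecast = np.average(projections, weights=weights)
print(round(forecast, 2))  # → 4.27
```

Equal weights recover the plain ensemble mean; skewing weight toward the more skillful models is the essence of what changes the headline number.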

Recall that these climate models have been much maligned by people and groups who doubt that climate disruption poses a serious risk, or any risk, to humanity and its economies. One claim is that the long-run projections of the climate model runs can’t be trusted because the ensemble does not do that great a job of predicting today from the recent history of observations. That much is correct.

However, in a nutshell (see their paper for more, and the figure below, taken from that paper), what Brown and Caldeira did was to emphasize the contributions to temperature prediction of the subset of models which are the most skillful at predicting the present based upon recent observations. They used a technique called multivariate partial least squares regression. What they found was that, with that weighting, the predictions ran hotter than those using the entire ensemble without such a technique.
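As a flavor of the statistical machinery, here is a minimal one-component partial least squares regression (a NIPALS-style sketch), assuming a setup loosely analogous to the paper's: rows are models, predictor columns are hindcast skill metrics, and the response is projected warming. All data here are random stand-ins; the paper's actual multivariate analysis is far more involved.

```python
import numpy as np

# Minimal one-component PLS1 regression (NIPALS sketch). X holds hypothetical
# per-model skill metrics; y holds hypothetical projected warming. None of
# these numbers come from Brown and Caldeira's paper.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 5))  # 40 fake "models", 5 fake skill metrics
y = X @ np.array([0.5, -0.2, 0.0, 0.1, 0.3]) + 0.05 * rng.normal(size=40)

Xc, yc = X - X.mean(axis=0), y - y.mean()  # center predictors and response
w = Xc.T @ yc
w /= np.linalg.norm(w)                     # PLS weight vector (max covariance with y)
t = Xc @ w                                 # scores: the latent component
b = (t @ yc) / (t @ t)                     # regress y on the scores

y_hat = y.mean() + b * ((X - X.mean(axis=0)) @ w)
print(np.corrcoef(y, y_hat)[0, 1] > 0.9)   # one component captures most of the signal
```

Unlike ordinary least squares, PLS builds components that maximize covariance with the response, which is useful when predictors (skill metrics) are many and correlated.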



This has two major implications, if this trend continues.

First, it means that the endless arguments about what is a good estimate for climate sensitivity have found a powerful resolution: sensitivity looks high. If it is not, then the otherwise unreasonable skill of certain models at predicting the present needs to be explained.

Second, if we’ve really moved to the vicinity of +5°C by 2100, then:

  1. climate bifurcations are well within possibility, and the projections of University of Exeter Professor Tim Lenton, based upon his analysis of observations, look prescient.
  2. Carbon dioxide removal (CDR), perhaps using techniques like those pioneered by Professor Klaus Lackner, looks increasingly necessary, despite its outrageous expense and the multi-century timescale over which these operations must endure. The question of moral hazard posed by these technologies is rapidly being eclipsed by the fact that the world has not done enough to keep us out of serious trouble.

Note there’s No Free Lunch for fossil fuel emitters here, even granted CDR. Because it is so expensive to scrub CO2 from the climate system, CDR only makes economic sense if emissions are zeroed as rapidly as possible. Even then, with no emissions from combustion or energy for transport, production, harvesting, etc., just feeding people on the planet will release something like 2 GtC per year.
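For scale, the 2 GtC per year figure is stated in mass of carbon; CDR costs are usually quoted per tonne of CO2, so it helps to convert using the molar mass ratio of CO2 (44 g/mol) to carbon (12 g/mol):

```python
# Convert the post's ~2 GtC/yr food-system figure to mass of CO2,
# using the CO2-to-C molar mass ratio 44/12.
gtc_per_year = 2.0
gtco2_per_year = gtc_per_year * 44.0 / 12.0
print(round(gtco2_per_year, 2))  # → 7.33 GtCO2/yr
```

So even a zero-combustion world would leave on the order of 7 GtCO2 per year for CDR to chase.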

There is a nice abstract of the finding by Dr Brown here:

The above is a lecture by Professor Lenton on bifurcations in the climate system, and on the clues which suggest one is approaching, a trajectory which hopefully can still be reversed.

These other outlets have covered this story, too, in addition to Technology Review:

In net, it looks like the IPCC may have underestimated future warming trends due to climate change.

