HadCRUT4 and GISTEMP series filtered and estimated with simple RTS model

Happy Vernal Equinox! This post has been updated today with some of the equations which correspond to the models.

An assessment of whether or not there was a meaningful slowdown or “hiatus” in global warming was recently discussed by Tamino and others(i) (see comments beginning here) in response to a paper by Fyfe, Meehl, England, Mann, Santer, Flato, Hawkins, Gillett, Xie, Kosaka, and Swart, with Ed Hawkins explaining in a more expository piece. Tamino has done his assessment using incremental linear models, correcting for El Niño, with a successively more elaborate series of corrections. I did a quick assessment using a Rauch-Tung-Striebel (RTS) smoother from the dlm package of R, devised by Petris and Gilks, with a local level model, meaning one which admits only step changes. I have applied this same model elsewhere, for sea-level rise.

Taking each series separately, what this means is that the observation of the temperature anomaly at time t, denoted \mathring{y}_{t}, is based upon a “true anomaly” at time t, denoted x_{t}, with zero-mean Gaussian noise \mathcal{N}(0,\sigma^{2}_{y}) added to it:

\mathring{y}_{t} = x_{t} + \mathcal{N}(0, \sigma^{2}_{y})

The “true anomaly” is modeled as simply as it can possibly be, yet allowing for full non-stationarity, as a random walk with Gaussian steps, or an example of a Gaussian diffusion process:

x_{t+1} = x_{t} + \mathcal{N}(0, \sigma^{2}_{x})

The filter-smoother finds the most probable values of x_{t} given observations \mathring{y}_{t}. For the HadCRUT4 and GISTEMP temperature anomaly series, \sigma^{2}_{y} was estimated using a Politis-Romano stationary bootstrap (because the series data are serially dependent), giving \mathring{\sigma}^{2}_{y}. (The tsbootstrap function of the tseries package was used.) I arbitrarily set

\sigma^{2}_{x} = \sigma^{2}_{y} = \mathring{\sigma}^{2}_{y}

and obtained the following:

[Figure: RTS-smoothed HadCRUT4 and GISTEMP anomaly series with ±1 standard deviation bands, 2016-03-18.]
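For concreteness, here is a minimal sketch of the univariate fit in R, assuming anomaly holds one of the annual anomaly series. The variable names are hypothetical, and the choice of bootstrap statistic is an assumption; the posted tarball is authoritative.

library(tseries)  # tsbootstrap: Politis-Romano stationary bootstrap
library(dlm)      # dlmModPoly, dlmSmooth

# Stationary bootstrap of the series mean; the variance of the replicates
# serves here as the estimate of the observational noise variance. (This
# choice of statistic is an assumption; see the tarball for the actual one.)
boot <- tsbootstrap(anomaly, nb = 1000, statistic = mean, type = "stationary")
s2y  <- var(boot$statistic)

# Local level model: random walk state plus Gaussian observation noise,
# with the arbitrary choice sigma^2_x = sigma^2_y made above.
mod <- dlmModPoly(order = 1, dV = s2y, dW = s2y)

# Rauch-Tung-Striebel smoothing: most probable x_t given all observations.
sm   <- dlmSmooth(anomaly, mod)
xhat <- dropFirst(sm$s)   # smoothed states, dropping the t = 0 prior
sdev <- sqrt(unlist(dlmSvd2var(sm$U.S, sm$D.S)))[-1]  # for the ±1 sd bands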

Of interest in the above is the behavior in the 2000-2015 region. Sure, the mean trends suggest a slowing, but the ±1 standard deviation bands are quite broad, so the time-varying means don’t signify as much as they might. My question going into this is whether there is sufficient information in the observations to make proper distinctions.

We’ll see. For this is the first step of a more elaborate project which I am going to place at arXiv.org, one which overlaps with some of my professional work and my work on Town of Sharon hydrology, and involves new algorithms. That project will also address some of the material in a paper by Frankcombe, England, Mann, and Steinman. I do not expect the project to add to the geophysical discussion, but it is an excuse to devise new methods, and to see if the skepticism regarding the purported “hiatus” which I share with Tamino stands up to a deeper look. I also feel that the goal stated by Fyfe, et al, namely “to move beyond the purely statistical aspects of the slowdown, and to focus instead on improving process understanding and assessing whether the observed trends are consistent with our expectations based upon climate models”, deserves some response. Kalman smoother state-space models are famous for allowing the introduction of arbitrarily complicated explanatory mechanisms for process, even if they only encapsulate superficial physics. There is also a question of whether something as specific as evidence for a purported slowdown requires the entire CMIP5-like apparatus to be established. Some argue that, in some cases, model-free forecasting can outperform mechanistic models (Perretti, Munch, Sugihara). Also, there are new non-parametric methods available which permit algorithms to find their own models using Bayesian principles and stochastic search. It will be interesting to see what these can do with these data.

I would also like to see assessments of model fit done using quantitative measures, such as the many available information criteria(ii), rather than only by comparing successive model variants. I’ll endeavor to do some of that in the sequel.

By the way, the code to do the above is included in this tarball, along with the data. Just be sure to unpack it into a single directory and setwd to there before trying to run.
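For anyone unfamiliar with that R incantation, something like the following does it; the file names here are hypothetical, so substitute those of the downloaded tarball:

untar("hadcrut4-gistemp-rts.tar", exdir = "rts-work")  # unpack into one directory
setwd("rts-work")                                      # make it the working directory
source("hadcrut4-gistemp-rts.R")                       # run the analysis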

Update, 20th March 2016

The value of the variance \sigma^{2}_{x} for the process noise in the above was arbitrarily chosen to be the same as the empirically estimated observational variance \mathring{\sigma}^{2}_{y}, in the separate cases of the HadCRUT4 series and GISTEMP. Moreover, the two time series are analyzed and depicted separately, even though they are practically measurements of the same thing. In fact, their correlation coefficient is in excess of 0.98 over their common run. Accordingly, the true temperature anomaly should be estimable using both in combination, rather than portraying it twice, since presumably both reflect the true anomaly series x_{t}. This update provides one view of that.

In particular, the model was changed to have a bivariate observation series, the first component from HadCRUT4, the second from GISTEMP:

\mathbf{y}_{t} = \left[\begin{array}{c}1 \\ 1\end{array}\right] x_{t} + \mathbf{\mathcal{N}}(\mathbf{0}, \boldsymbol\Sigma_{y})

As before:

x_{t+1} = x_{t} + \mathcal{N}(0, \sigma^{2}_{x})

Here \boldsymbol\Sigma_{y} is the covariance matrix between the two series, and is constructed from empirical measurements. Let {}_{H}\mathring{\sigma}^{2}_{y} be the empirical variance estimated using tsbootstrap for the HadCRUT4 series on its common support with GISTEMP, and let {}_{G}\mathring{\sigma}^{2}_{y} be the corresponding empirical variance for the GISTEMP series. Let {}_{H}\rho_{G} be the empirically derived (Kendall) correlation coefficient between the two series, the 0.98 value mentioned above. Then the covariance {}_{H}\sigma_{G} between the two series is obtainable through the relation:

{}_{H}\sigma_{G} = {}_{H}\rho_{G} \, {}_{H}\sigma_{y} \, {}_{G}\sigma_{y}

Substituting {}_{H}\mathring{\sigma}^{2}_{y} for {}_{H}\sigma^{2}_{y}, {}_{G}\mathring{\sigma}^{2}_{y} for {}_{G}\sigma^{2}_{y}, and {}_{H}\mathring{\rho}_{G} for {}_{H}\rho_{G} gives the empirical covariance {}_{H}\mathring{\sigma}_{G}.

Then

\boldsymbol\Sigma_{y} = \left[ \begin{array}{lr} {}_{H}\mathring{\sigma}^{2}_{y} & {}_{H}\mathring{\sigma}_{G} \\ {}_{H}\mathring{\sigma}_{G} & {}_{G}\mathring{\sigma}^{2}_{y} \end{array} \right] \approx \left[ \begin{array}{lr} 0.09526892484 & 0.00064632259 \\ 0.00064632259 & 0.08516777122 \end{array} \right]
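In R, assembling that matrix might look like the following sketch, assuming had and gis are the two series restricted to their common support and s2y.H, s2y.G are their tsbootstrap variance estimates (the names are hypothetical):

# Kendall correlation between the two series, the ~0.98 value above.
rho.HG <- cor(had, gis, method = "kendall")

# Covariance via sigma_HG = rho_HG * sigma_H * sigma_G.
cov.HG <- rho.HG * sqrt(s2y.H) * sqrt(s2y.G)

Sigma.y <- matrix(c(s2y.H,  cov.HG,
                    cov.HG, s2y.G), nrow = 2, byrow = TRUE)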

The process noise variance, \sigma^{2}_{x} is not known, so instead of picking a value for it arbitrarily, it was estimated here using maximum likelihood as sketched in Petris, Petrone, and Campagnoli(iii), using their dlmMLE function. That produces \sigma^{2}_{x} \approx 0.0013466857.
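A sketch of that maximum likelihood step, assuming y is the n×2 matrix cbind(had, gis) and Sigma.y is as constructed above: only the process variance is free, and it is estimated on the log scale to keep it positive.

library(dlm)

build <- function(parm) {
  dlm(FF = matrix(1, nrow = 2, ncol = 1),  # both series observe the one state x_t
      V  = Sigma.y,                        # observational covariance, held fixed
      GG = matrix(1),                      # random walk state transition
      W  = matrix(exp(parm)),              # process variance, the free parameter
      m0 = 0, C0 = matrix(1e7))            # diffuse initial state
}

fit <- dlmMLE(y, parm = log(0.001), build = build)
s2x <- exp(fit$par)                        # ~ 0.0013466857 per the text
sm  <- dlmSmooth(y, build(fit$par))        # combined RTS estimate of x_t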

Two cases were considered. The first is the estimate of the combined temperature anomaly series with the input data restricted to the common support. The second is the same, except with a surrogate 2016 temperature anomaly of 0.894 added in, a value which has not yet been observed.
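For the second case, the surrogate can simply be appended to the data matrix; entering it in both components is an assumption of this sketch:

# Surrogate 2016 anomaly, supplied to both components (an assumption here).
y.with.2016 <- rbind(y, c(0.894, 0.894))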

The result of the first is:

[Figure: Combined RTS estimate of the common anomaly x_t (bold red) over the HadCRUT4 and GISTEMP series, common support, without the 2016 surrogate.]

The HadCRUT4 and GISTEMP series are shown as before, but the bold red shows the estimate for x_{t} in the common model. The second-order Akaike information criterion for this dataset and model is 604.1, a criterion useful for model comparison(iv).
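For reference, AICc can be computed from the dlm negative log-likelihood in a few lines. Counting k = 1 free parameter (just \sigma^{2}_{x}) is an accounting assumption here, and since dlmLL omits constant terms, values computed this way are comparable with each other though possibly offset from figures computed with the constants included.

negLL <- dlmLL(y, build(fit$par))  # negative log-likelihood, up to a constant
n <- nrow(y)
k <- 1                             # free parameters: just sigma^2_x (assumed)
AICc <- 2 * negLL + 2 * k + (2 * k * (k + 1)) / (n - k - 1)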

The second includes that 0.894 degree C anomaly estimate for 2016, and is shown as:
[Figure: As above, with the surrogate 0.894°C anomaly for 2016 included.]

The second-order Akaike information criterion (“AICc”) for this dataset and model is 591.4, slightly better than before.

Note that when the 2016 surrogate is provided, the combined estimate ever so slightly increases its slope. With the maximum likelihood estimate, the most recent uptick in global temperatures is judged, at least by the RTS filter, as if it were an outlier. More observations will, apparently, be needed before its mind is changed. Still, if the surrogate is included, the inferred curve leading up to its 2015 value is very nearly linear, in contrast with the gentle downward kink of the solution without it. To the degree to which we accept the inference of the combined model, there is very little indication of a slowdown in warming, agreeing with Tamino, and disagreeing with Fyfe and company. But there are ways of improving this analysis.

The code to do this bit is included in this tarball, along with the data. As before, be sure to unpack it into a single directory and setwd to there before trying to run.

The use of the empirically-derived \boldsymbol\Sigma_{y} in the above seems a little silly. Maximum likelihood is being used to estimate the process variance \sigma^{2}_{x}, so why not use it to estimate \boldsymbol\Sigma_{y} as well? Because doing so results in a poorer fit. Here

\boldsymbol\Sigma_{y} = \left[ \begin{array}{lr} {}_{H}\sigma^{2}_{y} & {}_{H}\sigma_{G} \\ {}_{H}\sigma_{G} & {}_{G}\sigma^{2}_{y} \end{array} \right] = \left[ \begin{array}{lr} 0.0053155162 & 0.0035482156 \\ 0.0035482156 & 0.0117052488 \end{array} \right]

and \sigma^{2}_{x} \approx 0.0028928208, but the second-order Akaike information criterion is now a whopping 982.7, so it’s a loss. With the 2016 surrogate the model gets even worse with the AICc being 990.4.
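For completeness, a sketch of how that all-MLE variant might be set up, with \boldsymbol\Sigma_{y} parameterized through a log-Cholesky factor so the optimizer keeps it positive definite; the parameterization and starting values are assumptions, the posted code being authoritative.

build.full <- function(parm) {
  # Lower-triangular Cholesky factor of Sigma_y, diagonal on the log scale.
  L <- matrix(c(exp(parm[1]), 0,
                parm[2],      exp(parm[3])), nrow = 2, byrow = TRUE)
  dlm(FF = matrix(1, 2, 1), V = L %*% t(L),
      GG = matrix(1), W = matrix(exp(parm[4])),
      m0 = 0, C0 = matrix(1e7))
}
fit.full <- dlmMLE(y, parm = c(log(0.1), 0, log(0.1), log(0.001)),
                   build = build.full)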

In another update to this post, the calculation for \sigma^{2}_{x} will be repeated but using a Bayesian computational technique, Gibbs sampling.

The code to do this bit with the MLE is included in this tarball, along with the data. As before, be sure to unpack it into a single directory and setwd to there before trying to run.

And, as promised, I’ll dig deeper still.


(i) To be complete, Tamino has repeatedly addressed this question in many earlier posts.

(ii) Briefer treatments are also available here and here.

(iii) See G. Petris, S. Petrone, P. Campagnoli, Dynamic Linear Models with R, Springer, 2009, Section 4.1, for details.

(iv)See K. P. Burnham, D. R. Anderson, Model Selection and Multimodel Inference: A Practical Information Theoretic Approach, 2nd edition, Springer, 2002.
