(Slight updates, 1st October and 30th September 2017. Thanks to Jim Stuttard for noticing that a link on Takens embeddings was closed to all but students. I have replaced that link, and added a link to an article about stable recovery of embeddings.)
I say “remarkable” because, for instance, their frequentist work on the series and their autocorrelations is filled with unchallenged assumptions about independence and about false discovery rates, especially in their highly imperfect appropriation of methods from gene sequence analysis. Their paper on El Niño is especially egregious, using a t-test in total neglect of the fact that they are performing many tests simultaneously. They do not address the problem of overfitting at all, and, being frequentists, they are obliged to do so.
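To make the multiple-comparisons point concrete, here is a minimal sketch (my own illustration, not the authors' code): running many independent t-tests on pure-noise data produces “discoveries” at roughly the nominal rate, while a false-discovery-rate correction such as Benjamini–Hochberg largely eliminates them.

```python
# Illustration: many t-tests on null (pure noise) data yield false
# positives at roughly the nominal 5% rate unless corrected.
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(42)
n_tests, n = 1000, 100

p_values = []
for _ in range(n_tests):
    a = rng.normal(size=n)   # both samples drawn from the
    b = rng.normal(size=n)   # same null distribution
    # Welch t statistic, two-sided p via normal approximation (n is large)
    t = (a.mean() - b.mean()) / sqrt(a.var(ddof=1)/n + b.var(ddof=1)/n)
    p_values.append(erfc(abs(t) / sqrt(2)))

naive_hits = sum(p < 0.05 for p in p_values)

# Benjamini-Hochberg step-up procedure: reject the k smallest p-values,
# where k is the largest i with p_(i) <= q * i / m.
p_sorted = np.sort(p_values)
ok = np.nonzero(p_sorted <= 0.05 * np.arange(1, n_tests + 1) / n_tests)[0]
bh_hits = int(ok[-1] + 1) if ok.size else 0

# naive_hits lands near 0.05 * n_tests; bh_hits lands near zero
print(naive_hits, bh_hits)
```

The uncorrected count hovers near 5% of the tests despite there being nothing to find; the corrected count is essentially zero.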
But far worse, in my opinion, is their advancement of what is Just Bad Science. In particular, their series analysis suffers from three blemishes.
First, there is no instance where their series-based explanation makes a falsifiable prediction. I challenge them to make one.
Second, while they offer a statistical explanation of the series, they have not advanced a physical mechanism for its realization, something essential both for taking it seriously as a hypothesis and for supporting additional scientific work based upon it. They cannot say what additional measurements are to be taken, or where. Indeed, from their perspective, taking additional measurements is somewhat pointless, due to the “chaotic” nature of outcomes.
Third, they embrace the popular understanding of “chaos” derived from the Lorenz setting rather than a technical definition, so the scientific reader really does not know from paragraph to paragraph what exactly they are talking about.
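The technical sense is checkable: “chaotic” means, for instance, a positive largest Lyapunov exponent, i.e., exponential divergence of nearby trajectories. A minimal sketch of my own, for the logistic map, showing the distinction is quantitative rather than rhetorical:

```python
# Largest Lyapunov exponent of the logistic map f(x) = r*x*(1-x),
# estimated as the orbit average of log|f'(x)|. Positive => chaotic
# in the technical sense; negative => periodic, not chaotic.
import numpy as np

def lyapunov_logistic(r, x0=0.3, n=100_000, burn=1_000):
    """Average log|f'(x)| along an orbit, after discarding a transient."""
    x, total = x0, 0.0
    for i in range(n + burn):
        if i >= burn:
            # tiny epsilon guards against log(0) at x = 1/2
            total += np.log(abs(r * (1.0 - 2.0 * x)) + 1e-300)
        x = r * x * (1.0 - x)
    return total / n

lam_chaotic = lyapunov_logistic(3.9)   # positive: sensitive dependence
lam_periodic = lyapunov_logistic(3.2)  # negative: stable 2-cycle
print(lam_chaotic, lam_periodic)
```

One number settles the question for a given system; no such computation appears in the work criticized here.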
Thus, I’d say their thesis is Not Even Wrong (*).
Update, 6th March 2015
So, I finally found a possible explanation of what Swanson and Tsonis mean by their ‘s__t happens’, even if they did not say that. They were so wrapped up in, and excited about, “internal variability” in climate that they did not allude to their paper co-authored with George Sugihara (**), which said, in part:
The lack of an oscillatory model signal suggests that the interdecadal global mean surface temperature signal derived from the observations and shown in Figs. 1A and 2B is indeed the signature of natural long-term climate variability. Removing this internal signature from the observed global mean temperature record should clean up the individual and unique realization of nature, isolating the forced climate signal. Fig. 3 shows that the resulting cleaned signal presents a nearly monotonic warming of the global mean surface temperature throughout the 20th century, and closely resembles a quadratic fit to the actual 20th century global mean temperature. Interdecadal 20th century temperature deviations, such as the accelerated observed 1910–1940 warming that has been attributed to an unverifiable increase in solar irradiance (4, 7, 19, 20), appear to instead be due to natural variability. The same is true for the observed mid-40s to mid-70s cooling, previously attributed to enhanced sulfate aerosol activity (4, 6, 7, 12). Finally, a fraction of the post-1970s warming also appears to be attributable to natural variability. The monotonic increase of the cleaned global temperature throughout the 20th century suggests increasing greenhouse gas forcing more-or-less consistently dominating sulfate aerosol forcing, although our technique cannot exclude other mechanisms not contained in the current generation of model forcing (22) … Second, theoretical arguments suggest that a more variable climate is a more sensitive climate to imposed forcings (13). Viewed in this light, the lack of modeled compared to observed interdecadal variability (Fig. 2B) may indicate that current models underestimate climate sensitivity. 
Finally, the presence of vigorous climate variability presents significant challenges to near-term climate prediction (25, 26), leaving open the possibility of steady or even declining global mean surface temperatures over the next several decades that could present a significant empirical obstacle to the implementation of policies directed at reducing greenhouse gas emissions (27). However, global warming could likewise suddenly and without any ostensive cause accelerate due to internal variability. To paraphrase C. S. Lewis, the climate system appears wild, and may continue to hold many surprises if pressed.
(Emphasis added, 30th September 2017.)
The latter figure’s quadratic is remarkably similar to a figure I generated and posted elsewhere here:
A similar result was reported by Comrie and McCabe in 2012, in “Global air temperature variability independent of sea-surface temperature influences”, where correcting tropospheric air temperatures for SSTs yielded a linearly increasing temperature over the 20th and the beginning of the 21st centuries, per
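As a hedged illustration of the “quadratic fit” idea in the quoted passage, here is a sketch on synthetic data (not any actual GISTEMP or HadCRUT record): a noisy series with an accelerating trend, fit with a degree-2 polynomial, as one would fit the cleaned global mean temperature.

```python
# Quadratic trend fit to a synthetic "cleaned" temperature anomaly.
# The series is hypothetical, constructed only to illustrate the fit.
import numpy as np

rng = np.random.default_rng(7)
years = np.arange(1900, 2001)
x = years - years[0]

# hypothetical anomaly: accelerating warming plus observation noise (deg C)
anomaly = 0.00008 * x**2 + 0.001 * x + rng.normal(scale=0.05, size=x.size)

coeffs = np.polyfit(x, anomaly, deg=2)   # [a, b, c] in a*x^2 + b*x + c
fitted = np.polyval(coeffs, x)
residual_sd = (anomaly - fitted).std()

print(coeffs, residual_sd)
```

The fitted quadratic coefficient recovers the built-in acceleration, and the residual scatter matches the injected noise level, which is the sense in which a century-long monotonic record “closely resembles a quadratic fit.”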
(*) Professor Peter Woit has a wonderful blog by this title which repeatedly finds instances of such, the phrase having been most prominently used by him in his book on superstring theory. A recent example, from a blog post titled “Advertisements for the Multiverse”:
It suggests that the answer to the question raised by all these different kinds of multiverse (“which one is true?”) can be answered by believing all multiverse models at once, no need to choose.
No mention of tedious things like dust. This multiverse is all new and shiny, slices, dices, provides every reality you could possibly want.
(**) I first encountered Professor Sugihara’s work in his famous “Detecting causality in complex ecosystems”, co-authored with Robert May, Hao Ye, Chih-hao Hsieh, Ethan Deyle, Michael Fogarty, and Stephan Munch. I have had professional reason to use his work, published in an article with Deyle as “Generalized theorems for nonlinear state space reconstruction”, as well as the recurrence quantification ideas based upon the embedding of Floris Takens, championed in “Detecting causality” and incorporated into its convergent cross mapping. Note that recovering these embeddings in a literal-minded way is a fragile process, but Yap and Rozell have shown how they can be recovered stably.
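For readers new to the idea, a minimal sketch of my own of the delay-coordinate (Takens) reconstruction that underlies convergent cross mapping — an illustration, not the Sugihara/Deyle code:

```python
# Delay-coordinate embedding: map a scalar series x(t) to E-dimensional
# vectors (x(t), x(t-tau), ..., x(t-(E-1)*tau)), which (per Takens)
# generically reconstruct the underlying attractor.
import numpy as np

def delay_embed(x, E=3, tau=1):
    """Return the E-dimensional, lag-tau delay reconstruction of x."""
    x = np.asarray(x, dtype=float)
    n = x.size - (E - 1) * tau
    cols = [x[i * tau : i * tau + n] for i in range(E)]
    return np.column_stack(cols[::-1])   # row t = (x(t+(E-1)tau), ..., x(t))

# Example observable: an orbit of the (chaotic) logistic map
x = np.empty(500); x[0] = 0.4
for t in range(499):
    x[t + 1] = 3.9 * x[t] * (1.0 - x[t])

M = delay_embed(x, E=3, tau=1)
print(M.shape)   # (498, 3)
```

Each row of `M` is one reconstructed state; cross mapping then asks whether neighborhoods in one variable’s reconstruction predict another variable’s values.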
Berner et al. do “Stochastic parameterization”, and this is related to the work by Ye, Beamish, Munch, Perretti, and colleagues on model-free forecasting.