dynamic linear model applied to sea-level-rise anomalies

I spent much of the day working up a function for level + trend dynamic linear modeling, based upon the dlm package by Petris, Petrone, and Campagnoli, while trying out some calculations and code for regime shift detection. One of the test cases I used was the sea level rise anomaly series produced by the CU Sea Level Research Group at the University of Colorado. That data looks like:
SLR_original_data_2015-12-11_202752

and is taken from this location.

The result of a DLM smooth looks like:
SLR_anomaly_2015-12-11_203050

Initial estimates for the observational (co)variance, the state drift covariance, and the initial state covariance were based upon multiples of the variance of the series, not corrected for serial correlation. Strictly speaking, that correction should be done, but as the covariance matrices are only estimates, I did not think it necessary. I typically do it using the Politis and Romano stationary bootstrap, but did not in this study. The objective wasn’t to get a handle on SLR so much as to write and test code. Moreover, various multiples of these were tried in a set of by-hand runs, to see their effects upon the resulting smoothed trajectory.

In particular, the observational (co)variance was modelled as twice the variance of the series. For the state drift covariance, I used the variance of the series for the level term and unity for the trend term, on the diagonal. For the initial state covariance, I used ten times the variance of the series for the level term and, again, unity for the trend term.
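The setup above can be sketched compactly. The following is a minimal Python stand-in for the R dlm code (not the post's actual script): a Kalman filter with a Rauch–Tung–Striebel smoother for the local level + trend model, using the covariance choices just described. The function name, the toy series, and the seed are my own for illustration.

```python
import numpy as np

def dlm_smooth(y, V, W, m0, C0):
    """Kalman filter + RTS smoother for a local level + trend DLM.

    State x_t = [level, trend]; evolution G = [[1, 1], [0, 1]];
    observation y_t = level_t + noise with variance V.
    """
    n = len(y)
    G = np.array([[1.0, 1.0], [0.0, 1.0]])
    F = np.array([1.0, 0.0])
    m = np.zeros((n, 2)); C = np.zeros((n, 2, 2))   # filtered means/covs
    a = np.zeros((n, 2)); R = np.zeros((n, 2, 2))   # one-step-ahead means/covs
    mp, Cp = np.asarray(m0, float), np.asarray(C0, float)
    for t in range(n):
        a[t] = G @ mp
        R[t] = G @ Cp @ G.T + W
        Q = F @ R[t] @ F + V                # innovation variance
        K = R[t] @ F / Q                    # Kalman gain (2-vector)
        m[t] = a[t] + K * (y[t] - F @ a[t])
        C[t] = R[t] - np.outer(K, K) * Q
        mp, Cp = m[t], C[t]
    s, S = m.copy(), C.copy()               # smoothed means/covs
    for t in range(n - 2, -1, -1):          # RTS backward pass
        B = C[t] @ G.T @ np.linalg.inv(R[t + 1])
        s[t] = m[t] + B @ (s[t + 1] - a[t + 1])
        S[t] = C[t] + B @ (S[t + 1] - R[t + 1]) @ B.T
    return s, S

# Covariance choices from the text, scaled by the series variance:
y = np.cumsum(np.random.default_rng(0).normal(0.3, 1.0, 200))  # toy series
v = np.var(y)
s, S = dlm_smooth(y,
                  V=2.0 * v,                    # observational variance
                  W=np.diag([v, 1.0]),          # state drift covariance
                  m0=[y[0], 0.0],
                  C0=np.diag([10.0 * v, 1.0]))  # initial state covariance
band = np.sqrt(S[:, 0, 0])  # one-standard-deviation envelope on the level
```

In the dlm package itself, `dlmModPoly(order = 2)` with `dlmSmooth` plays the same role; the above just makes the moving parts explicit.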

The times for observations were re-registered by migration so as to be equispaced across the duration over which observations were taken. This was done using an Akima interpolating spline of 3rd degree, averaging values at tied times, and using Akima's improved method. (See the aspline function in the R akima package.)
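A sketch of this migration step, with SciPy's `Akima1DInterpolator` standing in for R's `akima::aspline` (the plain Akima method here, not the "improved" variant, which newer SciPy versions expose as `method="makima"`); the `reregister` name and arguments are my own:

```python
import numpy as np
from scipy.interpolate import Akima1DInterpolator

def reregister(t, y, n=None):
    """Migrate irregularly spaced observations onto an equispaced grid
    spanning the same duration, averaging tied time stamps first."""
    t = np.asarray(t, float)
    y = np.asarray(y, float)
    tu, inv = np.unique(t, return_inverse=True)          # sorted unique times
    yu = np.bincount(inv, weights=y) / np.bincount(inv)  # mean of any ties
    grid = np.linspace(tu[0], tu[-1], n if n is not None else len(tu))
    return grid, Akima1DInterpolator(tu, yu)(grid)
```

The equispaced grid keeps the same endpoints as the original record, so no extrapolation is involved.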

The envelope about the series shows a one-standard-deviation confidence band. Migrated data are shown as squares; original data are shown as triangles.

I also wrote a stochastic search algorithm to look for better settings around the values chosen. It proved improper to limit this optimizer to maximizing the classical log-likelihood, since that made the states too sensitive to individual observations, so a regularizer was added to the utility function, one which penalizes excessive magnitudes of second derivatives. Unfortunately, this introduced yet another parameter, a smoothing coefficient, and I concluded that this algorithm was going down a frequentist rathole.
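The shape of that search can be sketched as follows, with second differences as the discrete analogue of second derivatives. This is a generic hill-climbing skeleton, not the post's actual optimizer; all names are mine, and `smooth_fn` is a hypothetical hook that returns a log-likelihood and a smoothed level for a candidate parameter vector.

```python
import numpy as np

rng = np.random.default_rng(7)

def second_diff_penalty(level):
    """Regularizer: penalize large second differences of the smoothed level."""
    return np.sum(np.diff(level, n=2) ** 2)

def utility(params, y, smooth_fn, lam):
    """Penalized utility: log-likelihood minus a roughness penalty.
    smooth_fn(params, y) must return (loglik, smoothed_level)."""
    loglik, level = smooth_fn(params, y)
    return loglik - lam * second_diff_penalty(level)

def stochastic_search(params0, y, smooth_fn, n_iter=200, scale=0.1, lam=1.0):
    """Random local search: perturb parameters multiplicatively (log-normal
    steps, keeping variances positive) and keep any candidate that improves
    the penalized utility."""
    best = np.asarray(params0, float)
    best_u = utility(best, y, smooth_fn, lam)
    for _ in range(n_iter):
        cand = best * np.exp(scale * rng.standard_normal(best.shape))
        u = utility(cand, y, smooth_fn, lam)
        if u > best_u:
            best, best_u = cand, u
    return best, best_u
```

The extra knob is visible right in the signature: `lam` trades likelihood against smoothness, and nothing in the data picks it for you.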

The other direction I might have taken was applying information criteria to winnow alternative models. For instance, does a level step suffice? What does the trend term bring to the model? Good things to consider for future work!
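As a toy version of that comparison, here is an AIC check of a level-only model against a level + trend model, using ordinary least-squares fits as stand-ins for the two DLM variants; the data and function names are mine for illustration.

```python
import numpy as np

def gaussian_aic(y, fitted, k):
    """AIC = 2k - 2*loglik, with the Gaussian ML variance (RSS/n) plugged in."""
    n = len(y)
    rss = np.sum((y - fitted) ** 2)
    return 2 * k + n * (np.log(2 * np.pi * rss / n) + 1)

t = np.arange(100.0)
y = 0.05 * t + np.random.default_rng(1).normal(0.0, 0.5, 100)  # toy trend data

aic_level = gaussian_aic(y, np.full_like(y, y.mean()), k=2)    # level only
coef = np.polyfit(t, y, 1)                                     # level + trend
aic_trend = gaussian_aic(y, np.polyval(coef, t), k=3)
```

With a genuine trend in the data, the trend model's lower AIC survives its extra parameter; on trendless data the penalty tips the balance the other way.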

