In Monte Carlo We Trust
Matt Asher’s statistics blog, actually titled the “Probability and Statistics Blog”; its subtitle is much more appealing. Asher has a Manifesto at http://www.statisticsblog.com/manifesto/.
Why "naive Bayes" is not Bayesian
Explains why the so-called “naive Bayes” classifier is not Bayesian. The setup is okay, but estimating probabilities by relative frequencies instead of using Dirichlet conjugate priors, or integrating over the posterior, strays from The Path.
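The distinction is easy to see in code. Below is a minimal sketch (my own, not from the post) contrasting the plain relative-frequency estimate with the posterior mean under a symmetric Dirichlet prior; the function name and toy vocabulary are invented for illustration:

```python
from collections import Counter

def class_conditional_probs(tokens, vocab, alpha=0.0):
    """Estimate P(word | class) from a list of observed tokens.

    alpha = 0 gives the relative-frequency (maximum likelihood)
    estimate; alpha > 0 is the posterior mean under a symmetric
    Dirichlet(alpha) prior (alpha = 1 is Laplace smoothing).
    """
    counts = Counter(tokens)
    denom = len(tokens) + alpha * len(vocab)
    return {w: (counts[w] + alpha) / denom for w in vocab}

vocab = ["spam", "ham", "eggs"]
tokens = ["spam", "spam", "ham"]

mle = class_conditional_probs(tokens, vocab, alpha=0.0)
bayes = class_conditional_probs(tokens, vocab, alpha=1.0)

print(mle["eggs"])    # 0.0: an unseen word zeroes out the whole likelihood
print(bayes["eggs"])  # 1/6: the Dirichlet prior keeps every word strictly positive
```

The prior matters precisely for words never seen in training: the frequency estimate assigns them probability zero, which annihilates the product of likelihoods, while the Dirichlet posterior mean does not.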
Hermann Scheer was a visionary, a major figure who thought deep thoughts about energy and its implications for humanity’s relationship with physical reality.
Darren Wilkinson's introduction to ABC
Darren Wilkinson’s introduction to approximate Bayesian computation (“ABC”). See also his post on summary statistics for ABC: https://darrenjw.wordpress.com/2013/09/01/summary-stats-for-abc/
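The core idea of the simplest variant, rejection ABC, fits in a few lines: draw parameters from the prior, simulate data, and keep the draws whose summary statistic lands close to the observed one. Here is a toy sketch of my own (not Wilkinson’s code), using a normal location model with the sample mean as the summary statistic:

```python
import random

def abc_rejection(observed_summary, prior_sample, simulate, summarize, eps, n_draws):
    """Rejection ABC: keep prior draws whose simulated summary statistic
    lands within eps of the observed summary."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()
        data = simulate(theta)
        if abs(summarize(data) - observed_summary) < eps:
            accepted.append(theta)
    return accepted

random.seed(1)
# Toy model: 50 observations ~ Normal(theta, 1), uniform prior on [-5, 5].
posterior = abc_rejection(
    observed_summary=2.0,
    prior_sample=lambda: random.uniform(-5, 5),
    simulate=lambda theta: [random.gauss(theta, 1) for _ in range(50)],
    summarize=lambda data: sum(data) / len(data),
    eps=0.2,
    n_draws=20000,
)
print(len(posterior), sum(posterior) / len(posterior))  # accepted draws cluster near 2
```

The choice of `summarize` is exactly what the second linked post is about: unless the summary statistic is (near-)sufficient, the accepted draws approximate the posterior given the summary, not the posterior given the full data.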
Higgs from AIR describing NAO and EA
Stephanie Higgs from AIR Worldwide gives a nice description of the North Atlantic Oscillation (NAO) and the East Atlantic (EA) pattern in the context of discussing “The Geographic Impact of Climate Signals on European Winter Storms”.
While it is described as “The mathematical (and other) thoughts of a (now retired) math teacher”, this is false humility: it chronicles the present and past lives and times of mathematicians in their context. Recommended.
I have used dlm almost exclusively, except when extreme efficiency was required. Since Jouni Helske’s KFAS was rewritten, though, I am increasingly drawn to it, because the noise sources it supports are more diverse than dlm’s. KFAS uses the notation and approaches of Durbin, Koopman, and Harvey.
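The workhorse underneath both packages is the Kalman filter applied to a state-space model. As a language-neutral illustration (a Python sketch of my own, not dlm or KFAS code), here is the filter for the simplest such model, the local level model in Durbin and Koopman’s notation:

```python
def local_level_filter(y, sigma_eps2, sigma_eta2, a0=0.0, p0=1e7):
    """Kalman filter for the local level model
        y_t  = mu_t + eps_t,        eps_t ~ N(0, sigma_eps2)
        mu_t = mu_{t-1} + eta_t,    eta_t ~ N(0, sigma_eta2)
    with a diffuse initial variance p0. Returns the filtered state means."""
    a, p = a0, p0
    filtered = []
    for obs in y:
        # prediction step: the state follows a random walk
        p = p + sigma_eta2
        # update step: fold in the new observation
        f = p + sigma_eps2      # one-step prediction variance of y_t
        k = p / f               # Kalman gain
        a = a + k * (obs - a)
        p = p * (1 - k)
        filtered.append(a)
    return filtered
```

dlm and KFAS generalize this to multivariate states, time-varying system matrices, and (in KFAS) non-Gaussian observation noise, which is the diversity of noise sources mentioned above.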
“The real problem is that programmers have spent far too much time worrying about efficiency in the wrong places and at the wrong times; premature optimization is the root of all evil (or at least most of it) in programming.” Professor Donald Knuth, 1974