“Bayes’ theorem in the 21st century”


Professor Bradley Efron wrote a piece, “Bayes’ theorem in the 21st century,” in Science on 7 June 2013, which, as always, offers his measured approach to the frequentist–Bayesian controversy (see B. Efron, “A 250-year argument: Belief, behavior, and the bootstrap”). It drew thought-provoking comments, including a published Letter by Professor Robert Van Hulst, to which Professor Efron responded.

My experience as a (mostly) Bayesian practitioner is that what appeals about Bayes–Laplace is that it holds out the hope that a hodgepodge of statistical techniques might be unified under a common umbrella, so that the answer one obtains after analysis is independent of the set of statisticians used to perform it. I say that tongue-in-cheek, both because qualified frequentist statisticians will reproduce each other’s conclusions even if their methods are not identical, and because simply knowing Bayes’ theorem and the manipulation of conditional probabilities is insufficient for dealing with actual problems, where issues of measurement, censoring, dependency, and subtleties in the goals of inference need to be addressed and balanced.

Still, it is attractive to think Bayes–Laplace offers a scalpel for statistical problems where either a big adze is all we’re often handed, or demands on our cleverness outstrip the time and cost the client has to answer their needs. I don’t know if there is one answer to the triad of demands upon statistics: estimation, prediction, and model-checking. (This is a formulation due to Professor John Kruschke, in his wonderful textbook Doing Bayesian Data Analysis.) I do use existing, exogenous data to help pose priors, and so may be a Bayesian sinner. But, above all, I see Bayes–Laplace as giving numerical engineers a rich and powerful set of loss functions, namely, minus logs of posterior probabilities, which they can then use via a variety of optimization and search techniques to find answers to problems, many of which are not in closed form.
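To make that last point concrete, here is a minimal sketch, not from the original post, of treating the minus-log-posterior as a loss function and handing it to a generic numerical optimizer. The model and all numbers are illustrative assumptions: a Gaussian likelihood with known scale and a Gaussian prior on the mean, for which the answer happens to be available in closed form as a check.

```python
# Sketch: MAP estimation by minimizing a minus-log-posterior loss.
# Model (illustrative): y_i ~ N(mu, sigma^2), prior mu ~ N(0, tau^2).
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(42)
sigma, tau = 1.0, 2.0                  # known likelihood sd, prior sd
y = rng.normal(1.5, sigma, size=50)    # simulated observations

def neg_log_posterior(mu):
    # minus log likelihood plus minus log prior, constants dropped
    return 0.5 * np.sum((y - mu) ** 2) / sigma**2 + 0.5 * mu**2 / tau**2

# Generic numerical search over the loss surface
mu_map = minimize_scalar(neg_log_posterior).x

# Conjugate closed form for this toy model, as a sanity check
mu_exact = (np.sum(y) / sigma**2) / (len(y) / sigma**2 + 1.0 / tau**2)
print(mu_map, mu_exact)
```

The point of the sketch is that nothing in the optimization step depends on the closed form existing: swap in a messier likelihood or prior and the same search machinery still applies.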

That link, I believe, is invaluable. If adopting a Bayesian approach lets one exploit it validly, there is a lot to be said for adopting it.

