Why scientific measurements need to be adjusted

There is an excellent piece in Ars Technica about why scientific measurements need to be adjusted, and the implications of this for climate data. It is written by Scott K Johnson and is called “Thorough, not thoroughly fabricated: The truth about global temperature data.”

I like his example of well levels very much.

Mr Johnson writes:

… In fact, removing these sorts of background influences is a common task in science. As an incredibly simple example, chemists subtract the mass of the dish when measuring out material. For a more complicated one, we can look at water levels in groundwater wells. Automatic measurements are frequently collected using a pressure sensor suspended below the water level. Because the sensor feels changes in atmospheric pressure as well as water level, a second device near the top of the well just measures atmospheric pressure so daily weather changes can be subtracted out.

If you don’t make these sorts of adjustments, you’d simply be stuck using a record you know is wrong.
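The well-level correction Mr Johnson describes can be sketched in a few lines. This is a generic illustration, not code from his article: the sensor suspended in the well feels the water column plus the atmosphere, so subtracting the barometer reading isolates the water level. All readings and constants here are invented for illustration.

```python
# The submerged sensor reads water pressure plus atmospheric pressure;
# subtracting the barometer reading leaves only the water column.
# Numbers below are hypothetical.

RHO_G = 9.81  # kPa per metre of fresh water (approx. rho * g / 1000)

def water_level_m(downhole_kpa, barometric_kpa):
    """Convert a pressure pair to metres of water above the sensor."""
    return (downhole_kpa - barometric_kpa) / RHO_G

# Hourly readings in kilopascals (made up):
downhole   = [131.2, 131.7, 130.8, 131.4]
barometric = [101.3, 101.6, 101.0, 101.2]

levels = [round(water_level_m(d, b), 3) for d, b in zip(downhole, barometric)]
print(levels)  # the corrected level is nearly steady
```

Note that the raw downhole readings swing with the weather; only after the subtraction does the nearly constant water level emerge. Skipping the adjustment would mean trusting a record you know is wrong.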

This is the kind of thing that’s learned in Physics and Chemistry classes in high school these days. (Well, at least AP Physics and Chemistry, not to mention Statistics.)

Mr Johnson provides a nice sketch of the several datasets used to estimate Earth surface temperature. A similar story attends sea-surface temperatures, which have their own dramas, also described here: from water inadvertently heated by ships’ engines, which Mr Johnson mentions, to thermal biases and microcode errors in measurement instruments.

These are experimental adjustments, made for good reason. There are also statistical adjustments which can improve representations of datasets, like smoothing, which I have written about earlier.
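As a toy illustration of the statistical kind of adjustment (not anything from my earlier posts specifically), a centered moving average damps short-term noise so an underlying trend is easier to see. The series and window length below are invented.

```python
# Statistical smoothing sketch: a centered moving average.
# Endpoints without a full window are simply skipped.

def moving_average(series, window=3):
    """Return the centered moving average of a list of numbers."""
    half = window // 2
    return [sum(series[i - half:i + half + 1]) / window
            for i in range(half, len(series) - half)]

noisy = [10.0, 12.0, 11.0, 15.0, 13.0, 17.0, 16.0]
print(moving_average(noisy))
```

Unlike the experimental adjustments above, smoothing does not correct a known instrumental bias; it is a choice of representation, and should always be reported as such.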

But the point is, many people, encouraged by a sound-bite-oriented media, don’t know about or understand these complications, and so it is easy for people like Representative Lamar Smith to prey on their ignorance. Is it his fault? Partly. But it’s also the fault of a public which embraces representative democracy but doesn’t “want to go to school and learn their lessons” well enough to be able to fulfill their responsibility.

About ecoquant

See https://wordpress.com/view/667-per-cm.net/ Retired data scientist and statistician. Now working on projects in quantitative ecology, specifically the phenology of Bryophyta and technical methods for their study.