Tuesday, October 09, 2007

In Defense of Stupid Studies

Last weekend I was hanging out with friends when the topic got around to the government wasting money on stupid studies. Their gripe was about a study showing that Post-Traumatic Stress Disorder is more common among troops who had done multiple tours in Iraq than among those who had done fewer.

While I agree that the government should not be paying for these studies, I tried to argue that the studies themselves aren’t wastes. Even ones with “well, duh!” conclusions, like proving that people watch TV to alleviate boredom, are not a waste of time or money.

Over the past week I’ve come up with two arguments against what is fundamentally a failure to appreciate different qualities of knowledge.

First: What you might call obvious, I might call an assumption. It comes down to different requirements for what it takes to “know” something. My first retort to my friends’ questioning of the study was that conventional wisdom gets slammed quite frequently when someone rigorously examines the evidence. Unfortunately, I couldn’t come up with an answer to the retort “When have you ever heard of such a case?” At best I could recall nothing better than how everyone “knew” that the sun revolved around a flat earth. It took a couple of days to remember reading about experiments showing that being physically chilled does not make one more likely to catch a cold, or that eating less than half an hour before swimming does not increase the rate of getting cramps. Imagine if either of these experiments had confirmed the conventional wisdom: then the old hindsight bias kicks in and the money spent on it is declared “wasted.”

Second: A stupid study tests not only the obvious conclusion but also the concepts that go into the creation of the obvious conclusion. To butcher an old expression: scientific discovery is heralded not by “Eureka!” but by “What the (bleep), that doesn’t make any sense.” The result of a study is said to make sense if it fits with the previous conclusions that make up the dominant theory. If the data make sense, then the paradigm is just hunky-dory. If not, one of those pesky paradigm shifts is in the offing, or at least will be once the parameters of the experiment have been verified down to the last measure. The tricky part is that one never knows which conclusion will be the thread that unravels the whole conceptual sweater.

So the next time you hear about a study that leaves you saying, “I could have told you that,” think instead that you have been given proof that you were right all along. That, and future generations won't think you quaint for believing something just because it was superficially obvious.
