"Science suffers from an excess of significance" This article...: "'Science suffers from an excess of significance'
This article suffers from an excess of quotable lines:
...most published research findings are wrong.

My take on the field of statistics is that it's an arithmetic discipline for making people feel comfortable. Just think about the term '95% confidence': it's a bait-and-switch trick, which works by giving the result of a convoluted and artificial computation the same name as a vaguely understood emotional or intellectual state favorable for decision-making.
'A new claim about a research finding is more likely to be false than true.'
...Dr. Ioannidis and his colleagues analyzed 432 published research claims concerning gender and genes. Upon closer scrutiny, almost none of them held up. Only one was replicated.
'People are messing around with the data to find anything that seems significant....'
'The correction isn't the ultimate truth either.'
Which isn't to say it's all smoke and mirrors. As a statistician I know likes to say, Las Vegas is filled with monuments to the Central Limit Theorem, and all horse players die broke.
But what Dr. Ioannidis is pointing out in the article is that, too often, scientists conduct experiments in statistical alchemy: they want to turn a set of data into a statistically significant conclusion. They do that by loading the data into a statistical analysis software tool, then monkeying around until a number less than 0.05 (or, if they're desperate, 0.1) comes out of the function they've been told computes significance.
And guess what? Given enough time and experience with the analysis software, it's usually possible, one way or another, to get a score under 0.05.
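To make that concrete, here's a quick sketch of my own (not anything from the article, and the variable names and use of scipy are just illustrative choices): feed pure noise to a standard t-test twenty different ways, and chance alone will usually hand you at least one "significant" result.

```python
# Illustrative sketch: how shotgun analysis turns noise into p < 0.05.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

n_outcomes, n_subjects = 20, 30
group_a = rng.normal(size=(n_outcomes, n_subjects))  # pure noise, no real effect
group_b = rng.normal(size=(n_outcomes, n_subjects))

# One two-sample t-test per "outcome" -- keep testing until something clears 0.05.
p_values = [stats.ttest_ind(a, b).pvalue for a, b in zip(group_a, group_b)]

hits = sum(p < 0.05 for p in p_values)
print(f"'significant' findings in pure noise: {hits} of {n_outcomes}")
# With 20 independent tests at the 0.05 level, the chance of at least one
# false positive is 1 - 0.95**20, roughly 64%.
```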
What does the score actually mean? Well, it means something, probably, but to know just what you have to carefully work through every step taken to obtain it. And statistics is known for counter- and contra-intuitive reasoning, so if you're not an expert in statistics (NOTE: 6 semester hours of undergrad statistics doesn't make you an expert, nor do unlimited hours using statistical analysis software), you shouldn't be too confident your confidence interval tells you what you tell yourself it tells you.
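For what it's worth, here's a small simulation (again my own illustration, with arbitrary parameter values, not anything from the article) of what that '95% confidence' actually refers to: a long-run property of the interval-building procedure, not a 95% chance that the one interval on your screen contains the truth.

```python
# Illustrative sketch: "95%" is the long-run coverage rate of the procedure.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
true_mean, sigma, n, trials = 10.0, 2.0, 25, 10_000

crit = stats.t.ppf(0.975, df=n - 1)   # two-sided 95% critical value
covered = 0
for _ in range(trials):
    sample = rng.normal(true_mean, sigma, n)
    half_width = crit * sample.std(ddof=1) / np.sqrt(n)
    covered += (sample.mean() - half_width <= true_mean <= sample.mean() + half_width)

print(f"intervals that covered the true mean: {covered / trials:.1%}")
# The 95% describes this repetition rate, not any single interval.
```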
But we're all taught the Scientific Method, which in its simplest form is:
- Make a hypothesis.
- Test the hypothesis.
- Reject the hypothesis if it fails the test.
I'm not knocking science, or even statistics (which, considered as an applied discipline, is full of deucedly clever (and undeniably useful) stuff). But we have to understand things as they are, not as they are idealized to be, and that includes understanding that the majority of scientists are lousy statisticians.
(Link via Eve Tushnet.)"
(Via Disputations.)