
Silent3

(15,204 posts)
Fri Mar 15, 2013, 11:09 AM Mar 2013

Study says type X people prefer Y. Study says group A experiences less B.

"Hey! That's WRONG! I'm type X, and I hate Y! I'm in group A, and B happens to me all the time!"

I dread these inevitable stupid reactions to reports of statistical studies.

Popular press reports have to take part of the blame, often oversimplifying and distorting the actual contents of the studies they're reporting on for the sake of punchier headlines, or simply because the reporters are as bad as many of their readers at processing statistical and probabilistic information.

No matter how accurate or conscientious the reporting, however, it seems a lot of people just can't deal with associations and correlations on anything but a black-and-white level.

I'll make up an imaginary example, hopefully avoiding the emotional triggers often associated with real examples: Suppose naturally purple hair were common, and a study came out saying that, on average, people with purple hair rated lower on math ability.

Yes, that study could be wrong for any number of reasons (poor study design, bad sampling, biased testing, inaccurate assumptions), but it's not WRONG!!11!! because your sister has purple hair and got an 800 on her SATs. Correlations can be both real and strong or real and weak, with few exceptions or plenty of exceptions.
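To make that concrete, here's a toy simulation in the spirit of the purple-hair example. Every number below is invented purely for illustration: two overlapping score distributions whose means differ, so the group-level correlation is real, yet individual counterexamples are abundant.

```python
import random
import statistics

random.seed(42)

# Invented scenario: purple-haired people average slightly lower on a
# math test, but the two distributions overlap heavily.
purple = [random.gauss(mu=95, sigma=15) for _ in range(10_000)]
other = [random.gauss(mu=100, sigma=15) for _ in range(10_000)]

print(f"purple-haired mean: {statistics.mean(purple):.1f}")
print(f"other mean:         {statistics.mean(other):.1f}")

# The correlation is "real" (the group means differ), yet exceptions
# abound: a large share of purple-haired scorers beat the other
# group's average, so one high-scoring sister refutes nothing.
above = sum(s > statistics.mean(other) for s in purple) / len(purple)
print(f"purple-haired scorers above the other group's mean: {above:.0%}")
```

With these made-up parameters, the group gap is real but roughly a third of the "lower" group still outscores the other group's average, which is exactly the "real and weak, with plenty of exceptions" case.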

The study could be wrong, but it's not wrong because the results upset your idea of fairness. It's not wrong just because someone who is prejudiced against the purple-haired might use the study to justify their prejudice. It's not wrong because a stupidly oversimplified caricature of the real study (all people with purple hair are bad at math) is wrong.


Wounded Bear

(58,647 posts)
10. There are two kinds of people in the world
Fri Mar 15, 2013, 04:01 PM
Mar 2013

People who divide people into two groups, and people who don't.

Silent3

(15,204 posts)
4. What constitutes a "stupid study"?
Fri Mar 15, 2013, 03:22 PM
Mar 2013

Are you making a distinction between stupid reporting of some studies, and the studies themselves? Do you think some questions are unworthy of being asked, or are wrong to ask?

In any event, I don't buy the idea that the stupidity, or lack thereof, of a study has much bearing on the stupidity of the responses. If a study strikes some people as humorous, certainly that will generate more smart-ass comments, but that's not quite the same as grossly misunderstanding statistical and probabilistic distributions; nor is it the same as being unable to grasp the distinction between tendencies and loose associations on the one hand, and hard-and-fast, black-and-white rules on the other.

sibelian

(7,804 posts)
6. Well, I don't think it's a case of a list of stupidity parameters...
Fri Mar 15, 2013, 03:28 PM
Mar 2013

but, you know, the study you're clearly talking about.... I mean, whaaaaaa? WHO thought that up? It's not exactly an inspiring, life-enriching use of the scientific method, is it? I think you're right about the rest of what you say, though.

Silent3

(15,204 posts)
8. I might have noticed one particular study today...
Fri Mar 15, 2013, 03:40 PM
Mar 2013

...but the kinds of reactions I'm talking about apply to many different studies when mentioned on various online discussion boards.

Imagine a study shows that, say, at the end of a twenty year longitudinal study, regular consumption of quantity X of sugary soda and candy led to a 12% increase in diagnosis of Type 2 diabetes compared to a control group.

Stupid headline for the story: Sugar Causes Diabetes

Stupid reader reaction #1: I told you, sugar is POISON!!!!!

Stupid reader reaction #2: That's bullshit! My aunt drank four liters of Pepsi and ate five Snickers bars every day, never got diabetes, and lived to 97 when she got hit by a bus!
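A quick simulation shows why reaction #2 misses the point. The rates below are invented (a 10% baseline diagnosis rate and the hypothetical 12% relative increase from the made-up study): even in the higher-risk group, the overwhelming majority are never diagnosed, so a long-lived, Pepsi-drinking aunt is expected, not a refutation.

```python
import random

random.seed(0)

# Invented numbers for illustration only.
baseline_rate = 0.10            # control-group diagnosis rate
sugar_rate = baseline_rate * 1.12  # 12% relative increase -> 11.2%

n = 100_000
control = sum(random.random() < baseline_rate for _ in range(n))
sugar = sum(random.random() < sugar_rate for _ in range(n))

print(f"control diagnoses: {control} ({control / n:.1%})")
print(f"sugar diagnoses:   {sugar} ({sugar / n:.1%})")
# Roughly 89% of the sugar group is never diagnosed, so individual
# counterexamples are guaranteed even though the elevated risk is real.
```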

sibelian

(7,804 posts)
9. Yes, you're right, it's infuriating.
Fri Mar 15, 2013, 03:54 PM
Mar 2013

It's that sort of nonsense that "contradicts" statistical information regarding global warming. "It snowed heer. Globl warming is HOAX. That Logic." It's like everybody suddenly acquired the IQ of a herd of lolcats.

bemildred

(90,061 posts)
7. "Study says type X people prefer Y"
Fri Mar 15, 2013, 03:34 PM
Mar 2013

Questions, like hypotheses, are a dime-a-dozen, but finding good, meaningful, illuminating questions is work, sometimes a lot of work. And then you still have to figure out how to collect evidence that supports them.

On the other hand, if you just formulate some catchy question or hypothesis, go out and question some people about it, sort their answers into your little bins, and compute some numbers, then you have nothing at all, except perhaps an exciting segment on some TV show.

That's what I think.

Something along the lines of the critique labelled "Poll Dance" in the letters section linked here:


Poll Dance

I suggest reconsideration of your “PBK Presidents Poll,” which appears neither to be a poll nor to have been conducted uniformly among college presidents.

We’re advised only that this project collected responses from 70 individuals in an attempt to “survey leaders of colleges and universities on issues facing higher education.” There’s no description of sampling methodology, sample composition, sample weighting, questionnaire design, data validation or any computation of sampling error or statistical reliability.

What constitutes a poll is worthy of discussion. In the context of news reporting, which appears to be your aim, a poll is a study of attitudes or behavior among a randomly selected group of individuals whose characteristics are reliably representative of the broader population from which the sample was drawn. A probability-based sample, as required by this definition, is essential for the application of inferential statistics, the principle being that inferences about a full set can be made by examination of a randomly selected subset.

Many other niceties are involved—best practices in questionnaire design and accuracy in data analysis, for instance—and particulars can be debated. But the day starts with sampling. Without it we have a compilation of anecdote—not reliably quantifiable in a representative sense, and thus not a poll, presidents’ or otherwise. The stylebook of The New York Times, for one, says that the words “poll” and “survey” are to be “limited to scientific soundings of public opinion.”

What we have, rather than a poll, may be an attempted census. To be successful, this would have to include as close as possible the presidents of all the 280 academic institutions with PBK chapters. Without awareness of, and if needed correction for, differential nonresponse, a 25 percent completion rate in a census, sorry to say, doesn’t cut it.


http://theamericanscholar.org/responses-to-our-winter-2013-issue/
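The letter's contrast between a probability sample and a voluntary response with differential nonresponse can be sketched with a toy simulation. All numbers here are invented: a population of 280 presidents, 40% of whom hold some view, where holders of that view are assumed to be more willing to respond.

```python
import random
import statistics

random.seed(1)

# Toy population: 280 presidents, 40% hold view "yes" (coded 1).
population = [1] * 112 + [0] * 168

# A probability sample: every president equally likely to be drawn,
# so inferential statistics apply and the estimate centers on 40%.
prob_sample = random.sample(population, 70)
print(f"random-sample estimate:  {statistics.mean(prob_sample):.0%}")

# A voluntary questionnaire with differential nonresponse: "yes"
# presidents respond at 40%, "no" presidents at 15%, yielding roughly
# the letter's 25% completion rate -- and a badly skewed estimate.
respondents = [p for p in population
               if random.random() < (0.40 if p else 0.15)]
print(f"responses: {len(respondents)} of 280 "
      f"({len(respondents) / 280:.0%} completion)")
print(f"self-selected estimate:  {statistics.mean(respondents):.0%}")
```

Under these assumed response rates, the self-selected "census" overstates support by twenty-odd points while the small random sample stays near the truth, which is the letter's point: the day starts with sampling.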