A very big pet peeve of mine, having been a math major in college, is how readily people accept the results of polls, surveys and other opinion research at face value. That problem -- and it is a problem, a major one -- has prompted me to put down a few questions you should ask yourself any time you look at a poll. I hope folks find them helpful.
1. What is the population being sampled? The whole point of a poll is to model the opinions of a larger population. The first question you must ask is, "What is the population represented?" For example, is the poll supposed to reflect the views of all citizens, or just "likely voters"? The distribution of opinions will most likely vary between those groups, so knowing the population is important.
2. What was the methodology for conducting the poll? When modeling the views of a population, researchers must decide how they will select the people who will represent the population. How these people are selected and questioned is the poll's methodology.
Pollsters can very easily get desired results by using inaccurate methodologies. A survey of the health effects of smoking might be done by sampling only young smokers who have not yet developed emphysema and lung cancer, for example. Or they might conduct a candidate survey by sampling people who live only in small conservative communities and ignoring or underrepresenting people who live in urban, more liberal areas. "Man on the street" polls will get very different results when the same questions are asked at a NASCAR event, an anti-war rally, "social hour" after services at a Baptist church or outside Neiman-Marcus.
The way the poll is conducted is also very important to know. A survey conducted by telephone automatically excludes the homeless, people who do not have a telephone, people who do not have a published telephone number, people who use only a cell phone rather than a land line, people who are at work when the pollster calls, and people who are too busy to talk. As a result, telephone surveys overrepresent the wealthy, the middle classes and the elderly.
Any poll results which do not mention the methodology used should be taken with a very, very big grain of salt.
3. What is the sample size? A poll's sample size is the number of people who responded to the poll. The more people who are included in the results, the more accurately the results will reflect the opinions of the population. The "sufficient" sample size -- the number of people needed to give reasonable confidence in the results (see question 4) -- varies with the methodology used to conduct the poll. A targeted, accurate methodology can model the opinions of likely voters in the United States using a sample size of only about a thousand people. Other methodologies might require three or four thousand respondents to get the same level of confidence. And some methodologies, such as internet polls, are so inaccurate that the sufficient sample size approaches the size of the entire population.
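To see why "about a thousand people" is enough, here is a rough sketch of the standard sample-size calculation. This assumes simple random sampling, a large population, and the usual normal approximation; the function name is mine, for illustration only:

```python
import math

def sample_size_needed(margin_of_error, confidence_z=1.96, p=0.5):
    """Respondents needed to hit a given margin of error, assuming
    simple random sampling from a large population.
    confidence_z=1.96 corresponds to 95% confidence; p=0.5 is the
    worst case (maximum variance), used when no better guess exists."""
    return math.ceil((confidence_z ** 2) * p * (1 - p) / margin_of_error ** 2)

# A +/-3% margin at 95% confidence needs about 1,068 respondents
print(sample_size_needed(0.03))  # -> 1068
```

Note that the population size barely matters once the population is large: the same ~1,000 respondents suffice whether you are polling a state or the whole country -- provided the sample is genuinely random, which is exactly what a bad methodology fails to deliver.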
4. What is the margin of error / level of confidence? First off, let me say: the margin of error does not represent "wiggle room" in the results of the poll. Treating it that way is common, but it misrepresents the poll's results and is not supported by any of the math behind statistics.
Every poll or survey has a margin of error. It is calculated from the size of the population being modeled, the sample size and the methodology. It is one way of expressing the poll's level of confidence, i.e., how accurately the answers reflect the views of the population. A reasonably accurate poll will have a margin of error of 3% or 4%; that means there is high confidence (conventionally 95%) that the reported numbers fall within 3 or 4 points of the true values in the population. If no margin of error is reported, you should assume that the poll does not accurately reflect the population.
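The calculation itself is simple for a well-run random sample. A minimal sketch, again assuming simple random sampling and 95% confidence (the helper name is mine, not from any polling standard):

```python
import math

def margin_of_error(sample_size, p=0.5, confidence_z=1.96):
    """Margin of error (as a fraction) at 95% confidence for a simple
    random sample. Uses the worst-case proportion p=0.5 by default."""
    return confidence_z * math.sqrt(p * (1 - p) / sample_size)

# 1,000 respondents -> roughly a 3-point margin either way
print(round(margin_of_error(1000) * 100, 1))  # -> 3.1
```

Notice the square root: to cut the margin of error in half, you need four times as many respondents, which is why pollsters settle for ~1,000 rather than chasing tighter margins.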
5. What were the actual questions used to obtain the information? Even a poll using an accurate, unbiased methodology and a large sample size can easily be skewed towards a desired result by using questions designed to elicit certain responses. Even if they ask for essentially the same information, how a question is phrased can alter a person's answer. Take, for example:
"Should women be allowed to make their own medical decisions without government interference?" "Should women be prohibited from ending the lives of their fetuses?"
The bias is further complicated by the fact that a demagogue will turn around and report both questions as showing support for or opposition to abortion rights. If a poll does not include the exact questions asked, it should be treated with extreme suspicion.