Jackpine Radical
(1000+ posts)
Mon Apr-24-06 03:43 PM
Original message
Here's what bothers me about polls: There's no real way to validate them, other than by elections, which can now be readily Diebolded, and by other polls, which can either be bought off and corrupted, or just ignored by the Corporate Media.
Edited on Mon Apr-24-06 03:44 PM by Jackpine Radical
With no balance whatever in the media, what would keep Gallup or whoever from just lying? Why bother to collect data, for that matter?
Here, for example:
The latest Jackpine/Warmonger poll found 87% of likely voters support Bush, the Iraq War, and higher gas prices. This poll surveyed 1103 Americans who identified themselves as likely to vote in 2006, and has a margin of error of +/- 3%.
There, see how easy it is? I'm expecting an appropriately large payment from the RNC via PayPal for all my hard work.
Jigarotta
(1000+ posts)
Mon Apr-24-06 03:49 PM
Response to Original message
1. heh, nice. and true. nt
Nicholas D Wolfwood
(1000+ posts)
Mon Apr-24-06 03:49 PM
Response to Original message
Many firms do polling independently, and many are paid internally by other organizations. Sure, you can pay anyone to run a poll that WILL find the results you're looking for. When it comes to credibility on common, national polling issues, though, the validation comes from the fact that other pollsters are coming up with comparable numbers. If one guy says Bush is running at 33%, that doesn't make it true, but if ten pollsters come within a few points of that number, it probably is.
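The "ten pollsters within a few points" check above can be sketched as a simple outlier test: compare each pollster's number against the median of everyone else's. The names, figures, and the 4-point tolerance below are all hypothetical, not real poll data.

```python
from statistics import median

def flag_outliers(polls, tolerance=4.0):
    """Flag pollsters whose number sits more than `tolerance`
    points from the median of all the other pollsters."""
    flagged = {}
    for name, value in polls.items():
        others = [v for n, v in polls.items() if n != name]
        flagged[name] = abs(value - median(others)) > tolerance
    return flagged

# Hypothetical approval figures: nine pollsters cluster near 33%,
# one claims 80%.
polls = {
    "A": 33, "B": 34, "C": 32, "D": 35, "E": 31,
    "F": 33, "G": 36, "H": 32, "I": 34, "J": 80,
}
print(flag_outliers(polls))  # only "J" is flagged
```

The median (rather than the mean) is used so a single wild number can't drag the reference point toward itself.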
raccoon
(1000+ posts)
Mon Apr-24-06 03:55 PM
Response to Original message
3. Another thing about polls: how a question is asked has a great deal to do with how people answer it.
I wish when the media discussed polls they'd say how the question(s) were phrased.
Yeah, in my dreams.
warrens
(1000+ posts)
Mon Apr-24-06 03:55 PM
Response to Original message
4. I'm expecting an appropriately large payment from the RNC
That only works if you are high up over at FRetard City, where the Paypal payments from RNC never stop as long as RimJob keeps lovin' him some Bush.
racaulk
(1000+ posts)
Mon Apr-24-06 03:55 PM
Response to Original message
5. It's simple statistics, really.
Edited on Mon Apr-24-06 03:56 PM by racaulk
No sample can be guaranteed to be truly representative of a large population, especially one as large as the "likely" voters in the United States. That's why there is a margin of error, which is essentially a disclaimer the pollster makes to say that the poll is not perfectly precise. And it is within that margin of error that elections can be made fraudulent and stolen.
As far as pollsters just lying, I think it would take some sort of widespread collusion among all independent pollsters to successfully pull that off. Though that is certainly not impossible (no dirty trick the GOP pulls surprises me anymore), I don't think it would be very likely. For example, if Faux News says *'s approval ratings are at 80%, but every other independent pollster has his approval around 33%, then the Faux News result is an outlier in the statistical data and their poll is obviously flawed. But that won't stop Faux News from embellishing the truth a little, for example putting his approval around 37%, and then citing margin of error so that the poll appears "independent."
I'm not disagreeing with anything you said, just putting my two cents in, for what it's worth. Believe me, I totally share your frustration. :mad:
Gormy Cuss
(1000+ posts)
Mon Apr-24-06 03:58 PM
Response to Original message
6. That's why at least two tests should be applied before judging a poll.
First, was it done by a reputable polling organization using acceptable sampling and polling practices, and is a full methodological report available?
Second, are the results dramatically different from those obtained at the same time by other reputable polling organizations using a similar methodology?
If it doesn't pass either test, it's a stinker. I don't care who conducted it.
I know of many other ways to assess the worthiness of a poll but applying these two alone separates most of the wheat from the chaff.
Internet newspaper polls and others where responders are self-selected are rarely a reflection of anything other than who has an ax to grind on that issue that day.
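The two screens above can be sketched as a simple filter. Everything here is illustrative: the function name, the 5-point tolerance, and the numbers are my own guesses, not an established rule. Note the 37%-versus-33% case mentioned earlier in the thread actually slips through a tolerance this loose, which is exactly the embellishment worry.

```python
def passes_basic_tests(has_full_methodology, result, peer_results, tolerance=5.0):
    """Two quick screens for a poll result:
    (1) a full methodological report exists;
    (2) the result isn't dramatically different from contemporaneous
        polls by reputable organizations (here: within `tolerance`
        points of the peer average)."""
    if not has_full_methodology:
        return False
    peer_avg = sum(peer_results) / len(peer_results)
    return abs(result - peer_avg) <= tolerance

# Hypothetical: a 37% result against peers clustered near 33%.
print(passes_basic_tests(True, 37, [33, 32, 34, 33]))  # prints True
```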
Jackpine Radical
(1000+ posts)
Tue Apr-25-06 03:22 PM
Response to Reply #6
7. Well, formally in reply to Gormy, but actually in reply to everybody,
the notion that a given poll result should be fairly close to other such polls is a matter of concurrent validity. It only works if the same contaminating factors don't affect the other polls. If the biasing factor is deliberate corruption, and if it affects all the polls similarly, then the whole process is built on quicksand. All you need to do is buy off some of the polls & instruct the media to ignore anybody who wouldn't be bought, and you end up with the result you want, embedded in an illusion of credibility.
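The concurrent-validity failure described above is easy to demonstrate with a toy simulation: if every poll carries the same bias, the polls agree closely with each other (small spread, so they "validate" one another) while every one of them is far from the truth. All numbers here are invented for illustration.

```python
import random

random.seed(1)
true_support = 0.33   # the real number nobody observes
shared_bias  = 0.08   # hypothetical corruption applied to EVERY poll

# Ten "independent" polls, each with the same bias plus a little
# ordinary sampling noise.
polls = [true_support + shared_bias + random.gauss(0, 0.01) for _ in range(10)]

spread = max(polls) - min(polls)
avg_error = sum(abs(p - true_support) for p in polls) / len(polls)

print(f"spread across polls: {spread:.3f}")    # small: they agree
print(f"average error:       {avg_error:.3f}") # large: they're all wrong
```

The cross-poll consistency check only detects bias that varies between polls; a bias common to all of them is invisible to it.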
Gormy Cuss
(1000+ posts)
Tue Apr-25-06 04:27 PM
Response to Reply #7
8. You are right that there are other ways to cook polling results
Edited on Tue Apr-25-06 04:32 PM by Gormy Cuss
and I would say that the easiest and most frequently used method is to hype cheap, intentionally biased web polls with enormous self-selection bias rather than bothering to pay for a controlled survey that would get a true bead on public perceptions. I would suggest that a concerted effort to bias all polls with the same contamination is too much trouble and expense when you can accomplish the same goal with sloppy insta-polls and loaded questions.
I proposed two tests that would identify most of the stinkers (and a correction: I see that I posted them as either/or, when I meant the second as a support to the first). Reluctance to share methodological details is one of the quickest ways to identify cooked results, and when that reluctance is combined with results that are contrary to other contemporaneous polls, that's enough evidence for me to disregard the 'results.' I can and occasionally do attach other tests of validity to poll/survey results, but in order to do a rigorous analysis I would need to put my head together with some former colleagues in this business.
As for buying off pollsters and suppressing contrary poll results, the latter happens frequently in private market research and of late has happened in some government-sponsored research. The former I know nothing about.