Wednesday, September 24, 2008

THE OPINION MAKERS: AN INSIDER EXPOSES THE TRUTH BEHIND THE POLLS by David W. Moore

Part 2

Chapter 1 “Iraq and the Polls-- The Myth of War Support”

Reviewed by Thomas Riggins

In Part 1, which covered the preface to this book, we reviewed the claim that the polls distort and misrepresent public opinion. Here is a case in point: how the polls manufactured pro-Iraq War sentiment to bolster the claims of the Bush administration.

In the run-up to the war in 2003, the press was reporting a pro-war mood in the country. Pollsters typically posed their questions with only two choices, forcing the interviewee to pick an answer. Only occasionally, when a polling company really wanted to find out what was going on, was a control question asked.

The CNN/USA Today/Gallup poll did this in February 2003 [the war began about a month later] and found that about 30% supported the war, 30% were against it, and 40% didn't care one way or the other. That is hardly a pro-war mood when 70% either don't care or oppose it. Moore points out that this neutral factor was not measured by the other polls and was practically ignored by the CNN/USA Today/Gallup poll itself. Ignoring the control question “reveals much,” Moore adds, “about the way that media polls manufacture public opinion for their own purposes.”

The problem, Moore says, is that pollsters want an answer to their questions even if the people they are asking don't know or care about the issue. Moore says we should distinguish between DIRECTIVE opinions (the person really wants the opinion carried out) and PERMISSIVE opinions (the person doesn't really care what happens).

Moore gives the example of the poll mentioned above. Are you for the war? 59%. Against the war? 38%. No opinion? 3%. That is the standard "forced answer" poll. It looks like the people want war! This would have been the reported result had Gallup not asked a control, or follow-up, question.

People were asked whether they would be upset if their opinions were not carried out-- i.e., if the government did the opposite of what they thought. If you take the strong, or directive, opinions on the war (yes vs. no) along with the permissive, don't-care group (plus the no-opinion group), you get the following: for the war, 29%; against the war, 30%; no opinion, unsure, or don't really care, 41%.

So reporting that 59% favored the war would not have been a true statement of how the public really felt. Most polls (including Gallup's) don't usually use a follow-up question, so most polls are deceptive. The truth was that about 71% of the people didn't want war, were unsure, or didn't care one way or the other.
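To make the arithmetic behind this regrouping concrete, here is a minimal sketch in Python (the variable names are my own; the figures are simply the ones Moore reports). It folds everyone who is not "directive" into a single permissive/unsure bucket.

# A rough sketch of Moore's regrouping, using the Gallup figures above.
# "Directive" respondents said they would be upset if the government did
# the opposite of their stated view; everyone else is treated as permissive.

forced_choice = {"for_war": 59, "against_war": 38, "no_opinion": 3}  # percent
directive = {"for_war": 29, "against_war": 30}  # percent, from the follow-up question

permissive_or_unsure = forced_choice["no_opinion"] + sum(
    forced_choice[k] - directive[k] for k in directive
)

print("Directive for the war:    ", directive["for_war"], "%")      # 29
print("Directive against the war:", directive["against_war"], "%")  # 30
print("Permissive or unsure:     ", permissive_or_unsure, "%")      # 41

The 41% is just the 3% with no opinion plus the 30 and 8 percentage points of permissive respondents hidden inside the forced-choice totals.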

Moore also makes a distinction between "top-of-mind" responses and reasoned ones: that is, between an opinion based on something a person has merely heard or been told about (say, by a pollster) but really doesn't know much about, and one arrived at after thinking and reading about the issue. It is the difference between a knee-jerk response and a well-thought-out one.

To get "newsworthy" polls for their clients (the big media), most pollsters lump these two groups together-- even though the answer of the "top-of-mind" respondent may have been elicited by the form or wording of the question itself.

Here is another example of a misleading poll. It was once claimed that most Americans supported what the government was doing at Guantanamo. In 2007 Gallup pollsters did a standard poll asking whether Gitmo should be closed or not. They got this answer: yes, close it-- 33%; no-- 53%; undecided-- 13%. But when a control question was asked (as in the Iraq war poll above), i.e., would you mind if the government did just the opposite of what you think, the response changed to: yes, close it-- 19%; no-- 28%; undecided, don't care-- 52%. A big difference, as you can see!

Finally, remember the antimissile shield? In 2002 Bush took the US out of the 1972 Anti-Ballistic Missile Treaty and claimed he had the support of the American people. Forced-choice polls had been taken and seemed to back him up-- the majority of Americans were for the antimissile shield. Gallup did a forced-choice poll (only two answers allowed, though a person could volunteer an "I don't know"). Here is how the poll turned out: for the shield, 64%; against it, 30%; neutral, 6%. Then Gallup ran the same poll but with a control question that allowed people who didn't care or didn't know anything about the issue to opt out. This time the result was: for it, 29%; against, 13%; neutral, 59%.

The second poll gave a much truer picture of what Americans were thinking than did the first. Moore says opinions are easily manipulated and "that on all sorts of issues, the media polls continually and systematically distort public opinion, with severe consequences for us all." Just ask yourselves if control questions were used in any of the polls that came out saying how popular Palin is. If not, why not?

Coming up next week-- a review of Chapter Two-- "Manufacturing Public Opinion".
