Nearly seven months after President Obama won reelection by a margin of 4 percentage points, the Gallup Organization, the world's best-known polling firm, identified in a new report four main reasons why their 2012 surveys badly understated Obama's support.
The report, unveiled at a Tuesday morning event at the firm's headquarters in Washington, detailed the reasons why Gallup believes that its polls failed to predict Obama's victory. Gallup's final pre-election poll showed Mitt Romney leading Obama by a percentage point, 49 percent to 48 percent. But in the previous survey -- conducted immediately before Hurricane Sandy disrupted pollsters' plans in the week before the election -- Romney held a 5-point lead, 51 percent to 46 percent.
Gallup editor-in-chief Frank Newport framed Gallup's struggles in terms of public polling's overall inability to accurately predict the results. The final RealClearPolitics average of national polls showed Obama leading Romney by just seven-tenths of a percentage point, around 3 points less than his actual margin.
"Something was going on that affected the entire industry," Newport said. "That's what prompted our commitment here at Gallup."
But Gallup itself was part of the reason the national poll averages were inaccurate. Gallup's polls exhibited a consistent Republican bias in 2012; moreover, Gallup and some other firms, like the automated pollster Rasmussen Reports, are overrepresented in averages because they conduct daily tracking polls in the months before the election, unlike media pollsters that survey less frequently and didn't skew as heavily toward the Republican candidate.
There is little precedent for a review like Gallup's, and the implications for the firm's future -- and for survey research at large -- could be far-reaching.
"Political polling is the public face of survey research," said Michael Traugott, a University of Michigan professor who joined Newport and Gallup methodologists to lead the project. "And we know that confidence in the method and the image of the entire industry are related to how well the pre-election pollsters do."
Traugott later stressed that election forecasting is not the be-all, end-all of public polling. "The purpose of the industry is not to estimate the outcome of elections per se," he said. Polls also explain how voters feel about the candidates and the issues -- and how and why their opinions may change over time.
But elections provide a check on the accuracy of this data, and consumers of polls certainly have more confidence in research that proves to be accurate. This is particularly important for Gallup, whose historical trends make up a large share of what we know about how Americans have felt about their government and its role over time. Our picture of how opinions have changed on social and economic issues rests in part on the firm's past surveys.
The Gallup report outlined the various experiments the firm has conducted or will conduct later this year. While most of that research failed to identify factors "that caused Gallup to underestimate Obama's vote share," as the report puts it, it does identify four main areas warranting changes or further study.
"None of these factors are large in and of themselves," said Newport. "But they are significant enough that we think they made a significant difference in our overall assessment in who was going to win the presidential election last fall."
-- Likely-voter screen: Gallup's likely-voter screen, the battery of seven questions it uses to determine which respondents are most likely to cast ballots, "probably needs a total overhaul," Newport said Tuesday in a curtain-raising appearance on MSNBC's "Morning Joe."
The Gallup postmortem calls their likely-voter model "broadly similar to those of other survey organizations," though they say that their questions "are more heavily weighted toward past voting behavior than other firms' questions."
The report found that the question seemingly most responsible for tilting the poll too far toward Romney asked respondents how much thought they were giving to the election. Obama led by 3 percentage points among all voters, but the margin swung 4 points toward Romney among the voters Gallup identified as likely to cast ballots.
"Obviously if we had used no questions, Obama would have led by 3 points," and Gallup would have been more accurate, Newport added.
"We don't have a silver bullet" for explaining what part or parts of the likely-voter screen led to misrepresentation of the overall electorate, said Christopher Wlezien, a Temple University professor who consulted on the project.
Traugott said it's possible the Obama campaign's focus on battleground states requires a different approach to identifying likely voters in those states. "Polling firms don't organize the geography of the samples by focusing on battleground states versus non-battleground states," he said, but turnout "actually was up" in these states, despite national declines.
"One of the interesting things about this is whether or not this is a factor that is idiosyncratic to the 2012 campaign" and a testament to the Obama campaign's skill and efficiency in turning out their voters in battleground states, Traugott said.
Newport said that Gallup would begin experiments including questions about voters' contact with the campaigns as part of the likely-voter screen. And since it's too early to begin using a likely-voter screen for the 2014 midterm elections, Gallup will undertake experiments this fall to test tweaks, using this year's gubernatorial elections in Virginia and New Jersey as test cases. Their pre-election poll results will not be released publicly, however, until after those elections.
-- Residency of respondents: According to the report, Gallup conducted too many interviews with respondents in the Central and Mountain time zones, thus underrepresenting some areas of the Eastern and Pacific time zones, including areas of the Pacific coast in which Obama performed well.
"This ... we think was a factor" in "our too-Romney estimate," Newport said.
-- Race and ethnicity: In an issue that was the subject of a long story last year by the Huffington Post's Mark Blumenthal, Gallup acknowledged that the way it asked respondents about their race was causing it to weight its surveys to a population with more whites and fewer nonwhites. By asking a series of yes/no questions for each race and ethnicity category, Gallup ended up with "a disproportionate number of respondents reporting they were multiracial" or American Indian, the report states.
Gallup has already implemented changes, allowing respondents to choose from a list of races and ethnicities -- and to select up to 5 choices. Results are then weighted to four known, Census-based targets, instead of the two to which results were weighted in 2012.
"We think we've come much closer to the Census categories to which were weighting," Newport said. "Those changes have already been implemented in the first two months of this year."
Newport also said their final poll was a percentage point low for Hispanic respondents, though he said the firm conducted an appropriate number of Spanish-language interviews. "Most of the interviews we conducted in Spanish ended up not being included in our likely-voter sample," Newport said, because those respondents who requested to complete interviews in Spanish were less likely to vote.
-- Landline sampling frame: Gallup used a listed landline sample -- that is, a roster of landlines tied to actual residents -- resulting in an "older and more Republican" sample than what they might have compiled using a sample obtained by randomly dialing landline telephones. "These differences likely contributed to Gallup's less accurate vote estimate," their report states.
Gallup has transitioned back to a random-digit-dialing, "list-assisted" sampling frame, which it had used until 2011. "We have therefore made the corporate decision to change back" to random-digit dialing, Newport said.
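For readers unfamiliar with the term, "list-assisted" random-digit dialing is commonly described as generating numbers at random within 100-number banks known to contain at least one listed residential line, so that unlisted households can still be reached. The snippet below is a toy version of that idea with invented bank values; it is not meant to describe Gallup's production sampling system.

```python
# Toy list-assisted RDD sketch: each "bank" is the first eight digits of a
# phone number (area code + exchange + two digits), assumed to contain at
# least one listed residential line; the last two digits are drawn at random,
# so both listed and unlisted numbers in the bank can be selected.
import random

listed_banks = ["20255501", "20255502", "70355517"]  # invented bank values

def draw_rdd_number(banks, rng=random):
    """Pick a bank at random and append two random final digits."""
    return rng.choice(banks) + f"{rng.randrange(100):02d}"

random.seed(0)
print(draw_rdd_number(listed_banks))  # a 10-digit number within a listed bank
```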
One factor that Gallup says didn't lead their results to be inaccurate was the rise in cell-phone respondents. Gallup began conducting 40 percent of its interviews via cell phone in 2011, when it moved its landline sample away from random-digit dialing. The firm increased its cell-phone sampling to half of its interviews last fall, a greater share than other pollsters used.
Among the other factors the review cleared of causing inaccurate results is Gallup's use of a rolling sample, or "tracking design." That means Gallup's daily results don't really represent separate polls. To wit: Gallup's most recent presidential approval tracking poll covers interviews conducted over the prior three nights. The next day's result will report the next rolling sample -- meaning it will include roughly two-thirds of the interviews previously reported.
"We weight every night independently," Newport said. "One of the great advantages of our tracking program is that we can look at 30,000 interviews or 60,000 interviews" and look at various factors that could affect voters' opinions, Newport said.
Gallup also looked at whether calling respondents -- after they are selected randomly, either from a list or by a computerized random dialer -- only three times missed harder-to-reach voters, who may have been more Democratic. The firm conducted an experiment in which it called respondents five times instead of three. "The initial results," Newport said, "show it did not make a difference," but more analysis is forthcoming.
Other Gallup experiments included identifying the pollster verbally and on caller-ID displays with the generic name "Selection Research" instead of "Gallup," and varying the race and gender of interviewers. But the firm found that those factors did not significantly explain Gallup's inaccuracies last year.
Gallup's review is ongoing, and even if the changes it implements do address its 2012 problems, it's not clear they can offset the changes affecting the industry at large. Survey costs are rising as Americans move away from landline phones, and the people pollsters do reach are less likely to participate than they used to be. Newport said Gallup's response rate -- that is, the percentage of voters the firm attempts to contact who complete the survey -- was roughly 10 percent in 2012, a significant decline.
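The arithmetic behind that response-rate figure is simple -- completed interviews divided by attempted contacts -- as in this back-of-the-envelope sketch with invented counts:

```python
# Back-of-the-envelope response-rate sketch; the counts are invented purely to
# illustrate a rate around the 10 percent Newport cited.
attempted_contacts = 10_000
completed_interviews = 1_000

response_rate = 100.0 * completed_interviews / attempted_contacts
print(f"Response rate: {response_rate:.1f}%")  # Response rate: 10.0%
```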
Still, Newport is optimistic that the review will lead to better results for his firm, whose name Traugott said "is synonymous with political polling all around the world."
"When the next presidential election rolls around," Newport said, "we think we'll certainly be in a position at the accurate end of the spectrum."
Source: http://news.yahoo.com/gallup-explains-messed-2012-presidential-polling-175551792.html