Everywhere I turn, it seems, there is yet another poll showing President Obama up by 8 points here or Romney up by 3 points there, even when both polls were taken in the same state during the same period. The national polls have mostly shown a dead heat. The consensus seems to be that one should ignore the national polls (why take them, then?), since votes are tallied state by state and each state's electors cast their votes for whoever won that state's popular vote. That makes perfect sense. There are several problems with all of these polls, however. Some of them, such as oversampling one party over another, or conducting polls by cell phone and thereby cutting out a good percentage of elderly voters, who tend to be consistent voters, have been discussed elsewhere. But there is another problem you don't see discussed often: likely voters.
Pollsters use a methodology to determine who counts as a "likely voter," and it is, generally, just a few questions about the person's voting history, followed by a question about whether or not they'll vote this time around. The problem is that people can't be trusted to always do what they say they will. Perhaps they don't want to admit to a stranger that they're not planning on voting, or maybe it just slips their mind on election day; either way, the fact remains that polls consistently overestimate likely voters, because they take the respondent's word for it. As an example, The New York Times conducted a poll in 2004 that asked these qualifying questions about voting proclivity to determine its likely-voter pool. The results showed that 90% of respondents would definitely vote and 7% would probably vote, while 83% of respondents were registered to vote in the precinct where they currently lived. Doing the math, 90% of the 83% registered in their current precinct means about 75% of registered voters should have voted; including those who said they would probably vote raises that figure to over 80%. The actual results from 2004, after the election, show that only 61% of registered voters voted. Nor was 2004 an anomaly: turnout in 2000 was 55% of registered voters, and in 2008 it was 62%.
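The arithmetic above can be checked with a quick script; the input percentages are simply the figures cited from the 2004 New York Times poll, and the shortfall is the gap against the actual 61% turnout:

```python
# Figures cited from the 2004 New York Times poll.
registered_in_precinct = 0.83   # registered to vote in their current precinct
definitely_vote = 0.90          # said they would definitely vote
probably_vote = 0.07            # said they would probably vote

# Turnout among registered voters implied by the self-reports.
expected_definite = definitely_vote * registered_in_precinct
expected_with_probable = (definitely_vote + probably_vote) * registered_in_precinct

print(f"definite only:       {expected_definite:.1%}")       # about 75%
print(f"definite + probable: {expected_with_probable:.1%}")  # just over 80%

# Actual 2004 turnout among registered voters was about 61%,
# roughly 15-20 points below what respondents' answers implied.
actual_turnout = 0.61
print(f"shortfall:           {expected_definite - actual_turnout:.1%}")
```

The gap of roughly fifteen to twenty points between what respondents said and what they did is the overestimate in question.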
It is also important to realize that likely voters and projected voter turnout are two different statistics altogether. The latter is generally accurate: it uses historical patterns to project the future, much the way the Farmer's Almanac projects the weather for a given period. Many people conflate the two and assume that when pollsters narrow their results down to likely voters, they are providing the most accurate projection. They are not. Pollsters are often wrong, which is why, when one does come close to predicting an outcome, folks glom onto them afterward as "the most accurate pollster of 2008," as they did with Rasmussen.
The problems with polling have always been around. In October 1980, a week before election day, The Washington Post conducted two polls on the same day with opposite results: one showed Reagan ahead by 4 points, and the other showed Carter ahead by 3.
So why do we follow these inaccurate predictors so closely? It could be impatience. But the truth is, I don't think most of us really want to. It is the media who are truly fascinated with polling, most likely out of a desire to be the first to report results. Some claim, however, that it could also be an agenda-driven desire to suppress the vote by insinuating that the race is over before the election even happens. With certain outlets, namely MSNBC, I do think that is a strong possibility.
Regardless of the reason, the important takeaway is this: the only poll that really matters is the one taken on election day, when you go out and vote.