Wednesday, January 16, 2008

Why all the polls are wrong

Opinion polls are political heroin. You know they're no good over the long term, but they provide a short-term hit for political junkies and despite their inherent, and significant, problems it's possible to function with them from day to day. And they cost a lot of money.

But the problem with opinion polls is that no individual poll is reliably accurate.

The 1992 General Election was seen as a watershed in political opinion polling in the UK, when the professionals got it (very) wrong and predicted a victory for Neil Kinnock's Labour Party even up to the exit polls carried out for that night's TV coverage. More recently, polls in New Hampshire which showed a clear advantage for Barack Obama left many commentators on both sides of the Atlantic with egg on their faces. In Scotland, there have been wildly different results for political opinion polls carried out at roughly the same time.

Significantly, the manner in which the UK polling organisations subsequently tightened up their act and tried to introduce a greater degree of 'science' into the way they interpret the raw data they collect is the very reason we can't read any great significance into a single opinion poll. Different organisations have different systems for weighting, for example, an individual's likelihood to vote (some only count respondents who rate their likelihood to vote as greater than 7 or 8 out of 10, while some don't ask this question as a matter of course), or a slightly tweaked demographic model (or none) which they use to plug the statistical gaps in their data collection. As a result, different organisations would present different headline 'results' from the exact same questions and answers. Polls carried out by YouGov for the SNP and the Telegraph in November 2006, in the run-up to last year's election, were conducted at the same time yet had different results, mainly because the questions asked were slightly different.

When you then factor in the relative merits of phone polling (generally less accurate) versus internet polling (YouGov produced the most accurate polls for the last General Election, but was out in last year's Scottish Election), and the often different questions posed to try to examine the same issue, it's even more apparent that polls can't all be 'right'. That's why they're all wrong.

So, as an aide-mémoire, here's a short checklist on what to look for if you want a poll that's less inaccurate than others:

1. Any poll by an organisation which isn't in the British Polling Council should be immediately discounted. Member organisations are required, if asked, to publish the full data behind any questions released to the media, so it can be held up to independent scrutiny. Scottish Opinion is one company which does not do this in Scotland, and so it's not possible to compare their methods against other operators. Scottish Opinion polls of voting intention in Scotland over the last year have been the most volatile of all the polling organisations, appearing to show wild swings in opinion when other polls showed steady trends.

2. A poll with a small sample is not reliable. You need a big sample (upwards of 1,000 respondents is the usual benchmark for a national poll) to keep the margin of error down to around plus or minus three percentage points; the smaller the sample, the wider the range within which the 'true' figure could lie.
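The arithmetic behind this is standard sampling theory, not anything specific to one pollster. As a rough sketch (assuming simple random sampling, which real weighted polls only approximate), the margin of error at 95% confidence for a reported share p from a sample of n is 1.96 × √(p(1−p)/n):

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95% margin of error, in percentage points, for a poll share.

    p: reported share as a fraction (e.g. 0.40 for 40%)
    n: sample size
    z: critical value (1.96 for 95% confidence)

    Assumes simple random sampling; real polls are weighted, so their
    effective margin of error is usually a little wider than this.
    """
    return z * math.sqrt(p * (1 - p) / n) * 100

# A 40% rating from 1,000 respondents: roughly +/-3 points either way
print(round(margin_of_error(0.40, 1000), 1))
# The same rating from only 250 respondents: roughly +/-6 points
print(round(margin_of_error(0.40, 250), 1))
```

Note that halving the error requires quadrupling the sample, which is one reason very large samples are rare: they cost a lot of money.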

3. Don't look at the snapshot, look at the trends over time. Even a poll with a dodgy methodology will, if applied consistently, show trends over time. You may not be able to say with 100% confidence that polling levels for Political Party A are at X percent, but you will be able to say whether support is rising or falling over time if the questions are the same.
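The point about trends can be shown with a toy illustration (the figures below are made up, not real polling data): a pollster whose method consistently overstates a party by the same amount will still report exactly the same month-on-month movement as an unbiased series.

```python
# Hypothetical example: a pollster with a consistent 5-point 'house
# effect' overstates the level, but preserves the trend.
true_support = [30, 32, 34, 33, 36, 38]        # actual support over six months
biased_polls = [s + 5 for s in true_support]   # same series, constant bias

def month_on_month_changes(series):
    """Differences between consecutive readings."""
    return [b - a for a, b in zip(series, series[1:])]

# The snapshot levels disagree, but the direction of travel is identical.
print(month_on_month_changes(true_support) == month_on_month_changes(biased_polls))  # True
```

The caveat in the text matters, though: this only holds if the methodology and the question wording stay constant, because a bias that shifts between polls shows up as a phantom trend.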

4. Beware of shy voters and the 'spiral of silence'. Research has shown that there is a tendency for some voters not to want to reveal a party preference they perceive to be unpopular or unfashionable. So in the UK, voters have been less inclined to declare their support for the Conservative Party, while there has been some discussion of whether in New Hampshire Barack Obama 'suffered' from an inflated support rating, as voters did not want to be perceived as having a racial bias against him when asked to choose between the candidates for the Democratic Presidential nomination.

So, polling is not an exact science, governed by arithmetical certainty. It's a social science, shaped by opinions and affected by direct interaction with fallible human beings. And that's why all opinion polls are wrong.