We’re used to hearing that a poll has a margin of error, but what does that mean? A margin of error of 3% doesn’t mean the poll’s numbers are definitely within 3% of the ‘real’ numbers; it means there is a 95% chance they are within 3% of those numbers. That is, one in twenty polls will be out from reality by more than 3%. One in twenty polls is a rogue poll, and there is nothing a polling company can do to prevent that. See those Xs on the graph of the last 22 polls below? That’s the Fairfax poll showing a 27% gap. Look how far it falls outside the polls before and since; an obvious rogue.
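The one-in-twenty figure falls straight out of the arithmetic. Here’s a quick simulation sketch (with a made-up ‘true’ support level of 50% and 1000 respondents per poll, typical for NZ political polls) showing both where the 3% comes from and how often a poll lands outside it:

```python
import math
import random

random.seed(42)

TRUE_SUPPORT = 0.50   # hypothetical 'real' support level, for illustration
N = 1000              # respondents per poll
POLLS = 10000         # number of simulated polls

# The standard 95% margin of error for a proportion: 1.96 * sqrt(p(1-p)/n).
# For n = 1000 and p near 50%, this comes out at roughly 3.1%.
moe = 1.96 * math.sqrt(TRUE_SUPPORT * (1 - TRUE_SUPPORT) / N)

rogues = 0
for _ in range(POLLS):
    # Each poll: N random respondents, each an independent yes/no draw
    yes = sum(random.random() < TRUE_SUPPORT for _ in range(N))
    if abs(yes / N - TRUE_SUPPORT) > moe:
        rogues += 1

print(f"margin of error: {moe:.1%}")         # roughly 3%
print(f"rogue polls: {rogues / POLLS:.1%}")  # roughly 1 in 20
```

Even with perfectly random sampling, about 5% of the simulated polls miss the true number by more than the margin of error; that’s the rogue poll, baked in by the maths.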
But polls can also be out if the sample isn’t random. Random doesn’t just mean the first 1000 people you can get to answer; it means your sample is not systematically different from the general population. Let’s take a bad poll to see how this can go wrong: David Farrar’s recent poll for Family First on smacking.
In the general population, 17% of people are over 60; 30% of the respondents to Farrar’s poll were. 37% of Kiwi families have children at home; in Farrar’s poll only 22% did. The other demographic data is also out. This means the group of people Farrar sampled is not representative of New Zealand, and the results may be wrong over and above the margin of error that is always there. Farrar’s poll shows reasonable support for smacking, but that support is especially strong in the over-sampled demographics (older people and those without kids at home).
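To see how a skewed sample moves the headline number, here’s a back-of-the-envelope sketch using the over-60 skew from the poll (17% of the population, 30% of the sample). The per-group support rates are invented purely for illustration, not taken from the actual poll:

```python
# Hypothetical support rates per group -- invented for illustration only
support = {"over_60": 0.80, "under_60": 0.55}

# Population vs sample shares for the over-60 demographic (from the post)
pop_share    = {"over_60": 0.17, "under_60": 0.83}
sample_share = {"over_60": 0.30, "under_60": 0.70}

# Headline number the skewed sample produces
raw = sum(sample_share[g] * support[g] for g in support)

# What the same support rates would give with population-correct shares
corrected = sum(pop_share[g] * support[g] for g in support)

print(f"skewed sample result:      {raw:.1%}")
print(f"population-correct result: {corrected:.1%}")
```

With these made-up numbers, over-sampling the strongly pro-smacking over-60s inflates the headline figure by about three points, entirely separate from the ordinary margin of error.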
Now, in America, polling companies use ‘witches’ brews’ of formulas to weight the demographics of their samples to match those of the general population. That can raise its own problems but, apparently, polling companies in New Zealand don’t even do that, meaning their chances of producing a skewed poll are that much higher. And don’t forget: polls are done by calling landlines, not everyone has a landline, and 70% of people refuse to take part in polls. That means the sample any poll ends up with may be attitudinally different from the Kiwi population in general.
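The simplest ingredient in those ‘witches’ brews’ is post-stratification: weight each respondent by (population share ÷ sample share) for their demographic group, so over-sampled groups count for less. A minimal sketch with a handful of made-up respondents, weighting on age alone (real pollsters weight on several demographics at once):

```python
# Made-up respondents for illustration: (age_group, supports_smacking)
respondents = [
    ("over_60", True), ("over_60", True), ("over_60", False),
    ("under_60", True), ("under_60", False), ("under_60", False),
    ("under_60", False),
]

# Population shares (the 17% over-60 figure from the post)
pop_share = {"over_60": 0.17, "under_60": 0.83}

# Shares actually observed in the sample
n = len(respondents)
sample_share = {g: sum(1 for a, _ in respondents if a == g) / n
                for g in pop_share}

# Post-stratification weight per group: over-sampled groups get weight < 1
weight = {g: pop_share[g] / sample_share[g] for g in pop_share}

total = sum(weight[a] for a, _ in respondents)
yes   = sum(weight[a] for a, s in respondents if s)

print(f"raw support:      {sum(s for _, s in respondents) / n:.1%}")
print(f"weighted support: {yes / total:.1%}")
```

Here the over-60s are nearly half the sample but only 17% of the population, so the weighting pulls the headline number well below the raw count. It’s no magic fix, though: weighting can’t conjure up the views of people who never answer the phone in the first place.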
On top of all this, not all polling companies are created equal. In New Zealand, Colmar Brunton is notoriously inaccurate in its political polling, leaning about 5% towards National, while Roy Morgan is the best on the major parties but over-polls the Greens. That comes down to methodology and, some have suggested, bias in the polling companies. At any rate, polls are likely to be well out from the true numbers. How far out, in total, were the final polls before the last election?
What does all this mean? Individual polls may not reflect reality, and a movement in results between polls, especially in the absence of a major political event (e.g. Orewa I), is more likely to result from normal sampling variation or a problem with the polls than from a change in the real support levels for the parties.
So, next time you see a 27% gap when there was a 15% one before, don’t get too excited.