Why the polls suck.

Date published: 6:49 pm, September 20th, 2008 - 34 comments
Categories: polls

[Pie chart: Newswire poll participation rates]

Newswire have a new poll, “Is it a done deal? What the polls don’t tell us”, which makes for some interesting reading.

It was a limited poll done around Wellington on Monday using a true random polling technique. It also reported all of the figures including the numbers for which there was no answer.

As you can see from the pie chart at left, the proportion of those sampled who gave a party preference was only about a quarter.

The highest proportion were those who didn’t answer the phone. Because the poll was run on a single night, the reporters were not able to retry the phone numbers. People I’ve talked to say that polling companies often have to call a number up to seven times over a one- or two-week period to make contact. Even then there is a high proportion from whom they never get an answer.

But even this is not the full story. The methodology of this, and every other phone poll, has problems beyond the short survey period. It relies on landlines.

Selecting random numbers from the Wellington phone book and speaking to whoever answered the phone, 24 NewsWire reporters called 1147 residential numbers between 7pm and about 8.30pm on September 15.

It took that many calls to get answers from 770 people, with 200 of those who picked up the phone declining to participate.
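Those figures reduce to rates with a little arithmetic; a quick sketch using only the numbers NewsWire published:

```python
# Participation arithmetic for the NewsWire poll, using only the
# figures quoted above.
calls = 1147      # residential numbers dialled
answered = 770    # people who picked up
declined = 200    # picked up but refused to take part
participated = answered - declined

print(f"answer rate:        {answered / calls:.1%}")      # ~67%
print(f"refusal rate:       {declined / answered:.1%}")   # ~26%
print(f"participation rate: {participated / calls:.1%}")  # ~50%
```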

I just checked on the availability of landlines within the Wellington Central electorate. Of the 44,500-odd voters enrolled a few months ago, only 57% are contactable by landline using the white pages. Between cell phones, unlisted numbers, and people who simply don’t have a phone, the samples look highly selective and almost certainly self-selecting. Wellington Central is close to the average electorate for phone coverage. In Auckland the percentage of voters contactable by landline ranges from about 35% in South Auckland electorates to 70% in North Shore electorates.

In my opinion, most of the published polls are about as useful as online polls. Read 08Wire’s take on those in “Memo to Whale: Online Polls Stink”. I particularly liked the story of how Hank the Angry Drunken Dwarf won an online poll for most beautiful person of the year.

The published telephone polls are self-selecting samples of who has a phone, who is willing to answer, and who has a decided opinion. At best they are indications of trends in a poll series from a single company. Different methodologies between companies make cross analysis of polls almost useless.

At worst? Well, just read the pontificating about polls, and across poll series, by the local news media. You get the distinct impression that they consider the polls to say absolutely what the electorate will do, and they treat the published error rates as gospel. It would be interesting to see their reaction if they were exposed to some of the requirements of a valid statistical study, and then shown how far their phone polls deviate from them. In the end there is only one poll that counts, and it happens on November 8th.
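For reference, the published error rates the media treat as gospel come from the textbook simple-random-sample formula; a quick sketch of that calculation (note it assumes a true random sample, which is exactly what these phone polls are not):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Maximum 95% margin of error for a proportion in a simple random
    sample of size n -- the figure polls publish as their error rate."""
    return z * math.sqrt(p * (1 - p) / n)

# Typical published poll sample sizes
for n in (500, 750, 1000):
    print(f"n={n}: +/-{margin_of_error(n):.1%}")
```

The formula shrinks with sample size, but it says nothing about coverage or non-response bias.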

Chris Trotter also comments on this poll in Counting the horse’s teeth. It appears that Newswire is produced by the Whitireia Journalism School. Perhaps a new generation of journalists who are better informed?

Update: Matthew Hooton misses the point of this post (as usual with his comments about this site).

hat-tip: bill browne

34 comments on “Why the polls suck. ”

  1. randal 1

    of course the polls suck lprent but dont worry about it. if it were a done deal then the natty dreads would not be so evil and vicious. they can smell defeat and it is driving them crazy.

  2. jaymam 2

    A truly random dialling of landlines depends on having the valid range of phone numbers for each exchange updated constantly, and updating the program that does the selection. I doubt very much that the polling companies have updated the ranges for many years.

    That means that new exchanges and outlying areas of cities are not dialled at all. Quite probably the owners of those landlines are not rich pricks and would therefore be lefties.
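    A sketch of the random-digit dialling being described (the exchange prefixes and number ranges below are entirely made up for illustration):

```python
import random

# Hypothetical random-digit-dialling sketch. The prefixes and number
# ranges are made up; a real pollster needs the current allocation for
# every exchange -- a stale table silently excludes whole suburbs.
EXCHANGE_RANGES = {
    "04-384": (1000, 9999),   # assumed inner-Wellington block
    "04-567": (2000, 7999),   # assumed Lower Hutt block
    # an exchange missing from a stale table is simply never dialled
}

def random_number(rng=random):
    prefix = rng.choice(list(EXCHANGE_RANGES))
    lo, hi = EXCHANGE_RANGES[prefix]
    return f"{prefix}-{rng.randint(lo, hi):04d}"

print(random_number())
```

    If the table is years stale, newly allocated blocks never appear in `EXCHANGE_RANGES`, so their owners can never be sampled.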

  3. Byron 3

    Might want to fix up this typo

    “done around Wellington on Monday around Wellington”

    Also, “As you can see form the pie chart left, the actual number of people who gave a party preference were less than a quarter of those samples.”

    28% > 25% (1 quarter)
    So its slightly over a quarter

    Sorry to nitpick

    [lprent: thanks. I wrote this just after getting up from a afternoon snooze. There was insufficient coffee in the system. ]

  4. Gooner 4

    I agree Lprent.

    I mean, from what I am being told on the hustings ACT is going to poll 20%+. The polls that have them @ 2% are just plain wrong.

  5. lprent 5

    I suppose that if you canvass inside of Act meetings you could get that high?

    But the post’s focus was more on who doesn’t get polled or doesn’t give an opinion. I didn’t even mention the percentages for each party.

  6. Felix 6

    Staying awake too long allows really insane thoughts to pass as if they were quite reasonable thoughts.

    I think the Actoids need to lay off the party pills for a bit and have a wee nap.

  7. Razorlight 7

    Why can it not be reasonably assumed that those who don’t get polled or don’t yet have an opinion will not vote along the same lines as those who have been polled and do have an opinion.

    I know there is that argument that poor people do not have landlines etc, but other than that I do not see why poll results are dismissed by those who trail.

  8. Quoth the Raven 8

    I smoked P and I’m alright
    Got on the P stayed up all night
    I’m gonna vote Act cos’ they’re right
    Nanananananana nanananana

  9. theodore steel 9

    Perhaps it would be useful to actually get a statistic to show how those without landlines do actually vote.

    I respect the assumption they would vote left-wing, as they are poor and it is widely assumed the left do more for the poor. But it is just as naive as taking the polls as read to even assume those without landlines vote at all. I mean, they are obviously removed from society so much as to not have a phone, surely they would then have a very high rate of non-participation in voting.

    Also it may be those who don’t answer are the “busy, upper class, 80hr week rich pricks without time to do polls”.

    Sure polls are faulty, but it can’t be purely assumed they are biased one way or another. Because everyone knows assumptions are baseless.

  10. Tim Ellis 10

    Those are interesting results, lprent, and you do us a good service by writing about it.

    It would seem that the conclusion you are drawing is that poll results are unreliable. Given that phone line access hasn’t changed dramatically in the last three years, it’s safe to say that the same statistics would have applied to the integrity of the polling information available at the last election.

    Yet as Hooton has pointed out, the poll results from 14 public polls published in the last three weeks of the campaign in 2005 closely mirror the actual election results.

    Average them out and you get:
    National 40.70%
    Labour 40.59%
    Greens 5.54%
    NZ First 5.29%

    The election results were:
    Labour 41.10%
    National 39.10%
    NZ First 5.72%
    Greens 5.30%

    We know that during the last ten days or so in the election campaign in 2005, Labour picked up support at National’s expense. This might explain the very slight difference between the average poll result, and the election result. Or it might suggest small sampling errors or methodology errors in the polls in 2005.
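    Putting the two sets of figures above side by side makes the gaps explicit (numbers copied from the comment; this is just arithmetic):

```python
# The 2005 poll-of-polls averages and election results quoted above,
# compared party by party.
poll_avg = {"National": 40.70, "Labour": 40.59, "Greens": 5.54, "NZ First": 5.29}
result   = {"National": 39.10, "Labour": 41.10, "Greens": 5.30, "NZ First": 5.72}

for party in poll_avg:
    diff = poll_avg[party] - result[party]
    print(f"{party:8s} poll {poll_avg[party]:5.2f}  actual {result[party]:5.2f}  error {diff:+.2f}")
```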

    We know that individual polls frequently have methodology differences and sampling errors (including, very possibly, the one you’ve quoted). But taken as a whole, they do seem to represent a very close image of public opinion.

    There are real problems with taking an individual poll result and claiming it as the definitive snapshot of where people are at. There are margins of error. There are also rogue results. But they don’t seem to be nearly as common as some people would like to believe. Averaged out, the poll results taken as a whole have proven time and time again to be an exceptionally accurate picture of public opinion.

    The Labour Party has spent enormous amounts of money on UMR over the years, and UMR provides probably the best polling service of any pollster. Their polling methodologies are not dramatically different to any other polling company: they simply poll more people more often. Helen Clark probably gets poll results on her desk every few days, if not daily. There is probably no politician better skilled at analysing and interpreting poll results than Helen Clark. She wouldn’t do this, and the Labour Party wouldn’t be paying for it, if it was a waste of money.

    It seems to be quite convenient to deny the validity of polls when the party you favour is consistently getting poor results, poll after poll. Yet that seems about as logical to me as living in a state of denied reality.

  11. lunaspark 11

    As a first-time poster, medium-term reader, I have to say firstly I appreciate the effort the posters behind The Standard (I know it’s just a program on a computer somewhere) are making. But I really do have to take issue with some of the odd conclusions made by the very same posters.

    Firstly, it’s disingenuous to reject a professionally executed poll for being inaccurate, only to present your own non-scientific telephone poll of less people than any genuine poll would canvass. A few years ago I did a stint at a polling company, and the rejection rate was far beyond what your poll is reporting. Your poll is reporting a 25% answer rate – whilst our targets were a mere three answers per hour. Imagine how many calls you can make in an hour, then realise only three would actually agree to talk to you, if you were reaching target. Which was rare.

    And this was ‘professional’ polling, which took into account age, income, location, etc.

    So I find it highly unlikely that a full quarter of people would give you a preference off the bat over the phone.

    I do agree that polls do veer slightly more to the right than reality, but giving off a false image of complacency is dangerous, is it not?

  12. monkey boy 12

    I think that the polls are horse-pooky. What will damage Labour on election day is the previously party-faithful who will stay at home rather than vote. It might be worthwhile to establish how many people actually decide to vote, regardless of whether they are decided about which party. ‘Undecided’ and ‘declined’ is a bad result for Labour if this poll were to represent a typical Labour electorate.

  13. jcuknz 13

    Once upon a time, way back before ACT was ever thought of and we probably had Labour’s Rogernomics running things, my wife was polled on her buying patterns. In addition there were political questions and she let me answer them. I was mildly keen on Winston Peters and voted for him in the polls and he rode high. My wife and I separated and I no longer helped her with the polls. Winston has been falling in grace ever since then 🙂

    So much for polls 🙂

  14. lprent 14

    Razorlight:

    Why can it not be reasonably assumed that those who don’t get polled or don’t yet have an opinion will not vote along the same lines as those who have been polled and do have an opinion.

    That would be the case if the spread of the lack of landline access was even. It isn’t.

    I get to play with the data from the electorate I’ve been helping for most of the last couple of decades. Obviously one of the things I’m interested in is the correlations. When you look at the data, two things pop out on landlines.

    Firstly, there is a strong correlation between age and landlines. Essentially, if you are under 40 your probability of having listed access to a landline drops dramatically. If you are in your late 20s it is about half the average for the electorate. Conversely, if you are over 65 and under about 80, the frequency of having a landline is likely to be over 80%.

    Secondly as you noted there is a correlation with income. That shows up at both ends of the spectrum. If you correlate against the census mesh block data, areas with low household income are less likely to have land lines. It is less dramatic than the age data inside the electorate. In the small areas with very high income, listed phone lines also drop. These are probably unlisted numbers (like mine).

    However, when you look at the South Auckland / North Shore percentages with phone lines, the differences on income become obvious. Households in South Auckland electorates are half as likely to have landlines as those in the North Shore electorates.

    I’ve only done skims of other city data, but the same kinds of patterns show everywhere. Landline access has strong correlations with age and income as the defining characteristics.

    Please no-one argue that the income levels between the two areas are similar.
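    The skew being described can be illustrated with a toy simulation: two equally sized voter groups, with listed-landline rates loosely echoing the figures above (the exact rates are illustrative, not real data):

```python
import random

random.seed(1)

# Toy simulation of coverage bias. Two equally sized groups; the
# listed-landline rates (35% and 70%) loosely echo the South Auckland /
# North Shore figures quoted above, but are illustrative only.
population = (
    [{"group": "young/low-income", "listed": random.random() < 0.35}
     for _ in range(50_000)] +
    [{"group": "older/affluent", "listed": random.random() < 0.70}
     for _ in range(50_000)]
)

reachable = [p for p in population if p["listed"]]
share = sum(p["group"] == "young/low-income" for p in reachable) / len(reachable)

print("young/low-income share of population:      50.0%")
print(f"young/low-income share of landline sample: {share:.1%}")
```

    The group making up half the electorate ends up as only about a third of the landline-reachable sample.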

  15. lprent 15

    Tim: Since I’ve been involved with phone canvassing (about 15 years) in my home electorate the percentage of listed phone lines has dropped roughly from low 80% to 55% now. In the 2005 election it was about 64%. The rate of change is accelerating fast. In my opinion largely due to the cost of cellphones relative to landlines and relative ages.

    Currently polls are useful for trends, which is why Helen probably looks at the UMR data. If I was looking at the polling data in the way that you say Helen is, I’d be looking at the effects of specific proposed or announced policies in specific sub-samples. If it indicates a positive (or negative) change in voting behaviour, then that is useful information.

    You can see me pointing out in comments the delta change of series of successive polls. A series like -3%, +5%, +1%, 0% in successive polls from the same company targeted in the same way is useful (if they haven’t changed their methodology). It shows the trend especially when read in conjunction with dates, events, and policy releases. However the absolute percentages are rubbish for actually figuring out an outcome. That is what most of the media do – ie saying how many seats each party would get.
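    A delta series like that is trivially computed from the successive poll levels (the levels below are hypothetical, chosen to produce the deltas in the example):

```python
# Hypothetical successive poll results from one company, and the
# poll-to-poll deltas read for trend.
polls = [44.0, 41.0, 46.0, 47.0, 47.0]
deltas = [b - a for a, b in zip(polls, polls[1:])]
print(deltas)   # -> [-3.0, 5.0, 1.0, 0.0]
```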

    Averaging the various polls tends to give better results. However if the base data that each poll is based on has an inherent error (land line access) then they’d all be biased. Personally I don’t think that the increasing accuracy of the polls in 2005 was as much due to policy as to the undecided becoming decided. But that is just opinion.

    Essentially what I’m arguing is that the interesting part of the election this time around is going to be in the high proportions that don’t get polled or don’t answer. As you pointed out, a 2% change between the average of the sampling polls and the real poll was sufficient to change the outcome of the election.

    People who do canvassing know this because what we see when we’re phone canvassing and especially door knocking varies a *lot* from published polls all of the time. As the elections go by I keep seeing bigger and bigger variances between the polls and the canvassing. I expect the variances to get bigger and bigger as people drop off listed land-lines.

  16. randal 16

    the polls are just another right wing media splurge. Once upon a time they meant something but these days its just more political hanky panky and meaningless post modern blather about sweet f.a. they have come to represent wishful thinking rather than a truly indicative sample and this will be seen on the day. National have been going on and on and ON about the polls since christmas trying to force an early election but they are not making so much noise now.

  17. Bill 17

    And then there is the unfortunate effect of consistently weighted polls being fed to the electorate night after night: they have a tendency to become self-fulfilling prophecies.

    I have no doubt that the media, to an extent, takes its cues from polls and ‘falls in line’ with the general sentiment they (the polls) seem to express, thereby reinforcing the impression that ‘everybody’ is thinking x, y or z.

    Broadcast x, y or z into people’s living rooms on a regular basis and pretty soon ‘everyone’ will gear their opinions to correlate in one way or another with x, y or z.

    This goes beyond mere party votes, but to underlying perceptions of such things as crime or unemployment; their relative importance in the scheme of things and the parameters for discussing or dealing with such phenomena. To the extent that these are embodied within a party vote, the limited discourse is further coloured by the media taking cues from the polls and being more sympathetic to a particular line or party in the interest of reflecting public opinion.

    And sadly, the parties themselves shape policy in reaction to polls in an effort to reflect general sentiment, and so we end up with a lot of things being out of touch with what people actually think, or people eventually adopting the discourse presented to them against their natural instincts.

    Or something like that…

  18. lprent 18

    Hi luna:

    Firstly, it’s disingenuous to reject a professionally executed poll for being inaccurate, only to present your own non-scientific telephone poll of less people than any genuine poll would canvass.

    I deliberately didn’t show the party detail of the poll for that reason.

    What I’m arguing is that the activists and members of the left should largely ignore the absolute values in the polls and just get on with the campaigning for this election. The polls have some fundamental flaws and are skewing further away from reality as the number of landlines reduces. The MSM’s interpretation of polls just shows their inability to understand the limits to sampling.

    What interested me in this poll was that they published the numbers they were unable to contact, the numbers who refused to answer, and the undecided. If you have a look at the graph at the top, those are the only numbers I showed.

    The technique that the polling companies use is meant to be pretty much the same as the Newswire poll, but using random phone numbers covering the whole country. They apparently will then call those selected numbers multiple times to get a response. Typically they will not just be asking about party preferences; they will also ask about a number of other matters.

    Essentially the only real differences in the poll that the journalism school did were that they rang a local area, and only tried each number once. They were also, by the look of it, only asking one question.

    Pollsters ringing me always seem to start with a statement something like “this will only take 10 minutes” at which point I’d usually decline to answer. When we run canvassing phone runs the answering profile is much more like the one listed above because we’re usually only asking that one main question. You get *much* smaller rejection rates.

    What I’m arguing is that the published polls usually report only one of the figures above. There is only one poll (TNS?) that reports the undecided. None of them give numbers for declined or unable-to-contact. Those factors are actually critical to knowing what the validity of the poll is.

    I’m also arguing that the whole concept of phone polling for predicting election outcomes is almost redundant because of the ever decreasing numbers of listed phone numbers.

    I’m sure that the pollsters are as professional as ever, but they should be looking for a different technique. Phone polling isn’t going to be viable in a few years. It is not particularly viable now.

  19. Anita 19

    Razorlight,

    Why can it not be reasonably assumed that those who don’t get polled or don’t yet have an opinion will not vote along the same lines as those who have been polled and do have an opinion.

    We can divide voters into four groups:

    1) Core voters – voted Labour/Nat/whoever last election, will vote the same way this election, know who they’ll vote for in 2011.

    2) Swung voters – voted one way last election, have made a decision to vote a different way this election.

    3) Cusp voters – know they will vote Labour-or-Green or National-or-Act or NZFirst-or-Kiwi or whatever their personal pairing is. Will decide closer to the day which way to jump.

    4) Vague voters – genuinely don’t have a clue, will make up their minds at the last minute or not vote at all.

    The core group is the solid part of polling. Core voters who bother to answer the poll will answer the voting preference questions. Swung voters are similar, although some will feel some discomfort (particularly if asked previous voting patterns), but generally they’re giving voting preferences right now.

    The final two groups, however, are responding to voting-preference questions at much lower rates.

    So the question is, will cusp and vague voters vote in the same ways as core and swung? I would argue that they won’t.

    Cusp voters are generally looking at smallparty–bigparty cusps, and smallparty–smallparty cusps to a lesser extent. So we should expect them to have higher rates of small-party voting than the current polling shows.

    Vague voters; I really don’t know. My hunch is that the genuinely vague are the most disconnected from national politics and the media. My hunch is that they’re poorer and more disenfranchised. They’re the people Labour will try to get to the polling booth, because if they get there, they’re more likely to vote Labour.

  20. randal 20

    Lprent…you are right on the button. this election the right have relied on anything except genuine policy. they have used a whole battery of so called experts, pollsters and ducked and dived rather than face up to any issues. All the Labour Party has to do is get the troops out and keep talking to the PEOPLE about what a national victory would mean for their hopes and aspirations. National has become a party of chisellers. They are supposed to be the party of business but they cant generate any new business. all they can do is take more than their fair share of the incomes generated by productive New Zealanders. they want the lot. they want villas in the south of france and his and her bentleys and townhouses in London. As soon as they get rich they want to cut and run. Basically they are parasites and if they get their way then Kiwi workers will be screwed down to subsistence level while they enjoy the HIGH life.

  21. Pat 21

    Randall – when the troops are talking to the people, do they tell them who will become PM when Helen retires?

  22. randal 22

    pat only john keys cares about that because it will never be him!

  23. Pat 23

    Randall – for sure, Labour are going to romp in. But does that mean the PM will be Cullen, or Goff, or Mallard? Surely Labour voters should know which PM they are voting for.

  24. randal 24

    pat people vote for policies. if you are so shallow to think that politics is a personality contest then you should be spending your time following beauty contests or watching reality teeveee. get a life. Labour has the policies and thats what counts.

  25. randal 25

    pat …national have tried to presidentialise the elections in recent times and they have failed miserably and they will fail again. the basic premise of our political system is representation by member of parliament and national should remember that. however they take their lead from the rightwing nutbars in the US that dont believe in government anyway so they will reap what they have sown. nothing.

  26. Pat 26

    Who do you think will get the PM job when Helen retires, and why?

    My money is on Cullen. I don’t think he is ready to retire yet (he can point to McCain’s age as a precedent) plus I reckon he could pull together the numbers to fend off Goff.

  27. Matthew Pilott 27

    Pat – off by a mile. I could drop the most obvious name, but I’m enjoying your guesses. And the assumption that Clark would step down if Labour win.

  28. Tim Ellis 28

    LP, that was a very thoughtful response.

    Currently polls are useful for trends, which is why Helen probably looks at the UMR data. If I was looking at the polling data in the way that you say Helen is, I’d be looking at the effects of specific proposed or announced policies in specific sub-samples. If it indicates a positive (or negative) change in voting behaviour, then that is useful information.

    I disagree, LP. That’s what you use focus groups for, which UMR certainly does do for the Labour Party. A poll might include fifteen questions; a few more for a purely political poll (as used by political parties) to drill down on attack lines, a few less for a major polling company such as Colmar Brunton or DigiPoll, where political questions are included in wider consumer surveying to make up their political poll. Most of the published polling we see is part of a wider consumer survey.

    You can see me pointing out in comments the delta change of series of successive polls. A series like -3%, +5%, +1%, 0% in successive polls from the same company targeted in the same way is useful (if they haven’t changed their methodology). It shows the trend especially when read in conjunction with dates, events, and policy releases.

    I disagree LP. A series like 53%, 47%, 53%, 47% in successive polls from the same company with a sample of 500 may not show any bumps at all in support: it probably shows quite a steady level of support. Each of the differences is likely to be within the margin of error. If you are relying on one poll to base a trend, then you are dealing with what is probably a relatively high margin of error in each of them. Polling companies do not call the same people: if you generated a sample of people, and tracked that same sample over the period, then you could probably say that individual policies have had an influence on the difference.
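    The arithmetic behind this: at n = 500 the 95% intervals around 53% and 47% overlap, so the apparent swings are compatible with steady support:

```python
import math

# Alternating 53% / 47% readings from samples of 500, as in the
# example above: the two readings' confidence intervals overlap.
n = 500
moe = 1.96 * math.sqrt(0.5 * 0.5 / n)   # maximum 95% margin of error

print(f"margin of error at n={n}: +/-{moe:.1%}")
print(f"53% reading interval: {0.53 - moe:.1%} .. {0.53 + moe:.1%}")
print(f"47% reading interval: {0.47 - moe:.1%} .. {0.47 + moe:.1%}")
```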

    A consolidated combination of a range of poll data (i.e., polls of polls), with a much larger combined sample, will mute any sampling or methodology errors.

    However the absolute percentages are rubbish for actually figuring out an outcome. That is what most of the media do – ie saying how many seats each party would get.

    I disagree here as well. The 2005 election result was remarkably close to a poll of polls of the last 3 weeks of the election campaign.

    Averaging the various polls tends to give better results. However if the base data that each poll is based on has an inherent error (land line access) then they’d all be biased.

    You have said that land line access is an inherent error. I don’t know where you get the basis for this. You have said that landline access has decreased marginally over the last three years, but it doesn’t seem to me to be so consequential as to dramatically undermine polling integrity.

    Labour and National were last neck and neck in consolidated polling in January 2007. Landline access hasn’t changed dramatically since then. If low landline access creates a bias against the Left in poll results, then that doesn’t explain why in January 2007 the consolidated results were so close.

    I suspect also that landline access issues are a bit of a myth, for two reasons. The first one is that landline access is as low among medium-high income apartment-dwellers as it is among low-income tenants. The second reason is that polling companies, when they take their samples, weight their results accordingly. They don’t simply say: “Right, we’re going to conduct two thousand phone calls, and take down the results of anybody who answers” (which the survey linked to this article, ironically, confesses to doing). They ensure their responses match the population in general, including age, ethnicity, income, geographical region, and historical voting preferences. It just isn’t credible to say that poor people miss out on getting surveyed, because polling methodology requires that they be included.
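    A minimal sketch of the weighting being described (post-stratification; the cell names, population shares, and sample counts are illustrative only):

```python
# Post-stratification sketch: up-weight under-represented cells so the
# weighted sample matches known population shares. All numbers below
# are illustrative, not real survey data.
population_share = {"under-40": 0.45, "40-plus": 0.55}
sample_counts    = {"under-40": 120,  "40-plus": 380}   # landline sample skews older

n = sum(sample_counts.values())
weights = {cell: population_share[cell] / (sample_counts[cell] / n)
           for cell in sample_counts}

for cell, w in weights.items():
    print(f"{cell}: weight {w:.3f}")
```

    Note that up-weighting only rescales the respondents who were actually reached; it cannot say anything about the under-40s who have no listed landline at all.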

    As you pointed out, a 2% change between the average of the sampling polls and the real poll was sufficient to change the outcome of the election.

    I’m not saying that there is no sampling or methodological bias. There probably is. But if it exists, it is within a 2% range, as we saw at the last election. It isn’t the ten percent range that might suddenly be made up between Labour and National this election. If the polls show a 15 point gap between Labour+Greens and National+Act in the week before the election, then really you’re just living in dreamland if you think that can be explained away by landline access.

    People who do canvassing know this because what we see when we’re phone canvassing and especially door knocking varies a *lot* from published polls all of the time. As the elections go by I keep seeing bigger and bigger variances between the polls and the canvassing.

    When you’re spending an afternoon doorknocking, you’re only doing so in a single meshblock. It is highly likely that within a statistical meshblock you’re going to see very little variance in voting preferences. Which is why if you’re door-knocking Victoria Avenue in Remuera, you get 80% voting National, or 80% voting Labour in many parts of Mangere. But you would have to be quite deluded to say that the voting pattern in the four hundred doors you’ve successively knocked on is representative of the voting pattern generally, because that isn’t sampling at all.

  29. Anita 29

    Tim Ellis,

    A consolidated combination of a range of poll data (i.e., polls of polls), with a much larger combined sample, will mute any sampling or methodology errors.

    Unless they have the same sampling or methodological errors. Given they’re all outbound phone polling we can comfortably assume they have some sources of errors in common.

  30. Anita 30

    My main frustration with the published polls is that they don’t report their “would not answer” and “did not know” numbers. We can adjust in our heads for the socioeconomic biases of a phone poll, but we can’t adjust for information they don’t give us.

    The current published polling means something very different if 95% of the electorate has decided than if only 60% has.

  31. Pat 31

    Matt P – I’ll stick with Cullen. Can’t see anyone brave enough to take him on. If Pascal’s Bookie is really a bookie, maybe he could set some odds and take bets.

    Helen would step aside end of 2009 or early 2010.

  32. lprent 32

    Tim: I’m short of time, but I’ll cover a couple of points, but won’t go through point by point.

    Listed land-line use hasn’t just reduced marginally – which appears to be the basis of your underlying argument. It has plummeted. In 1996 it was about 79% in my electorate, now it is 53%. In 1996 Mangere was close to 60%, now it is about 35%. In the North Shore it was in the high 80s, now it is about 68%. The same trend is happening over the whole country and it is accelerating.

    As Anita says, if there was an inherent sampling error common to all of the polling companies, then a poll of polls merely compounds the error.

    That is exactly what I’m asserting – there is an inherent bias in the poll methodology. Using landlines is heavily biased against younger people, heavily biased against people on low incomes, and for that matter heavily biased against people I know (who you can’t find in phone books – they’re technophiles). The reduction has been considerably less in areas that vote conservative, and considerably more in areas that vote progressive.

    If the polling companies weight on demographics, then the effect gets compounded, not reduced. Imagine that you’re looking for 25–30 year olds by phone – hard to find. You’re likely to wind up with someone who does things traditionally, like having a listed land-line, and then multiplying their responses up.

    Similarly, if you get someone in Mangere by phone, what is the bet that you get someone who is both relatively affluent AND conservative? They have a listed landline. As far as I can see, having a listed landline is an indicator that you’re more likely to be affluent, technophobic, and conservative.

    Of course the polls close towards reality the closer you get towards an election. You get more of the undecided answering (which is why the missing figures are more critical than who answers). I’ll give you a guess who I think is more likely to answer polls further away from an election. They aren’t younger left voters or older less affluent voters – they’re too busy surviving or doing their own thing to think about politics. It is the social conservatives…

    That is why both you and Hooton quote polls close (3 weeks) to the election when you start talking about poll accuracy. Hell, they even start closing up between polling companies before the election. Try looking at the polls further away from the election, say at 8 weeks. Look at how much variance you get – that is undecided voters. This year has already shown far more volatility between polls than past election years. I think the reason is the steadily reducing number of listed landlines.

    If the polls are wide really close to an election then I’d get more concerned. But at present I see media holding on to the polls as if they are the holy grail. It annoys me because that is not what I’m seeing in the phone polling we do, or the door knocking to fill in the areas that don’t have phones. It is a far more robust technique than pollsters use, because we’re targeting the waverers and the enrolled non-voters.

    It is going to be a hellishly close election when you factor in the coalition politics. Especially since the undecided are higher than I’ve ever seen in the last 18 years. It is all going to come down to turnout.

    Which of course is where the polls come back in – they’re effective at stopping people voting. That’s why Hooton likes them; it helps with the spin.

  33. jcuknz 33

    I think the real indication of who will be PM if Helen might step down would be to have it on the Victoria University poll which you find mentioned at kiwiblog, I forget the name of it. They suggest it is more accurate because to participate you have to put your money where your mouth is …. bit hard on $13/hour though .. not for overpaid politicians.
