I started thinking about writing this blog after Owen Jones suggested that Jeremy Corbyn supporters do not seem to care enough that Labour is not doing well in the polls. It is certainly true that some of the most recent poll results have not made particularly good reading for Labour supporters. However, the fetishisation of opinion polls is a dangerous game, and as somebody with over 20 years' experience of teaching research methods I thought I should perhaps share some of my knowledge more widely.
We need to be clear what an opinion poll can and cannot do. We also need to be clear what an opinion poll is actually telling us. Owen Jones is an astute and intelligent journalist, and should know never to take anything at face value. I have often made the point to my students that we tend to be more critical of polls we disagree with than of those that confirm what we already believed. I see this on social media, where both sides of the leadership debate quote polls approvingly only when they seem to show that their own side is right.
It is important to recognise that polls, particularly political polls, are not simply random acts of data, but are conducted to support or deny particular narratives. Poll figures are always a range, for reasons I shall explain. Yet when polls are reported in the press and through the media we are always given just the headline figure. A rather typical recent example was the Daily Mirror's headline "Labour sinks 16 points behind in grim new poll". This was from July 27th. In this poll of 2,012 people, conducted online, respondents were asked "If there were to be a general election tomorrow which party do you think you would vote for? Conservative/Labour/Liberal Democrat/Other?"; 43% said Conservative and 27% Labour. The implication is clear. This is bad news for Labour and particularly for Jeremy Corbyn, who is clearly to blame for this state of affairs.
The poll was conducted by ICM, who have been tending to show a greater lead for the Tories. It was also ICM who, you probably do not recall, had Labour consistently ahead in the run-up to the General Election and predicted a 35-35 draw for the election itself. In other words, they expected, and convinced most of us, that we were heading for a hung parliament. More importantly for my argument, the under-reporting of the Tory vote was around 2% and the over-reporting of the Labour vote around 5%. It is important to note that other polls were similar, so the actual vote was anywhere from 2-5% different from the polls. Hold that thought.
The problems that polling companies have to overcome are incredibly complex. It starts with the question asked. Respondents are asked to speculate about how they would vote in an event they know is not imminent. Most people will not admit that they have no intention of voting, so within the results are a number of respondents (and we can never know how many) who will not vote. We also know that many people do not decide how to vote until very close to the event. In other words, in asking people to think about the General Election we are, currently, asking them to consider an event that could still be 4 years away. Adding the "if it were tomorrow" qualifier is rather like saying "if you suddenly became 20 feet tall, would it make any difference to you?"
The problem of interpreting the question is compounded by the problem of sampling. Although statistically speaking 2,000 people are likely to be representative of the population at large, they are still 2,000 people who are prepared to be asked. Where particular groups, especially the young and ethnic minorities, are under-represented, the pollsters use weighting to make the results representative. So whilst 18-24 year olds may be around 9% of the population, the sample may only manage to find, let's say, 50 young voters who will talk to them. To make up for this, the weighting will mean that each of them is, effectively, counted almost four times over. This might not matter if the views of young people were homogeneous, but it could seriously bias the results. The more important point is that polls, by definition, do not include the views of those who refuse to be polled (but may vote), and that is problematic. The tendency of polling organisations is to treat non-responders as if they were the same as responders. In the absence of any data, what else can they do? But non-responders are likely to be disproportionately young and, where online polls are concerned, older.
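To make the weighting point concrete, here is a minimal sketch in Python. The numbers are made up for illustration, and this is not any pollster's actual weighting scheme, but it shows how a handful of young respondents can end up standing in for a much larger slice of the electorate:

```python
# Toy illustration of demographic weighting (made-up figures, not ICM's method).
# Under-represented respondents are up-weighted so the weighted sample matches
# the assumed population shares.

sample_size = 2000

# group: (assumed population share, respondents actually reached)
groups = {
    "18-24": (0.09, 50),
    "25-64": (0.66, 1450),
    "65+":   (0.25, 500),
}

for group, (pop_share, n_found) in groups.items():
    target = pop_share * sample_size   # how many the sample "should" contain
    weight = target / n_found          # how many times each respondent counts
    print(f"{group}: found {n_found}, target {target:.0f}, weight {weight:.1f}")

# 18-24: found 50, target 180, weight 3.6
# Each young respondent effectively speaks for nearly four people, so any
# quirk of those 50 volunteers is magnified in the headline figure.
```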
But does this mean that we should ignore the polls? All polls are subject to error: not simply the error of asking speculative questions of the wrong people, but the very real statistical margin of error. For most polls this is estimated at +/- 3 per cent. That is to say that if, for example, the Daily Mirror says that Labour has 27% of the vote, the real figure could be anywhere between 24% and 30%. In fact, the margin of error of +/- 3% is a common figure based on a random sample, but most polls, particularly online ones, are far from random, and the true margin of error could be much greater. So the fact that pollsters could be up to 5% out on a general election (incidentally, the only time opinion polls are actually tested) should not surprise us.
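For anyone who wants to see where these figures come from, here is the textbook calculation for a simple random sample, applied to the Mirror/ICM numbers. It is a best-case sketch: the poll was an online panel rather than a random sample, so the real uncertainty is larger than this suggests.

```python
# Textbook 95% margin of error for a proportion from a simple random sample.
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for an observed share p with sample size n."""
    return z * math.sqrt(p * (1 - p) / n)

n = 2012                    # sample size reported for the ICM/Mirror poll
for p in (0.43, 0.27):      # the Conservative and Labour shares
    print(f"{p:.0%}: +/- {margin_of_error(p, n):.1%}")

# 43%: +/- 2.2%
# 27%: +/- 1.9%
# The conventional +/- 3% quoted for polls is a rule of thumb for samples of
# around 1,000; for a non-random online panel even that is optimistic.
```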
But even accepting a margin of +/- 3% puts a different perspective on the polls. Let's be clear here: the Tories have been ahead of Labour for some time in most polls. Occasionally an individual poll will put Labour ahead, as happened with YouGov just before the EU Referendum, but looking overall at recent polls asking about voting intentions, most have the Tories ahead. What is less clear is just how far ahead they are, and how much that should worry those of us who would rather like to see the Tories out of power.
Let's see what the polls are telling us. To do this I will use the excellent UK Polling Report, which provides data on all the main polling organisations. If we go back to May, during the Referendum campaign, most polls put Labour somewhere between 2-6% behind the Tories. However, if we apply the +/-3% that changes quite dramatically. If Labour was doing 3% worse and the Tories 3% better than the published figures, then the Tory lead could have been anywhere between 8 and 12%. On the other hand, in the equally plausible possibility that Labour was 3% better than reported and the Tories 3% worse, the parties could have been level, or Labour could even have been ahead by up to 4%. Which one is correct? The simple answer is we don't know.
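To spell out that arithmetic, here is a small sketch using a hypothetical poll from the middle of that range. The point is simply that a +/-3 point error on each party's share translates into roughly a +/-6 point range on the lead:

```python
# How a +/-3 point error on each party's share widens the range of the lead.
# Hypothetical shares, in the middle of the range May's polls were reporting.
moe = 3                      # assumed margin of error per party, in points

con, lab = 36, 32            # example published shares: a 4-point Tory lead

narrowest_lead = (con - moe) - (lab + moe)   # Tories over-stated, Labour under-stated
widest_lead    = (con + moe) - (lab - moe)   # Tories under-stated, Labour over-stated

print(f"Published lead: {con - lab} points")
print(f"Plausible range: {narrowest_lead} to {widest_lead} points")

# Published lead: 4 points
# Plausible range: -2 to 10 points
# The same published poll is consistent with anything from a small Labour
# lead to a double-digit Tory one.
```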
The media tend to go for the result which backs their particular narrative. In this case, and increasingly since June, that narrative has been that Jeremy Corbyn is unelectable. That narrative has been aided by those within his own party arguing that this is the case, with leadership contender Owen Smith (see "Owen Smith says Jeremy Corbyn's principles are 'just hot air'") arguing precisely this during the leadership hustings.
Clearly poll ratings which seem to show Labour support ebbing away will be popular with a media which has decided that Jeremy Corbyn should not be allowed to lead the Labour Party. The reporting of polls should include all the relevant information. For example, in June, in the run-up to the EU Referendum, Labour was gaining on the Tories. It is worth noting that in the 8 polls published in June the Tory lead ranged from 0-5%. But, given the margin of error, Labour could well have been in front by up to 6 percentage points. This would have given a rather different complexion to Labour MPs' discontent with Jeremy Corbyn. Unfortunately, many of them are as innumerate as the general population (see What happened when MPs took a maths test).
So, in reply to Owen Jones, and possibly Owen Smith too, should we be worried by Labour's performance in the polls? Well, yes, it is not good to be behind, but we should bear in mind that polls are part of a wider narrative. If 8 polls in the space of 4 weeks can be widely divergent, and if the media refuse to explain that poll results are a range rather than an absolute score, we should treat the polls as what they are: a piece of political data that is fun to watch but not worth spending too much time obsessing over.