Opinion Polls

This chapter on opinion polls could well form part of the media chapter, as opinion polls are part and parcel of the media, especially in an age when 24-hour reporting and the need for clickbait call for more details to be given than are actually known. Opinion pollsters fill that gap in knowledge by summarising the opinion of a few thousand respondents and extrapolating the results to the entire population. This role of filling the gap in actual knowledge has been particularly relevant in the inquest on the referendum result. A poll was published shortly after the result by the Conservative Party peer and pollster Lord Ashcroft. Media, politicians, and activists have quoted his early findings as if they were facts, although remain supporters counter them with a subsequent poll that goes against his finding that most young people did not vote. That hunting for the number that suits the argument shows the way in which opinion polls are exploited by the media and political worlds without actually providing much by way of guidance as to what actually happened in the voting booths on 23 June 2016.

There are two main observations to make about political opinion polls in the UK: they serve politicians and pundits well, and they are invariably wrong. There is one recurring headline after an election or referendum in the UK and that is The Opinion Polls Got It Wrong. The first major poll error came, ironically enough, with Major in 1992. John Major had succeeded Margaret Thatcher as Conservative Party leader and prime minister after she was forced from power by the europhile wing of the party. Major was relatively inexperienced in cabinet and faced the prospect of leading a party that was split not only along europhile and eurosceptic lines, but also over the coup against Thatcher, who had been prime minister for the previous eleven years. Opinion polls accordingly pointed towards Major's defeat in the 1992 general election, which he had postponed for as long as was legally possible. Labour leader Neil Kinnock was expected to become prime minister, but Major won with a relatively narrow majority of twenty-one seats.

Opinion poll companies held a review in 1992 into why they called the election so wrong and concluded that there was an issue of shy Conservatives. This was the idea that Conservative policies had become so unpopular in the previous thirteen years of rule that many of those interviewed were unwilling to admit to pollsters that they intended to vote Conservative. Unfortunately for the pollsters their predictions did not improve thereafter, with failures continuing right up to David Cameron's equally surprising victory in the 2015 general election. Of course, Cameron's victory was only surprising if you believe opinion polls, which most voters in the UK gave up on more than twenty years earlier.

So why do opinion polls still dominate the headlines if they are fairly useless at predicting what the electorate will do at the ballot box? The simple answer is that they give politicians and pundits something to talk about. So much in the political realm is full of uncertainty, but uncertainty does not sell newspapers or gain clicks on media websites. Opinion polls may bear little resemblance to reality and be proved ineffectual time and time again, but they provide a definite number that journalists can talk and write about. The number might be wrong, but that is unimportant if the twenty-four-hour news programme has its content, newspapers are sold in greater numbers, and media websites gain more clicks. That explains why, within hours of the referendum result confounding the pollsters' predictions yet again, politicians and pundits were quoting opinion poll numbers.

For the first few weeks after the surprise (to those who believe opinion polls) referendum result many certainties were declared by activists, politicians, and pundits. A key example was that old people up north had voted to ruin the future of young people all over the UK. As time wore on it became apparent that these certainties were houses built on sand with the tide coming in. Media interviews with young people who voted leave and old people who voted remain began to raise questions about those opinion poll certainties. However the lack of certainty continued to pass most activists, politicians, and pundits by, as the poll numbers continued to be a convenient untruth. To coin a phrase: 80% of opinion poll results may as well be made up out of thin air and the remaining 20% are hampered by dodgy statistics. Yet still political parties and media organisations will pay a lot of money for opinion polls, because they want some evidence to work on, even if they know that the evidence is not worth the paper it is printed on. Below I will discuss some of the reasons that opinion polls are so poor at polling opinion.

Populations

Over 33 million people voted in the referendum, but most opinion polls only interview about 2,000 people and then extrapolate the result to the total population. Opinion polling companies make their money from being better at selecting the population to interview and better at weighting the results. The problem is that they are not very good at either task, although they are very good at marketing their services.
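
To put that sample size in context, the textbook margin of error for a genuinely random sample can be worked out in a few lines. The sketch below is a back-of-the-envelope calculation rather than anything a polling company publishes, and the crucial word is random: note that the size of the full electorate barely enters the arithmetic at all.

```python
# Textbook 95% margin of error for an estimated proportion p from a
# simple random sample of size n: z * sqrt(p * (1 - p) / n), widest at p = 0.5.
from math import sqrt

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """95% margin of error, assuming a genuinely random sample."""
    return z * sqrt(p * (1 - p) / n)

print(f"n = 2,000:  +/- {margin_of_error(2000):.1%}")    # about +/- 2.2%
print(f"n = 10,000: +/- {margin_of_error(10000):.1%}")   # about +/- 1.0%
```

On those idealised assumptions a 2,000 person sample is perfectly respectable. The problems discussed below arise because the sample is not random, not because it is small.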

A good mix of the population can be achieved by the telephone directory method, in which people are phoned at random and therefore no prejudice of the pollsters can intervene. The problem with that method is that it generally involves calling landlines, which many people can no longer afford. Even for those who do have a landline, the plague of cold-calling marketing companies means that many refuse to answer it, on the assumption that a friend or relative will leave a message and can be rung back. The opposite approach is the carefully selected focus group, which balances different ethnicities, genders, and political affiliations. The problem with the focus group method is that it balances the group according to the pollsters' preconceived notions of the make-up of the electorate.

When opinion polls were cited by the media during the referendum campaign they would normally reveal whether it was a telephone or online poll, as the former tended to favour remain and the latter leave. The judgement was made that the telephone polls were more reliable because it was feared that the online groups had been infiltrated by leave activists, which was not a glowing commendation of the population-selecting talents of the polling companies. In any survey accuracy is better with a totally random sample population, but people who own a landline, are indoors, and are willing to answer it do not constitute a totally random sample.

Even once a population is sampled there is the further problem of weighting the results. Pollsters will ask other questions to help them produce an answer that is more likely to be an accurate reflection of the vote at the ballot box. For example they use prior voting patterns to up-weight those interviewees they think are more likely to vote and to adjust for problem groups such as shy Conservatives. The problem is that UK polling companies are not used to dealing with referendums, so their sampling and weighting are skewed towards those who normally vote in elections. The reasons for not voting are very different in a party political election than they are in a referendum where every individual vote counts.
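
The sketch below illustrates the shape of that turnout weighting. The respondents, the likelihood scores, and the halving rule are all invented for illustration, as the models pollsters actually use are proprietary, but it shows why the adjustment misfires in a referendum: a first-time voter who is certain to vote still gets marked down for having stayed at home at the last general election.

```python
# Illustrative turnout weighting with hypothetical respondents and a
# hypothetical weighting rule; real pollsters' turnout models are proprietary.

respondents = [
    # (stated intention, self-rated likelihood to vote out of 10, voted in 2015?)
    ("leave",  10, False),
    ("remain",  9, True),
    ("leave",   8, True),
    ("remain", 10, True),
]

def turnout_weight(likelihood: int, voted_in_2015: bool) -> float:
    """Hypothetical rule: scale by stated likelihood, halve habitual non-voters."""
    weight = likelihood / 10
    return weight * 0.5 if not voted_in_2015 else weight

totals = {"remain": 0.0, "leave": 0.0}
for intention, likelihood, voted_in_2015 in respondents:
    totals[intention] += turnout_weight(likelihood, voted_in_2015)

print(totals)
# {'remain': 1.9, 'leave': 1.3} -- the first leave voter, certain to vote,
# counts for only half a respondent because they did not vote in 2015.
```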

There was evidence of large queues at polling stations on deprived housing estates where people normally do not vote. Many others reported voting for the first time in their lives or for the first time in decades. It was very hard for opinion pollsters to weight their survey populations because they had to rely on past data that primarily related to elections rather than referendums. Considering that these polling companies have a very poor track record in predicting elections, for which they have a lot of past data, it is hardly surprising that they called the EU referendum very wrong.

Interpreting Data

By the time an opinion poll company has decided how many would vote remain and how many leave, a lot of interpretation has already taken place. Weighting begins with decisions about how the population sample is selected and what adjustments need to be made. Those considerations are less complicated for a referendum that, like this one, was set up so that every vote counted in a single UK-wide tally. For a general election it is more complicated, because extrapolating results from a relatively small sample to the national picture is made more difficult by issues such as safe seats limiting the impact of those who vote both with and against the dominant party in those seats. In the referendum opinion polls the results may seem to require little interpretation, as it is simply a case of how many plan to vote remain and how many leave, but a lot of prior consideration goes into those numbers before they can be released.

Pollsters would have factored in the predictions of pre-campaign polls that the Celtic nations were likely to vote with remain majorities, and as their populations are so much smaller than that of leave-leaning England, the samples were always going to be dominated by English interviewees. In the YouGov/Times survey on 18 May 2016 there were no Northern Irish based interviewees, while the YouGov/Times survey on 22 June 2016 had only 94 such interviewees, or 2.5% of the sample, which roughly matches the Northern Irish share of the registered electorate at 2.7%. The exclusion of Northern Irish voters on 18 May might reflect YouGov's standard election practice, Northern Ireland having party political traditions distinct from the rest of the UK, or they may have been acting under instructions from The Times, who were paying for the survey. Scottish voters are 8.6% of the registered electorate and constituted 9.2% of the sample on 18 May and 9% on 22 June. Welsh voters are 4.9% of the registered electorate, but do not merit separate treatment in either survey's headline breakdown, which groups Wales with the English Midlands. However the eve-of-poll survey on 22 June goes into detail about how it weighted interviewees by region and political group, and there the Welsh sample is broken out from the Midlands, showing a relatively accurate 5.2% of Welsh interviewees. A major problem for YouGov, and possibly for other pollsters, is that they followed their normal political polling practice of down-weighting any interviewees who said that they did not vote at the previous general election. This was problematic given the anecdotal evidence that many of those who refuse to vote in elections voted on EU membership.
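
The regional arithmetic quoted above is easy enough to verify. A short sketch, using only the figures already cited (the 94 Northern Irish interviewees being the one raw regional count given):

```python
# Comparing each nation's share of the 22 June 2016 sample with its share
# of the registered electorate, using only the figures quoted above.

sample_total = 3766
electorate_share = {"Northern Ireland": 0.027, "Scotland": 0.086, "Wales": 0.049}
sample_share = {
    "Northern Ireland": 94 / sample_total,  # the one raw regional count quoted
    "Scotland": 0.090,
    "Wales": 0.052,
}

for nation in electorate_share:
    diff = sample_share[nation] - electorate_share[nation]
    print(f"{nation}: sample {sample_share[nation]:.1%} "
          f"vs electorate {electorate_share[nation]:.1%} ({diff:+.1%})")
```

On these figures the regional composition was broadly right, which underlines that the bigger problem lay in who within each region was sampled and how they were weighted.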

Missing Millions

A key missing demographic from online polling is the cannot-be-bothered-with-politics segment. That gap might help a poll's accuracy in an election campaign, as the apolitical may not bother to vote, but the referendum allowed people a say on an important issue rather than simply a choice between politicians or parties. Those with less interest in politics were probably not signed up to online polling groups, and YouGov, who conducted nearly 50% of the online referendum polls, selected their samples from their standard pool of 600,000 participants. It is unlikely that an election-focused general sample group would include many people who cannot normally be bothered to vote in an election, as they are unlikely to be inspired to answer an online poll about who they would like to vote for. The fact that YouGov pay their participants would increase the likelihood of participation, but they are still probably missing most of the apolitical segment of the UK population.

A more obvious problem for online polls is that they require their interviewees to be online. The Office for National Statistics in May 2015 put the number of recent internet users in the UK at 44.7 million, which is only slightly lower than the UK electorate in June 2016 at 46.5 million. However the difference between internet users and the electorate, at 1.8 million, is larger than the 1.3 million majority for leave in the referendum vote. Some 5.9 million people have never used the internet, and they are predominantly found among the oldest sections of society, who are more likely to vote and (if you believe opinion polls) were more likely to vote to leave the EU. The ONS statistics also cover those aged between 16 and 18, who were unable to vote in the referendum. YouGov cite their margin of error as 3%, yet even the most connected part of the UK (Bedfordshire, Buckinghamshire, and Oxfordshire) has 8.1% non-internet users and the least connected part (Northern Ireland) has 18.8% who do not use the internet.

Telephone surveys have a different problem in that they tend to speak to too many graduates. Only one third of the UK population possesses a degree, but telephone surveys average around 50% graduates in their samples. If it is true that graduates were more likely to vote remain, that would explain why for most of the referendum campaign telephone surveys were more likely than online polls to report a lead for remain.

The biggest problem for any survey is the missing millions of people who were not interviewed. YouGov interviewed 3,766 people for their Times eve-of-vote survey and then extrapolated the results to the 46,500,001-strong electorate. That is 0.008% of the registered electorate, and there are a lot of unknown variables in each individual's decision as to whether they vote and which box they tick on the ballot paper. The problem is most evident in terms of the Northern Irish voters who were interviewed for that survey. YouGov surveyed 14 males under 40 (but weighted them as if they had spoken to 19), 37 males over 40 (weighted down to 29), 12 females under 40 (weighted up to 20), and 38 females over 40 (weighted down to 32). These numbers were small so that YouGov could reflect the percentage of the electorate represented by Northern Ireland without requiring too large a UK-wide sample. In order to reflect a better age profile than their sampling produced they treated 12 women under 40 years of age as if they were 20 different individuals. That means that even before extrapolating numbers to the whole electorate they were already working off less diversity than their raw data contained. A survey of 3,766 is relatively large for a political opinion poll (the 18 May YouGov/Times survey, for example, had 1,648 participants), but it is still a small number from which to build in the requisite diversity before extrapolating to a 46.5 million electorate.
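
That loss of diversity can be quantified with the Kish effective sample size, a standard piece of survey statistics (whether YouGov use it internally I do not know). Applied to the Northern Irish cell counts quoted above, it shows the weighting shrinking 101 interviews to the informational equivalent of roughly 92:

```python
# Quantifying how weighting shrinks a sample, using the Northern Irish
# cell counts quoted above and the standard Kish effective sample size:
# n_eff = (sum of weights)^2 / (sum of squared weights).

raw      = {"men under 40": 14, "men over 40": 37, "women under 40": 12, "women over 40": 38}
weighted = {"men under 40": 19, "men over 40": 29, "women under 40": 20, "women over 40": 32}

# Per-respondent weight in each demographic cell.
weights = {cell: weighted[cell] / raw[cell] for cell in raw}
for cell, w in weights.items():
    print(f"{cell}: weight {w:.2f}")

sum_w  = sum(raw[c] * weights[c] for c in raw)        # total weighted count: 100
sum_w2 = sum(raw[c] * weights[c] ** 2 for c in raw)   # sum of squared weights
n_eff  = sum_w ** 2 / sum_w2
print(f"raw n = {sum(raw.values())}, effective n = {n_eff:.0f}")
# 101 raw interviews carry roughly the information of 92 unweighted ones.
```

The 12 women under 40, each counted as 1.67 people, account for most of that shrinkage.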

Other Opinions

Opinion polls on the referendum did not always restrict themselves to asking whether the participant intended to vote remain or leave, but also asked questions about what would influence their vote. Such answers could help the campaigns to target their message better, but only if those answers were more reliable than the answers to the primary question of remain or leave. We have no way of checking those other questions except against other opinion polls, but we can compare opinion poll results to the final vote. Most opinion polls called the result wrong in favour of remain, but within the 3% margin of error, a margin that the actual result also fell within. In that sense the opinion polls, especially the online ones, were correct: they essentially reported the result as too close to call, and the actual vote, had it been an opinion poll, was within that too-close-to-call band.

Those other questions about why someone would vote one way or another are more difficult for many voters to answer. Someone might know whether they intend to vote remain or leave, as it is an exclusive choice. It is harder to say which reason dominated that choice, as it might be a collection of two or more issues. For example someone might be concerned about both fishing quotas and immigration, or about the European democratic deficit and the over-regulation of British business. So besides all the other concerns about the level of trust that should be placed in opinion polls, we have the issue that participants are asked to choose one reason when their vote might be driven by several factors in combination. An example of this problem is YouGov's survey on the day of the poll, when they contacted 4,772 participants and asked them to cite just one of eight possible reasons driving their vote. They included two extra options of Something Else and None of the Above, but they did not allow for an answer of More Than One of the Above. Pollsters prefer single boxes to be ticked so that they can work out their statistics more easily, but that ease often comes at the expense of accuracy, as life is seldom as simple as pollsters would like it to be.

Nevertheless such polls can provide a riposte to received wisdom if there is a clear lack of numbers supporting the mainstream political or media narrative. Many remain activists, politicians, and pundits were quite clear that the leave campaign had been all about immigration. An investigation of the leave campaigns makes clear that this was not the case, and the YouGov day-of-the-vote survey suggests that immigration made no such dominant impact on leave voters either. Of those who voted leave only 26% cited immigration as the main factor, while 45% cited the UK having more control over its own actions. Such a poll is more likely to reflect reality than the numerous news reports that seek out quotations from leave voters who will state that it was all about immigration, while oddly also seeking out quotations from remain voters who will claim that they lost because of Vote Leave lies about the NHS. Those two claims do not tally up, and that is where an opinion poll has a clear advantage: it relies not on deliberately sourced quotations, but on a selection from a list of options.

It has become the standard response for a political leader struggling in the polls to declare that only one poll counts: the one at the ballot box. In recent British political history there are now so many examples of the opinion polls calling it wrong that such declarations have the weight of evidence behind them. Yet still opinion polls dominate the media, because they give the media something to talk and write about. In this book I will give very limited attention to opinion polls because of the problems set out in this chapter.

© Mercia McMahon. All rights reserved
