Pollsters in Hong Kong must deal with credibility gap
Public opinion surveys are widely cited in the city, but many doubt they give a fair picture, given outdated methods and political polarisation
Every evening at 6.30pm they file in, filling the industrial-green cubicles and air with chatter as they pick up the phones and dial.
Over the next four hours, each of the 80 or so people packed into two small rooms will go through about 300 phone numbers. Some won't work, others don't pick up, and yet others will hang up. They have just seconds to convince people of Hong Kong to give up 10 to 15 minutes of their time.
These are the pollsters for the University of Hong Kong who, for HK$30-$50 an hour, will be yelled at, hung up on and given all the other treatment that unsolicited callers get, all to get a sense of what you are thinking.
"We ask about their television habits, whether they're happy with the government, rate Legco members, about H7N9," says Angela Lam Oi-lan, a housewife who works at HKU's public opinion programme three to four nights a week. "I consider this an investment in seeing how confident people are in Hong Kong's future."
Lam doesn't watch the news much, so this is a way not just to make pocket money but also to get a sense of what's going on in Hong Kong. This is also why governments, interest groups, political parties and institutions commission such polls.
But while polls the world over are considered fairly accurate gauges of public sentiment, the increased polarisation of Hong Kong politics has more people asking: are they really a reflection of what's going on around us?
It's a question that is growing in importance as Hong Kong moves towards universal suffrage in the 2017 election for chief executive. Various parties and individuals tout their solutions for the shape of the city's political structure … often citing polls in support of their ideas.
All polls are not created equal.
They can vary from 10,000 to 20,000-person mega-studies conducted face to face, to 100-person phone interviews or internet surveys. The former are expensive and time-consuming affairs. A 16,000-person study by HKU's medical school on Hongkongers' health and happiness needed HK$100 million in funding from the Jockey Club and two years to complete.
Phone surveys are the most common type of poll, with 90 per cent of polls HKU conducts done that way. It uses different variations of that traditional method which, in a perfect world with textbook methodology, is thought by statisticians and politicians to give a good snapshot of what people are thinking.
But, says Fu King-wa, assistant professor with HKU's Journalism and Media Studies Centre: "If it's garbage in, it's garbage out."
Fu has written several papers on the accuracy of polling, and has also put forth another method using social media as an alternative to phone surveys.
He has several problems with phone polls. Firstly, he doesn't think the random sampling of domestic landlines gives an unbiased and truly random sampling of the population. The technique also means there's a part of the population that's left out.
"I don't know many people who will sit down and spend 15 minutes on the phone answering questions," he says. "People just hang up or say that they are busy."
He says this lends itself to a sampling and response bias, meaning the people who are likely to respond probably have certain characteristics - perhaps naturally outspoken, older and with more time on their hands, and of a certain socio-economic status.
Also, most young people do not use landlines, meaning their opinions will not be included in the sampling.
"This is not just a problem unique to Hong Kong, it's a worldwide issue," Fu says.
Others are more positive.
"We see mobile phones as a bigger problem but we still think household numbers are representative," says Frank Lee Wai-kin, research manager at HKU's public opinion programme.
The programme, he says, tries to counter possible biases using a process called weighting, which takes into account population demographics - age and sex being the most common factors. If a survey finds more seniors are being polled, it will weight those results less to reflect the true population percentages, and interview more people until they have a large enough sample of younger interviewees.
"This is why we always do opinion polls in the evening. We want to avoid ruling out the working class," Lee says.
The programme looks at demographic factors - sex, age, occupation, social class, housing type and income level - and maps them against the census data. To avoid the bias of certain family members always picking up the phone, it follows the "birthday rule", interviewing the family member whose birthday comes next.
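The weighting Lee describes is a form of post-stratification: each respondent counts for more or less depending on how over- or under-represented their demographic group is relative to the census. A minimal sketch, with entirely hypothetical age groups, sample shares and support figures:

```python
# Post-stratification weighting sketch. All numbers are hypothetical,
# chosen only to illustrate the mechanics.
sample_share = {"18-39": 0.20, "40-64": 0.45, "65+": 0.35}  # who actually answered
census_share = {"18-39": 0.35, "40-64": 0.45, "65+": 0.20}  # true population mix

# Each group's weight = population share / sample share.
# Under-represented groups (here, the young) get weights above 1.
weights = {g: census_share[g] / sample_share[g] for g in sample_share}

# Hypothetical support for some policy, by age group:
support = {"18-39": 0.60, "40-64": 0.50, "65+": 0.30}

raw = sum(sample_share[g] * support[g] for g in support)       # unweighted estimate
weighted = sum(census_share[g] * support[g] for g in support)  # weighted estimate

print(f"raw: {raw:.3f}, weighted: {weighted:.3f}")
```

Because the raw sample skews old and older respondents are less supportive in this toy example, the unweighted figure (0.450) understates the weighted one (0.495). This is also where Lee's caveat bites: if the few young respondents you do reach are atypical, weighting amplifies their error along with their voice.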
But even with weighting, it is impossible to cover all the biases that can crop up.
The biases can be cognitive, not just demographic, Lee says. "Say you're polling about something that's not really related to age or gender, like views on homosexuality. People under 18 are underrepresented because they don't usually pick up the phone. If you weight that small sample you have, you might amplify their views, and amplify an error." The small sample's views might not be representative of most people their age.
Another place where the results can fall short is if the pollster is doing the analysis wrong.
For example, a survey conducted by the Plaza Hollywood shopping mall in February last year looked at whether children would like to learn how to cook and whether parents would let them. It surveyed about 300 children and their parents at the mall. Then it sent out a press release saying 84 per cent of children wanted to learn how to cook, and 71 per cent of parents wouldn't allow it.
What's the problem? It's the claim that this is a representative sample of "Hong Kong children". It's useful for mall bosses to know this about their customers' habits, but it's misleading to suggest those customers represent broader Hong Kong. Would people shopping at the more upscale IFC or Elements malls be representative of Hong Kong? Not so much. It would have been accurate to say 84 per cent of Plaza Hollywood customers' children want to learn to cook, but that wouldn't have generated the media interest the mall wanted.
"There's a lot of poorly designed polls, but the media is not critical enough to expose them," Fu says.
The mall defended its survey, while acknowledging the possibility of misunderstanding.
One tricky question is the issue of sample size.
There is a myth that a larger sample size automatically gives a more accurate result. Size helps only if everything else is done right, according to HKU statistician Professor Paul Yip Siu-fai: a small, properly randomised sample will yield better results than a large study with biased questioning and no random sampling.
Typically, a sample size of 1,000 is considered enough to yield a poll with a margin of error of plus or minus 3 per cent in a population running into the hundreds of thousands.
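The "plus or minus 3 per cent" figure comes from the standard margin-of-error formula for a simple random sample. A quick sketch at the conventional 95 per cent confidence level, using the worst-case assumption of a 50-50 split:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Margin of error for a proportion p from a simple random sample
    of size n, at 95% confidence (z = 1.96). p = 0.5 is the worst case."""
    return z * math.sqrt(p * (1 - p) / n)

moe_1000 = margin_of_error(1000)
print(f"{moe_1000:.1%}")  # roughly 3%, hence the familiar figure
```

Note the formula barely depends on the population size, which is why 1,000 respondents suffice whether the population is in the hundreds of thousands or the millions - provided, as Yip stresses, the sampling is genuinely random.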
"The American Association for Public Opinion Research has a guideline for journalists," says Fu, bringing up a list of questions every journalist should ask themselves when judging the validity of polls.
Who paid for the poll and why was it done? Who did the poll? How was the poll conducted? How were people chosen for the study? How were the questions worded? And so on.
"If they can't give you the data, be suspicious," Fu advises.
Lau Siu-kai, professor emeritus at the Chinese University and former head of the government's Central Policy Unit, says: "Hong Kong lacks authentic polling, given the political atmosphere. I wouldn't trust 90 per cent of polls, with differing degrees of suspicion." None of the polling organisations, he says, is truly independent.
As the head of the Central Policy Unit from 2002 to 2012, Lau oversaw all government polls. The curious thing about these polls is that the results are never released. The reason, says Lau, is so no political bias sneaks in.
"The government does not want to fool itself. Because the polls aren't published, it can be honest with itself. We can ask questions that may be embarrassing to the government to calibrate policy."
Lau says public opinion polls are a major factor in government policy.
He questions the independence of university pollsters, saying many were too involved in politics to be neutral. "HKU and [public opinion programme boss] Dr Robert Chung Ting-yiu are too close to the democrats. The Chinese University pollsters are less suspect because their political affiliations aren't so clear," he says.
Chung did not reply to requests for comment. But HKU stands by its methodology.
"We safeguard our independence and autonomy. [We] tell the public we bear the final responsibility, not the client," Lee says.
Lau also questions the methodology of surveys, saying: "They rely very much on students who come and go, some of whom are not that honest. Are the questions being asked in a correct manner? Or are they even being asked at all?"
Because of these issues, even those who commission polls take them with a pinch of salt.
"Polls are accurate in terms of impressions, but it cannot be scientific ... it might be a minority with a loud voice. It would be dangerous to just rely on polls," says Michael Tien Puk-sun, deputy chairman of the New People's Party.
"These polls are not the final say on any matter," says the Civic Party's Stephen Chan Ching-kiu.
But they all concede there is no better way of doing it at the moment.
"We need to accumulate more experience," Lau says. "We need more organisations that do polling, and more people who understand the need to be neutral."
Adds Fu: "It's a problem in the long run because it misdirects us. We may underestimate one thing, and overestimate another, and social resources and social attention goes where it shouldn't."
Polling organisations in the city include:
- The University of Hong Kong
- Baptist University
- Polytechnic University
- City University
- Chinese University
- Lingnan University
- HK Policy Research Institute
- One Country Two Systems Research Institute
"There's no one answer to how we poll. We have different types of polls and different methodologies for each one. Anything that is in a textbook, we do it." - Law Chi-kwong, Democratic Party

"Sometimes we use pollsters, sometimes we do it ourselves through SMS and Facebook. You'd be surprised how widely our Facebook is used." - Michael Tien Puk-sun, New People's Party

"There are three channels we go through. One uses an interactive voice response system for a snapshot survey ... in another, we commission universities to do more scientific polls using real people ... and we refer to polls conducted by other institutions like government officials and legislators." - Stephen Chan Ching-kiu, Civic Party