It’s time to treat fake news like terrorist activity
Media companies need to take responsibility for the content in their pipelines
Schooled in the journalism profession in the 1970s, at the height of the anti-Vietnam War protests, with Bernstein and Woodward of Watergate fame as my mentors and heroes, I feel embarrassed and ashamed that it has come to this: fake news being sprayed unchecked by “bots” conceived by hate groups and influencing the outcome of the US presidential election – all in the name of freedom of speech and democracy.
If I were dead, I would turn in my grave. But since I am alive, all I can do is rack my brain on how social media could have reduced “journalism” to this, and what might conceivably be done to restore the indispensable role that the media has historically played in protecting trust in our democracies.
To be frank, this is an information crisis we should have seen coming. When I resigned from the Financial Times in 1994, it was not just to begin a second career. It was also out of disillusionment that people were reading “the Pink Un” and other serious-minded publications not to be informed, but to cull anecdotes that could support their prejudices.
This habit underpinned what I sarcastically called the “Ah, but…” philosophy, which I saw over and over again at yum cha lunches on Sundays, in pub discussions over a pint of beer, and in the armchair pontifications of my recently deceased father.
You may recall it too: a group animatedly discusses an issue, converging on a clear majority view of the “truth” – for example, that the world was not made in seven days. The sceptic who disagrees with the majority needs only a single anecdote in support of his or her prejudiced view to interject: “Ah, but…”. That one anecdote succeeds in insulating the prejudice, whatever the wider truth.
Facebook, YouTube, Twitter and Google have not only empowered the anecdote: they have succeeded in connecting prejudiced, hateful, eccentric and extreme fringe views in a wild west of outrage machines. In medieval England, the village idiot was always recognised by the rest of the village as the village idiot, and discreetly tolerated as such. What today’s social media have done is give every village idiot an incubating chat room within which their craziness becomes both normal and right.
We are not talking here about the legitimate differences of opinion that can exist in any diverse community, and which need to be brokered by civilised means in any properly functioning democracy. We are talking about the deliberate failure of our powerful social media to protect us against willful – perhaps criminal – abuse of the power they have brought to their millions of “outrage machines”.
A couple of years ago, I called a journalist at a Chinese-language newspaper that shall remain unnamed to point out a flagrant and damaging inaccuracy, and to seek a correction. To my shock, he acknowledged the inaccuracy, but saw no reason to correct it: “I have a right to my opinion,” he retorted.
It is this outrageous “right to my opinion” that sits at the pious, hypocritical heart of Twitter and Facebook. They claim to be granting freedom of speech and defending the democratic rights of previously voiceless citizens. The disingenuous Mark Zuckerberg commented: “We believe in giving people a voice, which means erring on the side of letting people share what they want, whenever possible.”
As a professional journalist, I too had a right to my opinion. But before committing to print, I was under a legally enforced obligation to check that what I wrote was honest and balanced. On any given “story”, I had to talk to people of opposing views. I had to seek out and nurture the “honest brokers” who could help me manage the flow of lies and prejudice so often directed at journalists.
If serious allegations were being made, we automatically turned to the FT’s libel lawyers, who were on permanent call, and we never published without prior legal clearance. Yes, this news-gathering and verification process was time-consuming and expensive, but it was indispensable in enforcing the high ethical standards of the profession, and in justifying the trust readers put in what appeared in the paper. Legal liability sat not just with me as the journalist, but with any source who made false allegations, with my editor, and with the publisher. If I made a mistake, I was under a legal obligation to correct it as speedily as possible. Journalists who made mistakes did not have long careers in journalism.
By disingenuously calling themselves technology groups rather than media groups, the bosses of Facebook, Google, Twitter et al are willfully and unacceptably trying to sidestep the ethical and legal responsibility they have to their readers and audiences, and in the process exposing our democracies to grave danger. They allowed entirely fictional entities like the “Denver Guardian” and the “Event Chronicle” to “report” that George Soros was dead, and that Hillary Clinton had suffered brain damage, had alcohol and drug addiction problems, and had links to money laundering and even sex crimes against children.
If any of these entirely false “reports” had appeared in a legitimate publication like the Post, the author, editor and publisher would all have been liable to prosecution and massive libel fines. As Martin Sorrell, head of WPP, the world’s largest advertising agency, put it: “They are media companies. They are responsible for the content in their digital pipes.”
They claim the challenge is too huge. No one can deny the awesome scale of the task they face. But such is the harm being done by empowering so much falsehood, tribalism, and politically, socially and economically harmful activity that they have no choice. Let them be subject to libel laws like the rest of us. When forced to act against the terrorist-inciting accounts on Twitter, its management shut down 360,000 accounts in August. When forced, it can be done. But it cannot be done by techie-crafted algorithms alone. It will need human beings. We call them editors. That might be a start.
David Dodwell researches and writes about global, regional and Hong Kong challenges from a Hong Kong point of view