Source:
https://scmp.com/tech/apps-social/article/3020146/philippines-content-moderators-youtube-facebook-and-twitter-see

In the Philippines, content moderators at YouTube, Facebook and Twitter see some of the web’s darkest content

  • In the last couple of years, social media companies have created tens of thousands of jobs around the world to vet and delete violent or offensive content

A year after quitting his job reviewing some of the most gruesome content the internet has to offer, Lester prays every week that the images he saw can be erased from his mind.

First as a contractor for YouTube and then for Twitter, he worked on a high floor of a mall in the Philippine capital, Manila, where he spent up to nine hours each day weighing questions about the details in those images.

He made decisions about whether a child’s genitals were being touched accidentally or on purpose, or whether a knife slashing someone’s neck depicted a real-life killing – and if such content should be allowed online.

He’s still haunted by what he saw. Today, entering a tall building triggers flashbacks to the suicides he reviewed, causing him to entertain the possibility of jumping. At night, he Googles footage of bestiality and incest – material he had never been exposed to before, but is now ashamed to find himself drawn to.

For the last year, he has visited a mall chapel every week, where he works with a church brother to ask God to “white out” those images from his memory.

“I know it’s not normal, but now everything is normalised,” said the 33-year-old, using only his first name because of a confidentiality agreement he signed when he took the job.

Workers such as Lester are on the front lines of the never-ending battle to keep the internet safe. But thousands of miles separate the Philippines and Silicon Valley, rendering these workers vulnerable to exploitation by some of the world’s tech giants.

In the last couple of years, social media companies have created tens of thousands of jobs around the world to vet and delete violent or offensive content, attempting to shore up their reputations after failing to adequately police content including live-streamed terrorist attacks and Russian disinformation spread during the US presidential election.

Yet the firms keep these workers at arm’s length, employing them as contractors through giant outsourcing agencies.

Workers here say the companies do not provide adequate support to address the psychological consequences of the work. They said they cannot confide in friends because of the confidentiality agreements they signed, that it is difficult to opt out of seeing certain kinds of content, and that daily accuracy targets create pressure not to take breaks.

The tech industry has acknowledged the importance of allowing content moderators these freedoms – in 2015 signing on to a voluntary agreement to provide such options for workers who view child exploitation content, which most workers said they were exposed to.

The technology hardware industry has long operated by amassing armies of outsourced factory workers, who make the world’s smartphones and laptops under stressful working conditions, with sometimes fatal consequences.

The software industry also increasingly leans on cheap and expendable labour, the unseen human toil that helps ensure that artificial intelligence voice assistants respond accurately, that self-driving systems can spot pedestrians and other objects, and that violent sex acts don’t appear in social media feeds.

The vulnerability of content moderators is most acute in the Philippines, one of the biggest and fastest-growing hubs of such work and an outgrowth of the country’s decades-old call centre industry. Unlike moderators in other major hubs, such as those in India or the United States, who mostly screen content shared by people in those countries, workers in offices around Manila evaluate images, videos and posts from all over the world.

The work places enormous burdens on them to understand foreign cultures and to moderate content in up to 10 languages that they don’t speak, while making several hundred decisions a day about what can remain online.

In interviews with The Washington Post, 14 current and former moderators in Manila described a workplace where nightmares, paranoia and obsessive ruminations were common consequences of the job. Several described seeing colleagues suffer mental breakdowns at their desks. One of them said he attempted suicide as a result of the trauma.

Several moderators call themselves silent heroes of the internet, protecting Americans from the ills of their own society, and say they’ve become so consumed by the responsibility of keeping the web safe that they look for harmful content in their free time to report it.

“At the end of a shift, my mind is so exhausted that I cannot even think,” said a Twitter moderator in Manila. He said he occasionally dreamed about being the victim of a suicide bombing or a car accident, his brain recycling images that he reviewed during his shift. “To do this job, you have to be a strong person and know yourself very well.”

The moderators worked for Facebook, Facebook-owned Instagram, Google-owned YouTube, Twitter and the Twitter-owned video-streaming platform Periscope, as well as other such apps, all through intermediaries such as Accenture and Cognizant.

Each spoke on the condition of anonymity or agreed to the use of their first name only because of the confidentiality agreements they were required by their employers and the tech companies to sign.

In interviews, tech company officials said they had created many of the new jobs in a hurry, and acknowledged that they were still grappling with how to offer appropriate psychological care and improve workplace conditions for moderators, while managing society’s outsize expectations that they quickly remove undesirable content.

The companies pointed to a series of changes they’ve made over the past year or so to address the harms. A Facebook counsellor acknowledged that a form of PTSD known as vicarious trauma could be a consequence of the work, and company training addresses the potential for suicidal thinking.

They acknowledged there were still disconnects between policies created in Silicon Valley and the implementation of those policies at moderation sites run by third parties around the world.

The Post also spoke with one moderator in Dublin, and dozens in the United States, who recounted experiences similar to those reported in Manila. But unlike Facebook and Google contractors in the US, whose advocacy in recent months has helped lead the companies to promise increased wages and benefits, experts said the Filipino workers are far less likely to advocate for better working conditions.

And while American workers view the job as a stepping stone to a potential career in the tech industry, Filipino workers view the dead-end job as one of the best they can get. They are also far more fearful of the consequences of breaking their confidentiality agreements.

The jobs have strong attractions for many recent college graduates here. They offer relatively good pay and provide a rare ticket to the middle class. And many workers say they walk away without any negative effects.

Though tens of thousands of people around the world spend their days reviewing horrific content, there has been little formal study of the impact of content moderators’ routine exposure to such imagery.

While many Filipino workers say they became content moderators because they thought it would be easier than customer service, the work is very different from other jobs because of “the stress of hours and hours of exposure to people getting hurt, to animals getting hurt, and all sorts of hurtful violent imagery,” said Sylvia Estrada-Claudio, dean of the College of Social Work and Community Development at the University of the Philippines, who has counselled workers in the call centre industry.

She said she worries about a generation of young people exposed to such disturbing material. “For people with underlying issues, it can set off a psychological crisis.”