
The Facebook of graphic deaths and child porn: Filipinos earning US$1 an hour filtering out the filth so you don’t have to see it

Movie to be shown at Hong Kong’s Human Rights Documentary Film Festival lifts lid on psychological toll taken on content moderators, some of whom see 25,000 disturbing images a day

PUBLISHED : Tuesday, 18 September, 2018, 9:01am
UPDATED : Wednesday, 19 September, 2018, 6:47am

For 10 hours or more every day, they look at thousands of photos and videos of child sexual abuse, terrorist attacks and acts of self-harm on their computers.

Sometimes it is a horrendous video of a terrorist beheading a captive. Sometimes it is an elderly man sexually abusing a child.

On their computer screen are two buttons they can click on: delete or ignore. They decide what netizens across the world see, or do not see, on social media platforms.

They are content moderators, people hired by outsourcing companies to keep social media platforms such as Facebook clean by deleting content considered unsuitable for public viewing.

In The Cleaners, a documentary by German directors Hans Block and Moritz Riesewieck to be screened in Hong Kong in October, the secretive world of content moderation in the Philippines is exposed. The directors examine the ethical questions the job raises and the toll the work takes on moderators’ mental health.

“For most of them, it’s a job they are proud of. You have to consider many people have jobs that are much less prestigious than working in such nice looking buildings in one of the best parts of Manila,” Riesewieck said in an interview.


“The salary, compared to other jobs, is not that bad, at around US$1 to US$3 an hour. With this money they can care for the whole family of five, six, or seven persons. They are the breadwinners and they are happy to get this job.”

The Cleaners will be the closing film of the 8th Human Rights Documentary Film Festival, an annual event organised by Amnesty International Hong Kong. Seven films will be screened from September 26 to October 4.

Five years ago, the directors were shocked to come across a video of a young girl being sexually assaulted by an elderly man on Facebook. It was shared 16,000 times before it was removed. They started asking themselves – how are these videos deleted? By computer algorithm or people?


They started looking at job advertisements in the Philippines, and found that none of the adverts for social media moderation stated clearly what the jobs were about.

“The job descriptions do not say content moderators, but community operations analysts, or data analysts with international clients,” Riesewieck said.

Block said that, for many content moderators, the first moment they found out what the job was about was when the training began after they had signed a contract.

“So they were sitting in front of the screen watching, for example, pornography or a child abuse video. They don’t have a chance to exit because they have already signed a contract.”

The directors met 15 to 20 content moderators in Manila.

In the 1½-hour documentary, the content moderators said they were proud of their jobs because they were keeping the social media platforms “clean”.

None of them revealed their names and which social media platforms they were hired for. All of them have signed non-disclosure agreements.

“I am passionate about my job. I like what I am doing,” one Filipino man said in the documentary. “As a moderator, you’re like security. You protect the user.”

Another man said he had a daily target of 25,000 pictures to go through, and joked that he should be in the Guinness World Records.


The moderators are under tremendous mental stress because of the nature of the images they see constantly at work. There is also evidence to suggest the work causes lasting psychological harm and leads moderators to see violence as normal.

The documentary recounts how one moderator visited the house of a colleague and found him dead. The dead man, who specialised in reviewing live videos of self-harm, had hanged himself in his flat.

Asked about the issues raised in the documentary, a Facebook spokesman referred the Post to a statement its vice-president of operations Ellen Silver issued in July.

Silver said Facebook planned to double the size of its safety and security team to 20,000 – a mix of full-time employees, contractors and companies Facebook partnered with – this year.


“This job is not for everyone – so to set people up for success, it’s important that we hire people who will be able to handle inevitable challenges that the role represents. Just as we look for language proficiency and cultural competency, we also screen for resiliency,” she said.

Facebook has a team of four clinical psychologists across three regions, Silver said, and they are tasked with designing, delivering and evaluating resiliency programmes for everyone who works with graphic and objectionable content.

Twitter did not respond to a request for comment.