Ex-content moderator sues Facebook, claims disturbing images gave her PTSD

Facebook moderators under contract are ‘bombarded’ with ‘thousands of videos, images and live-streamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder,’ the lawsuit said

PUBLISHED : Tuesday, 25 September, 2018, 12:54pm
UPDATED : Tuesday, 25 September, 2018, 9:46pm

A former Facebook content moderator is suing the company on the grounds that reviewing disturbing material on a daily basis caused her psychological and physical harm.

A lawsuit filed by former moderator Selena Scola, who worked at Facebook from June 2017 until March 2018, alleges that she witnessed thousands of acts of extreme and graphic violence “from her cubicle in Facebook’s Silicon Valley offices,” where she was charged with enforcing Facebook’s extensive rules prohibiting certain types of content on its systems.

Facebook moderators under contract are “bombarded” with “thousands of videos, images and live-streamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder,” the lawsuit said.

Scola, who worked at Facebook through a third party contracting company, developed post-traumatic stress disorder “as a result of constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace,” the suit says.

The lawsuit was filed Monday in a California superior court.

Facebook relies on thousands of moderators to determine if posts violate its rules against violence, hate speech, child exploitation, nudity and disinformation.

Many objectionable categories come with their own sublists of exceptions. Facebook is staffing up its global workforce – hiring 20,000 content moderators and other safety specialists in places such as Dublin, Ireland; Austin, Texas; and the Philippines – in response to allegations that the company has not done enough to combat abuse of its services, including Russian meddling, illegal drug content and fake news.

The social network says that in recent years it has been developing artificial intelligence to spot problematic posts, but the technology isn’t sophisticated enough to replace the need for significant amounts of human labour.

The complaint also charges the Boca Raton, Florida-based contracting company, Pro Unlimited, Inc., with violating California workplace safety standards.

The lawsuit does not go into further detail about Scola’s particular experience because she signed a non-disclosure agreement that limits what employees can say about their time on the job.

In 2017, two former content moderators also sued Microsoft, claiming that they developed PTSD and that the company did not provide adequate psychological support.

Facebook in the past has said all of its content reviewers have access to mental health resources, including trained professionals on-site for both individual and group counselling, and they receive full health care benefits.

“We take the support of our content moderators incredibly seriously, … ensuring that every person reviewing Facebook content is offered psychological support and wellness resources,” said Bertie Thomson, director of corporate communications.

Additional reporting by Reuters