Content Moderator Sues Facebook for PTSD

by John Lister

A former Facebook content moderator is suing the site's operators, claiming the work mentally harmed him. Daniel Motaung says the low-paid work left him with post-traumatic stress disorder.

Motaung is suing Facebook's owner Meta along with Sama, the contracting company that hired him for the work. He says he was misled by a job ad that implied content moderation was a small part of a wider customer service role.

He was recruited in South Africa and relocated to work in Nairobi, where he was paid the equivalent of $2.20 an hour. He says this relocation made it more difficult for him and his fellow workers to leave the job if they didn't want to continue.

Sama denies these claims, saying "It is completely inaccurate to suggest that Sama employees were hired under false pretences or were provided inaccurate information regarding content moderation work." (Source: bbc.co.uk)

Horrific Content

Motaung says his work involved reviewing videos posted to the site to see if they broke Facebook's content rules. He says he regularly saw extremely disturbing content including a beheading and material involving children.

TIME investigated claims made by other workers at the facility. It found content moderators have a target of averaging just 50 seconds to review each video, regardless of its length. Senior staff then spot-check reviews and moderators must have made the "correct" decision at least 84 percent of the time. (Source: time.com)

15-Second Deadline

Internal guidelines suggested many reviews didn't even get this brief period. The guidelines said that if there were no signs of unsuitable material in data such as the title, thumbnail and comments, and if the video hadn't already been reported or flagged, then the moderator should only watch the first 15 seconds of the video.
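
The reported guideline amounts to a simple triage rule. As a rough sketch only, using hypothetical names and fields (Video, seconds_to_watch, suspect_terms) rather than Facebook's or Sama's actual tooling, the logic looks something like the following; note the "otherwise" branch here assumes a full-length watch, which the reporting doesn't actually specify.

```python
from dataclasses import dataclass

QUICK_SCAN_SECONDS = 15  # reported shortcut for clean-looking videos

@dataclass
class Video:
    title: str
    thumbnail_flagged: bool   # thumbnail showed signs of unsuitable material
    comments: list[str]
    reported: bool            # reported by a user
    flagged: bool             # flagged by an automated system
    length_seconds: int

def seconds_to_watch(video: Video, suspect_terms: set[str]) -> int:
    """If the title, thumbnail and comments show no signs of unsuitable
    material, and the video was never reported or flagged, watch only the
    first 15 seconds; otherwise review the whole video (an assumption)."""
    text = " ".join([video.title, *video.comments]).lower()
    metadata_suspect = video.thumbnail_flagged or any(
        term in text for term in suspect_terms
    )
    if not metadata_suspect and not video.reported and not video.flagged:
        return min(QUICK_SCAN_SECONDS, video.length_seconds)
    return video.length_seconds

# A clean-looking ten-minute video would get only a 15-second scan:
clip = Video("holiday vlog", False, ["nice trip!"], False, False, 600)
print(seconds_to_watch(clip, {"gore", "violence"}))  # -> 15
```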

It's another sign of the logistical difficulties in making sure online material doesn't breach site rules. The sheer volume of content posted brings serious challenges to effective human review, but artificial intelligence doesn't yet seem reliable enough to filter content.

What's Your Opinion?

How should Facebook moderate content to avoid harmful material getting online? Will automated moderation ever be reliable enough? Is there any way to have human moderators without risking their mental health?


Comments

buzzallnight:

If he isn't an outright fraudster, he certainly is not intelligent enough to be a content moderator.

He relocated from South Africa to Nairobi for the equivalent of $2.20 an hour? That is 2,721 miles and 55 hours of driving time!

Contracting companies have only one purpose: to remove any liability from the employer for anything. And they are very good at it.

"Sama denies these claims, saying "It is completely inaccurate to suggest that Sama employees were hired under false pretences or were provided inaccurate information regarding content moderation work.""

Game over. Daniel Motaung will lose.

Facebook clearly searched the whole world and found the best content moderators in South Africa and Nairobi. /sarc off

All Facebook has to do is look like they are doing something; they don't have to be successful at it.

Will automated moderation ever be reliable enough?

The question is: could automated moderation ever keep up with the total insanity of humans?
No, probably not.

Is there any way to have human moderators without risking their mental health?
No, but millions of people view this content every day :)

beach.boui:

I agree with much of buzzallnight's post. But being young and inexperienced, or even stupid, is not a reason to suggest someone is a fraud. That's not a fair comment on Motaung. I suspect job opportunities in that part of the world are few and far between, so again, it is unfair to criticize Motaung for taking the moderation job. I give him credit for trying. I can't imagine some of the horrific content he must have seen that he never, ever expected to have to filter.

I hope Motaung is successful in his lawsuit. I hope he gets million$ in settlement. Then Facebook will realize it's cheaper to hire enough people to do the work in the first place than to overwork and underpay an inadequate number of workers doing such important work. Fuck Facebook. They are more the problem these days than they are the solution.