Everything in Moderation: Former content moderator sues Meta over working conditions

May 31, 2022

What's going on here?

Meta – the parent company of Facebook – and its outsourcing company Sama are being sued in Kenya for a lack of workplace support for content moderators.

What does this mean?

Social media companies currently regulate their content through content moderators. These moderators act as gatekeepers, filtering user-generated content against internal rules and regulations to determine what is appropriate for an online social platform. Some describe content moderators as “the emergency first responders of social media”. This underlines the vital role moderators play in the industry, particularly amid heated public debate over what kind of content should and shouldn’t be censored online. Facebook, the social media platform with the most users globally, employs more than 15,000 moderators.

Meta and Sama are being sued by Daniel Motaung, a former content moderator, who claims that the companies’ failure to provide an appropriate working environment has enabled forced labour, union busting, and human trafficking. He also accuses Facebook of misrepresenting the job to applicants by not disclosing that content moderators are required to view psychologically harmful content. Motaung is hoping to improve wages, implement mental-health support, and secure compensation for both former and current content moderators.

What's the big picture effect?

A job as a content moderator is an important one, but at what expense? Isabella Plunkett, after working as a Facebook content moderator for just over two years and trawling through vast swathes of content on the platform daily, spoke out about the drain on her mental health, describing it as “not a normal job where you can go to work and go home and forget about it – the stuff you’re seeing is really ingrained in your mind”. Motaung, for his part, says his first exposure to harmful content was a graphic video of a beheading.

But doesn’t content moderation inherently involve viewing difficult content? The claims go further than that. Motaung alleges that Sama, which supplies outsourced moderators, fired him after he tried to form a trade union. He says he was diagnosed with severe post-traumatic stress disorder and is certain many colleagues had the same experience. He also claims workers in Kenya are not given the same protections as moderators elsewhere in the world, such as those in Europe, who have access to professional “wellness coaches”. Even then, wellness coaches cannot avert the mental-health harms that can arise from regularly viewing potentially scarring content.

The lawsuit also alleges that it is especially difficult for those from disadvantaged backgrounds to leave these positions, which pay only about $2.20 per hour. Given the circumstances, Motaung claims that flying workers into Kenya to do these jobs amounts to human trafficking.

This is not the first time Facebook has faced such claims. In 2018, a group of third-party US moderators sued Facebook over an unsafe working environment, with around 11,250 of them deemed eligible for compensation. Facebook agreed to a $52m settlement without admitting liability, instead promising to introduce new tools to reduce the impact of viewing harmful content on third-party moderators.

This lawsuit exposes the dark truth behind the public debate around content moderation: there is a high human cost to social networks’ efforts to fight disinformation and make the internet safer, and it is borne by the moderators they employ. The work desensitises those workers to psychologically taxing content, all while content moderation itself remains invisible. Facebook has also outsourced this trauma to the developing world, including countries in Africa. Compensation aside, these factors should push Facebook to be more transparent about the treatment of its content moderators and will hopefully raise awareness of the dark side of social media.

Report written by Kerianne Pinney