Facebook pays $52 million to traumatized content moderators

In 2018, former Facebook content moderator Selena Scola sued Facebook for damages after developing post-traumatic stress disorder on the job. In a settlement with numerous content moderators, Facebook has now agreed to pay a total of US$52 million in compensation for trauma suffered. In addition, the conditions under which disturbing content is moderated are to be improved.

A few thousand dollars for permanent damage

Many Facebook content moderators have to view a stream of disturbing images every day as they check content against the platform's guidelines and remove violations. An anonymous moderator was quoted in the Guardian as early as 2017:

There was literally nothing enjoyable about the job. You'd go into work at 9am every morning, turn on your computer and watch someone have their head cut off. Every day, every minute, that's what you see. Heads being cut off.

As a result, many of these employees have developed mental health problems, including post-traumatic stress disorder. These illnesses led to several lawsuits against Facebook in which content moderators demanded compensation for pain and suffering. A settlement has now been reached before the San Mateo Superior Court, as The Verge reports. It stipulates that a total of 11,250 content moderators will receive payments of at least US$1,000 and at most US$6,000. Steve Williams, an attorney for the plaintiffs, said in a statement that he was pleased with Facebook's cooperation:

We are so pleased that Facebook worked with us to create an unprecedented program to help people performing work that was unimaginable even a few years ago. The harm that can be suffered from this work is real and severe.

According to The Verge, moderators who can prove further injuries or illnesses arising from their content moderation work can claim up to US$50,000 in damages. Even that sum, however, can hardly outweigh post-traumatic stress disorder or other illnesses.

Changes to the conditions of content moderation are also planned

In addition to the payments of over US$50 million (Facebook's most recent quarterly profit was US$4.9 billion), Facebook will also revise the conditions of content moderation. For example, audio in videos will be muted by default, and the videos will be displayed in black and white. These changes are to be available to all moderators by 2021, and to at least 80 percent of them by the end of this year.

In addition, content moderators, who deal with disturbing images daily, will receive weekly one-on-one coaching sessions with mental health professionals. In acute cases, an appointment with a licensed counselor is to be made available within 24 hours. Monthly group therapy sessions are also part of the program, intended to at least address the psychological strain, if not reduce it.

Partner companies that hire content moderators for Facebook must now also meet additional requirements. These include screening applicants more closely for psychological resilience, posting information at every workstation about the support available for mental health problems, and instructing content moderators on how to report violations of workplace standards directly.

Only some of the content moderators receive money

The settlement, which has only been granted preliminary approval, applies only to content moderators in California, Arizona, Texas, and Florida who have worked for Facebook (or one of its third-party contractors) since 2015. Changes can still be proposed before the court finalizes the agreement. Facebook said in a statement:

We are grateful to the people who do this important work to make Facebook a safe environment for everyone. We're committed to providing them additional support through this settlement and in the future.

According to the BBC, Facebook employs around 15,000 content moderators through third-party companies in the US alone. These moderators, and thousands more worldwide, work to give Facebook's 2.6 billion monthly active users the safest possible environment. The gruesome images that most users are spared are part of the moderators' everyday work, which is why preventive and proactive measures matter so much to their working conditions. For Facebook, the financial compensation, a relatively cheap treatment of the problem's symptoms, may remain a footnote; for the content moderators, however, the settlement is possibly at least one step toward a working environment that places more emphasis on mental health.