Facebook content moderators demand ‘safer working conditions’ in an open letter to Zuckerberg


More than 200 Facebook employees said that the social media giant is making content moderators return to the office during the pandemic because the company’s attempt to rely more heavily on automated systems has “failed.”

This comes after some Facebook content moderators — who deal with things like sexual abuse and graphic violence — were required to return to their office during the pandemic. Shortly after they returned, a few Facebook content moderators reportedly tested positive for COVID-19.

The workers demanded safer working conditions in an open letter to Facebook CEO Mark Zuckerberg and Chief Operating Officer Sheryl Sandberg, as well as to the heads of Accenture and CPL, the two companies to which Facebook subcontracts content moderation.

Here is the full text of the open letter, which more than 200 Facebook content moderators from around the world have addressed to Facebook’s leaders.

Dear Mr. Zuckerberg, Ms. Sandberg, Ms. Heraty, and Ms. Sweet,

We, the undersigned Facebook content moderators and Facebook employees, write to express our dismay at your decision to risk our lives—and the lives of our colleagues and loved ones—to maintain Facebook’s profits during the pandemic.

After months of allowing content moderators to work from home, faced with intense pressure to keep Facebook free of hate and disinformation, you have forced us back to the office. Moderators who secure a doctor’s note about a personal COVID risk have been excused from attending in person. Moderators with vulnerable relatives, who might die were they to contract COVID from us, have not.

The pandemic has been good for Facebook. More than 3 billion people have now joined Facebook services, creating more demand for our work than ever. Mr. Zuckerberg nearly doubled his fortune during the crisis. He is now worth well over $100 billion. It has been good for Facebook’s contractors, too: CPL, one of the main European contractors, is due to be sold for €318m.

Despite vast sums flowing to each of you as corporate executives, you have refused moderators hazard pay. A content moderator at Accenture’s office in Austin, Texas generally earns $18/hour.

Before the pandemic, content moderation was easily Facebook’s most brutal job. We waded through violence and child abuse for hours on end. Moderators working on child abuse content had targets increased during the pandemic, with no additional support.

Now, on top of work that is psychologically toxic, holding onto the job means walking into a hot zone. In several offices, multiple COVID cases have occurred on the floor. Workers have asked Facebook leadership, and the leadership of your outsourcing firms like Accenture and CPL, to take urgent steps to protect us and value our work. You refused. We are publishing this letter because we are left with no choice.

Stop Needlessly Risking Moderators’ Lives. It is important to explain that the reason you have chosen to risk our lives is that this year Facebook tried using ‘AI’ to moderate content—and failed.

At the start of the pandemic, both full-time Facebook staff and content moderators worked from home. To cover the pressing need to moderate the masses of violence, hate, terrorism, child abuse, and other horrors that we fight for you every day, you sought to substitute our work with the work of a machine.

Without informing the public, Facebook undertook a massive live experiment in heavily automated content moderation. Management told moderators that we should no longer see certain varieties of toxic content coming up in the review tool from which we work— such as graphic violence or child abuse, for example.

The AI wasn’t up to the job. Important speech got swept into the maw of the Facebook filter—and risky content, like self-harm, stayed up.

The lesson is clear. Facebook’s algorithms are years away from achieving the necessary level of sophistication to moderate content automatically. They may never get there.

This raises a stark question. If our work is so core to Facebook’s business that you will ask us to risk our lives in the name of Facebook’s community—and profit—are we not, in fact, the heart of your company?

Without our work, Facebook is unusable. Its empire collapses. Your algorithms cannot spot satire. They cannot sift journalism from disinformation. They cannot respond quickly enough to self-harm or child abuse. We can.

Facebook needs us. It is time that you acknowledged this and valued our work. To sacrifice our health and safety for profit is immoral.

These are our demands.

1. Keep moderators and their families safe. At the moment, only individual content moderators with a doctor’s note indicating that they are high risk are excused from working in the office. Even this is not offered in some workplaces. Those who live with an at-risk person – who have, for example, a child with epilepsy – have been forced to come in. All content moderators who are high risk, or who live with someone who is high risk for COVID, should be permitted to work from home indefinitely.

2. Maximize at-home working. Work that can be done from home should continue to be done from home. You have previously said content moderation cannot be performed remotely for security reasons. If that is so, it is time to fundamentally change the way that the work is organized. There is a pervasive and needlessly secretive culture at Facebook. Some content, such as content that is criminal, may need to be moderated in Facebook offices. The rest should be done at home.

3. Offer hazard pay. If you want moderators to risk their lives to maintain ‘community’ and profit, you should pay. Moderators who are working in the office on high-risk material (e.g., child abuse) should be paid hazard pay of 1.5x their usual wage.

4. End outsourcing. There is, if anything, more clamor than ever for aggressive content moderation at Facebook. This requires our work. Facebook should bring the content moderation workforce in house, giving us the same rights and benefits as full Facebook staff.

5. Offer real healthcare and psychiatric care. Facebook employees enjoy various benefits, including private health insurance and visits to psychiatrists. Content moderators, who bear the brunt of the mental health trauma associated with Facebook’s toxic content, are offered 45 minutes a week with a ‘wellness coach’. These ‘coaches’ are generally not psychologists or psychiatrists and are contractually forbidden from diagnosis or treatment. And they generally cannot build a relationship of trust with moderators, since workers know that Facebook management (and Accenture/CPL management) ask ‘coaches’ to reveal confidential details of counselling sessions. Moderators deserve at least as much mental and physical health support as full Facebook staff.

The current crisis highlights that at the core of Facebook’s business lies a deep hypocrisy. By outsourcing our jobs, Facebook implies that the 35,000 of us who work in moderation are somehow peripheral to social media. Yet we are so integral to Facebook’s viability that we must risk our lives to come into work.

It is time to reorganize Facebook’s moderation work on the basis of equality and justice. We are the core of Facebook’s business. We deserve the rights and benefits of full Facebook staff. We look forward to your public response.
