Facebook, Twitter, and YouTube have all announced this week that thousands of content moderators are being sent home—leaving more of our feeds in the hands of machines.
Facebook users around the world began to notice something strange happening on their feeds on Tuesday night. Links to legitimate news outlets and websites, including The Atlantic, USA Today, the Times of Israel, and BuzzFeed, among many others, were being removed for allegedly violating Facebook's spam rules. The problem affected many people's ability to share news articles and information about the developing coronavirus pandemic.
Facebook attributed the problem to a mundane bug in the platform's automated spam filter, but some researchers and former Facebook employees worry it's also an indication of what's yet to come. With a global health crisis confining millions of people to their homes, social media platforms have become one of the most crucial ways for people to share information and socialize with each other. But to protect the health of its staff and contractors, Facebook and other tech companies have also sent home their content moderators, who serve as their first line of defense against the horrors of the internet. Their work is often difficult, if not impossible, to do from home. Without their labor, the internet might become a less free and more frightening place.
“We will start to see the traces, which are so often hidden, of human intervention,” says Sarah T. Roberts, an information studies professor at UCLA and the author of Behind the Screen: Content Moderation in the Shadows of Social Media. “We’ll see what is typically unseen—that’s possible, for sure.”
Following the 2016 US presidential election, Facebook significantly ramped up its moderation capabilities. By the end of 2018, it had more than 30,000 people working on safety and security, about half of whom were content reviewers. Most of these moderators are contract workers, employed by firms like Accenture or Cognizant in offices around the world. They work to keep Facebook free of violence, child exploitation, spam, and other unseemly content. Their jobs can be stressful and traumatizing.