It’s no secret Meta employs contract workers to do much of the hard work of enforcing its content moderation policies. And despite assisting one of the most valuable companies in the world, those workers have frequently complained of poor compensation and anxiety-inducing work. Some are now also saying they’re being treated worse than other workers.
According to BuzzFeed News, Genpact, a Meta subcontractor that has previously been accused of fostering poor working conditions, has required the Spanish-language moderators based in its Richardson, Texas office to report for in-person work since April 2021. Those workers have had to put their health at risk during both the delta and omicron coronavirus waves, while their English-language counterparts have been allowed to cycle through the office in three-month rotations.
The news of the situation at Genpact comes just one week after workers at Accenture, another Meta subcontractor, successfully protested to force the company to abandon a plan it had in place for hundreds of Facebook moderators to return to in-person work on January 24th.
Contractors who spoke to BuzzFeed News claim Genpact also holds them to unreasonable standards. They say they’re expected to make moderation decisions in about a minute while maintaining an 85 percent accuracy rate. Complicating matters is the fact that Meta reportedly doesn’t disseminate guidelines on how to apply Facebook’s Community Standards in any language other than English, leaving those workers to first translate that guidance themselves before applying it.
And then there’s the scale of the problem the team has to tackle. Genpact’s Spanish-language moderation team is named after Mexico, but in addition to moderating content posted by people in that country, it’s also responsible for Facebook and Instagram posts from Spanish-speaking users across most of Latin America. In Mexico alone, Facebook has tens of millions of users. By contrast, Genpact’s Mexican market team consists of approximately 50 people.
“We use the combination of technology and people to keep content that breaks our rules off of our platform, and while AI has made progress in this space, people are a key part of our safety efforts,” a Meta spokesperson told Engadget. “We know these jobs can be difficult, which is why we work closely with our partners to constantly evaluate how to best support these teams.”