Facebook under scrutiny once again over its treatment of content moderators

Moderating content on social networks is an unpleasant task, and one that still requires a great deal of human effort. Facebook has been among the platforms most criticized for its handling of inappropriate material, and it is once again in the spotlight for unhappy reasons.

Workers subcontracted in Europe to carry out the task complain that Mark Zuckerberg's company does not protect them. Isabella Plunkett, a moderator for the social network, testified before an Irish parliamentary committee about the poor working conditions Facebook imposes.

As noted by Engadget, Plunkett works for Covalen, the firm through which Facebook outsources content moderation in Ireland. She stated that workers lack adequate access to mental health resources, and that benefits such as paid sick days and the option to work from home are denied to them.

“The content is terrible and it would affect anyone. No one can be okay if they watch graphic violence seven to eight hours a day,” she said. Plunkett also explained that the confidentiality agreements Facebook forces moderators to sign are so restrictive that they cannot even tell their families what they do for a living.

Content moderators spend grueling shifts reviewing the violent and disturbing material uploaded to the platform. Yet despite the importance of their work, they receive little recognition from Facebook and suffer the consequences of continued exposure to toxic posts.

The plight of Facebook's outsourced content moderators

“There are thousands of Facebook content moderators around the world, and more than a thousand are in Ireland. Facebook could not exist as a platform without them, and yet it does not hire them directly and it pays them poorly,” said Plunkett.

The social network has already faced controversy over situations like this in other parts of the world. Last year, for example, it agreed to pay $52 million to US-based moderators who developed post-traumatic stress disorder.

At the time, the company promised to incorporate more Artificial Intelligence-based tools to streamline the review process. However, there is still a long way to go before content moderation can be done without human supervision.

Facebook has, for its part, taken measures to try to soften the impact on moderators. Some of its tools let them view videos in black and white, mute the sound, or jump to the most relevant moments, limiting their exposure to violent material.

The moderators believe the social network can do much more to protect and support them. For now, a Facebook spokesperson said the company wants to get this right and that moderators are trained according to its community standards. “It is an important issue and we are committed to doing it well,” he said.
