Moderator sues TikTok for psychological trauma: “Hours of non-stop violent footage”

After Facebook, TikTok has also come under fire over content moderation and the psychological toll it takes on the employees who must screen thousands of videos, some of them horrific, to decide which ones can be published.

The lawsuit was filed in California by Candie Frazier, who did content moderation for TikTok through a third-party company, Telus International. Moderators play a fundamental role for social networks: they are tasked with reviewing the content posted every day – videos and photos, in TikTok's case videos – and deciding whether it complies with the company's publication policy. Many companies outsource various tasks, including moderation, to external firms, and Frazier worked for one of them.

The woman filed a formal complaint against TikTok and its Chinese parent company, ByteDance, saying she worked as a moderator for 12 hours a day, with breaks of about 15 minutes, and was forced to watch every kind of horror and brutality: shootings, beheadings, rape, violence against minors and animals, mutilations, "thousands of acts of extreme and explicit violence" delivered in a continuous stream. Chilling scenes she had to watch as part of her job, and which caused her severe psychological damage, nightmares, panic attacks and a constant state of anxiety: all the symptoms of post-traumatic stress disorder.

The videos also remained on screen for no more than 25 seconds, the time allotted for the moderator to determine what kind of content they contained and whether they included scenes that violated the service's terms of use (and, given the content, often the law), and up to 10 videos were displayed simultaneously to speed up the work. Frazier reported in the complaint that, in addition to being forced to watch them, a supervision system monitored her speed and the time spent on each video, flagging and reprimanding her if she took unauthorized breaks.

Frazier's lawyers pointed out that she received none of the protections now commonly recognized for workers in the sector, including psychological support, more frequent and longer breaks, and the ability to blur the most violent videos: safeguards the social media giants have begun to adopt precisely because of the psychological consequences moderators face in monitoring and removing violent content. Facebook, for example, provides specific training and direct psychological assistance to help moderators cope with the impact the videos and photos can have on them.

In the complaint, Frazier's lawyers argue that TikTok did not warn her that viewing such posts "can have a significant negative impact on mental health," and that it failed to implement the measures it had planned to protect employees. Telus International's job listing for content moderators, available online, further explains that the posts to be reviewed "may include explicit, violent, political, profane and disturbing" content.

Among the requirements for the position are "stress and emotion management skills", and a spokesperson for the company (which is not a defendant in the suit) said that an ad hoc program is in place for the moderators working for it: "Frazier has never previously raised these concerns about her working environment, and her allegations are totally inconsistent with our policies and practices," the spokesperson said in a statement.

TikTok, for its part, declined to comment officially on the case, but stated that it cares about the well-being of its employees and the quality of the working environment, including for contract workers.

"Our security team works with third-party companies to help protect the TikTok platform and community, and we continue to expand a range of wellness services so moderators feel mentally and emotionally supported," the company said in a statement. In 2020 Facebook agreed to pay $52 million in compensation to moderators who had sued over the psychological consequences of viewing violent content; in that case, too, post-traumatic stress disorder was cited.
