Last Monday (21), a former content moderator sued YouTube, accusing the video-sharing platform of failing to adequately protect the employees tasked with identifying and removing violent content posted on the site.
According to NBC News, the plaintiff stated that, in order to carry out her duties, she had to watch videos of beheadings, shootings, child abuse and other disturbing content. As a consequence, she began suffering nightmares, panic attacks and a fear of crowds, without receiving any kind of psychological support from the company.
The former moderator, whose name was not revealed, was hired through a third-party agency, Collabera, as is the case with most of these professionals. She claimed that the moderation teams were understaffed, which forced moderators to exceed the recommended limit of four hours a day reviewing violent videos.
YouTube’s productivity targets require each moderator to review between 100 and 300 pieces of video content per day, with a “margin of error” of 2 to 5%, according to the lawsuit. The company also controls how the videos are displayed, whether in full screen or as thumbnails, whether they are blurred, and how quickly they are watched in sequence.
The lawsuit comes amid heated debate about the impact of this type of work on people’s mental health. During the pandemic, YouTube relied on automated systems to locate and delete inappropriate content, but had to return the task to human moderators after many cases of content being wrongly removed.
The San Francisco law firm Joseph Saveri, which represents a group of Facebook moderators, filed a similar lawsuit that resulted in a $52 million settlement in May.