A former TikTok content moderator has filed a lawsuit against the company, alleging that parent firm ByteDance fails to protect moderators’ mental health from a near-constant barrage of distressing material.
Candie Frazier says she spent 12 hours a day screening videos uploaded to TikTok while working for a third-party contracting firm called Telus International, according to a proposed class-action lawsuit filed in the US District Court for the Central District of California. During that time, Frazier claims to have witnessed “thousands of acts of extreme and graphic violence,” including mass shootings, child rape, animal mutilation, cannibalism, gang murder, and genocide.
Frazier claims that, in order to cope with the enormous volume of content uploaded to TikTok every day, she and her coworkers had to watch three to ten videos at the same time, with new videos loading in every 25 seconds. Moderators are allowed only one 15-minute break in the first four hours of their shift, followed by additional 15-minute breaks every two hours. According to the lawsuit, ByteDance closely monitors performance and “heavily penalizes any time spent away from watching explicit movies.”
According to the complaint, TikTok and its partners failed to meet industry-recognized standards intended to mitigate the harms of content moderation. These include more frequent breaks for moderators, psychological support, and technical safeguards such as blurring or reducing the resolution of videos under review.
Frazier claims she has suffered “serious psychological trauma” as a result of her work, including depression, anxiety, and symptoms of PTSD. Frazier has “difficulty sleeping, and when she does sleep, she has awful dreams,” according to the lawsuit. She frequently lies awake at night, unable to fall asleep, replaying videos she has seen in her mind, and suffers from acute, incapacitating panic attacks.
The details of Frazier’s case are consistent with accounts from content moderators working (indirectly) for other major tech companies such as Facebook, YouTube, and Google. These moderators form a workforce critical to the profitability of some of the world’s largest corporations, and the grim conditions they work under have come under increased scrutiny in recent years. Despite that heightened attention, reports like Frazier’s suggest that working conditions for moderators remain extremely difficult.
Frazier’s case was filed by the California-based Joseph Saveri Law Firm, which previously brought a similar suit on behalf of Facebook moderators in 2018. In that case, Facebook agreed to pay $52 million to its content moderators in a settlement.
In a statement to The Verge, TikTok spokesperson Hilary McQuaide said: “While we do not comment on pending litigation, we endeavor to establish a compassionate working environment for our employees and contractors. Our Safety team collaborates with third-party firms to help secure the TikTok platform and community, and we continue to expand a range of wellness programs to ensure that moderators are supported mentally and emotionally.”