San Francisco, CA (WorkersCompensation.com) – A moderator for the digital video platform TikTok has filed suit in California against the social media company and its parent company, ByteDance, accusing them of failing to take steps to protect her mental health as she viewed hours of traumatic videos.
Candie Frazier, who is employed by third-party Canadian contracting firm Telus International, filed suit on Dec. 23 in U.S. District Court in California. In the proposed class action suit, Frazier said TikTok and ByteDance controlled her work, and their policies required her to watch hundreds of videos during her 12-hour shifts.
According to the suit, content moderators often have to watch three to 10 videos at the same time, reviewing only about 25 seconds of each. The strenuous pace required her to watch “thousands of acts of extreme and graphic violence,” the suit alleges, including mass shootings, child rape, animal mutilation, cannibalism, and genocide. Moderators are allowed only one hour-long lunch break and two 15-minute breaks per shift, the suit said.
Despite being employed by a third-party firm, Frazier said it was ByteDance that monitored the moderators’ performance, punishing them if any time was “taken away from watching graphic videos.”
Additionally, the suit said, the companies did not adhere to industry standards for protecting moderators, such as frequent breaks and psychological support, or to technical safeguards, such as blurring or reducing the resolution of the videos moderators are required to watch.
“As a result of constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace, Ms. Frazier has developed and suffers from significant psychological trauma including anxiety, depression, and posttraumatic stress disorder (‘PTSD’),” the suit alleges. “ByteDance and TikTok are aware of the negative psychological effects that viewing graphic and objectionable content has on content moderators. Despite this knowledge, they have not implemented safety standards known throughout the industry to protect their content moderators from harm. These safety standards could have reduced the risk and mitigated the harm suffered by content moderators working on behalf of ByteDance and TikTok.”
In addition to anxiety, depression, and PTSD, the suit alleges Frazier struggles to sleep and suffers from horrific nightmares when she does.
“She often lies awake at night trying to go to sleep, replaying videos that she has seen in her mind. She has severe and debilitating panic attacks,” the suit said.
The suit requests a jury trial and compensation for Frazier and other content moderators for the psychological injuries they have suffered. Additionally, the suit asks the court to order the companies to set up a medical fund for moderators.
“Without this Court’s intervention, ByteDance and TikTok will continue to injure content moderators and breach the duties they owe to Content Moderators who review content on their platform,” the suit said. “On behalf of herself and all others similarly situated, Plaintiff Frazier brings this action (1) to compensate content moderators that were exposed to graphic and objectionable content on ByteDance’s TikTok platform; (2) to ensure that ByteDance and TikTok provide content moderators with tools, systems, and mandatory ongoing mental health support to mitigate the harm reviewing graphic and objectionable content can cause; and (3) to provide mental health screening and treatment to the thousands of current and former content moderators affected by ByteDance’s and TikTok’s unlawful practices.”
In a statement to Bloomberg, a spokesperson for TikTok said the company doesn’t comment on ongoing litigation, but that it does work “to promote a caring working environment for our employees and contractors.”
“Our safety team partners with third party firms on the critical work of helping to protect the TikTok platform and community, and we continue to expand on a range of wellness services so that moderators feel supported mentally and emotionally,” the spokesperson said in the statement.
The suit is similar to one brought against Facebook in 2018, when a content moderator alleged that she developed PTSD after being exposed to content featuring rape, suicide, and violence. Facebook ultimately agreed to a $52 million class action settlement that paid for mental health treatment for content moderators and funded workplace changes.
But a representative for Telus told CNN that Frazier had not raised any of these concerns about her work with the company, and that the allegations in the suit were inconsistent with the company’s policies and practices.
“We have a robust resiliency and mental health program in place to support all our team members, as well as a comprehensive benefits program for access to personal health and well-being services,” the Telus spokesperson told CNN. “Our team members can elevate questions and concerns about any aspect of their job through several internal channels, all of which the company takes very seriously.”