TikTok employee 'traumatised' by extreme content on the platform

Photo by Solen Feyissa on Unsplash

Social media platform TikTok is flooded with extreme and disturbing content which traumatises those paid to identify and remove it, an ex-employee has said.

Candie Frazier worked as a moderator for TikTok but quit her job, saying what she witnessed on the platform was causing her “significant psychological trauma”.

Frazier worked 12-hour shifts in which she viewed pornography, child sexual abuse, violence, and murder. She is now suing the company for the harm she experienced through constant exposure to extreme content. Her lawsuit states:

“Every day, TikTok users upload millions of videos to its platform. Millions of these uploads include graphic and objectionable content such as child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder.”

It continues: “Without [this] intervention, ByteDance and TikTok will continue to injure Content Moderators and breach the duties they owe to Content Moderators who review content on their platform.”

The UK Parliament is currently considering legislation designed to tackle harmful online content, amid concern that children in particular are being traumatised by what they witness online.

CARE campaigned for stricter rules around internet pornography, which is freely accessible to young children. In 2017, Parliament approved legislation requiring pornography websites to verify the age of visitors and take down “extreme” porn.

However, after a series of delays, ministers announced they would not enact these changes. CARE is working to ensure these safeguards are included in the government’s forthcoming Online Safety Bill.
