Pornography
TikTok recommends pornography to children’s accounts

A campaign group has found that TikTok’s algorithm recommends pornography and sexualised content to accounts for children.
Researchers created accounts on clean phones with no search history, posing as 13-year-olds, and activated TikTok’s ‘restricted mode’, which is supposed to prevent users from seeing “mature or complex themes, such as… sexually suggestive content.”
Yet those accounts were recommended sexually suggestive content, including explicit videos.
A popular platform
TikTok is an increasingly popular social media platform among children in the UK. Although its community guidelines state that users must be 13 or older, Ofcom research has found that TikTok is one of the most popular platforms among eight to 11-year-olds. Ofcom has also found that more than a quarter of five to seven-year-olds report using TikTok, with a third of them doing so unsupervised.
Global Witness, the campaign group that conducted the research into TikTok’s recommendations, reported its findings to the company in April this year. TikTok said it had taken immediate action to address the problem. However, when the group repeated its research later in the year, it found that the platform was still recommending sexual content.
Ava Lee from Global Witness said that “TikTok isn't just failing to prevent children from accessing inappropriate content - it's suggesting it to them as soon as they create an account.”
A law to be enforced
The Online Safety Act’s Children’s Codes came into force in July this year, placing platforms under a legal duty to protect children online. This followed years of campaigning by CARE and others for age verification to keep children safe online.
Platforms must now implement secure and effective age checks to stop children seeing pornographic content, and their algorithms must be adjusted to filter out content that promotes harmful behaviour. Global Witness’s second round of testing took place after these duties had come into force.
Announcing the legislation in April, Technology Secretary Peter Kyle said: “The time for tech platforms to look the other way is over. They must act now to protect our children, follow the law, and play their part in creating a better digital world… if they fail to do so, they will be held to account.”
Global Witness commented that “Everyone agrees that we should keep children safe online… Now it's time for regulators to step in.”