New guidance urges tech companies to crack down on online child abuse
Ofcom has released its first guidance for tech platforms under the new Online Safety Act.
The guidance, which is over 1,500 pages long, urges social media platforms to tackle illegal content – including online child abuse.
Tech companies are asked to fight online grooming by removing default settings that suggest children as "friends" to other users on social media platforms.
They must also disable functions that let children share location information on their profiles or posts, and must prevent children from receiving messages from anyone not in their contacts list.
Recent figures from Ofcom have revealed that over one in ten 11–18-year-olds have been sent naked or semi-naked images online.
Ofcom will also require some platforms to adopt hash-matching in order to detect child sexual abuse material (CSAM).
This technology converts an image into a string of numbers – called a "hash" – which is compared against a database of hashes generated from known CSAM images. If a new image's hash matches one in the database, the image is flagged as known CSAM.
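The compare-against-a-database step described above can be sketched in a few lines of Python. Note this is only an illustration: production systems use perceptual hashes (such as Microsoft's PhotoDNA) that still match after resizing or re-encoding, whereas the cryptographic SHA-256 hash used here only matches byte-identical files. The database contents and image bytes are, of course, made up.

```python
import hashlib

# Hypothetical database of hashes of known illegal images.
# In deployed systems this would be a perceptual-hash database
# maintained by organisations such as the IWF or NCMEC.
KNOWN_HASHES = {
    hashlib.sha256(b"known-illegal-image-bytes").hexdigest(),
}

def hash_image(image_bytes: bytes) -> str:
    """Convert an image's raw bytes into a fixed-length hash string."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_match(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the known-hash database."""
    return hash_image(image_bytes) in KNOWN_HASHES

print(is_known_match(b"known-illegal-image-bytes"))  # True: hash is in the database
print(is_known_match(b"a-new-harmless-image"))       # False: no match found
```

The key design point is that the platform never needs to store or inspect the original illegal images – only their hashes – and a lookup against the database is a fast set-membership check.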
However, this hash-matching will not apply to encrypted messaging services.
Asked in a BBC interview whether Ofcom's powers to require scanning on encrypted services would ever be used, chief executive Dame Melanie Dawes said: "It's hard to say right now, but there isn't a solution yet, a technology solution, that allows scanning to take place in encrypted environments without breaking the encryption."
But she encouraged encrypted messaging companies to find ways to combat child abuse on their platforms.