Children require courage

Ofcom launched its ‘Protecting Children From Harms Online’ codes this week in a bid to make the internet a safer place for children, a move hailed as a “gamechanger” by its Chief Executive, Dame Melanie Dawes.
Since the passage of the Online Safety Act in 2023, social media companies and search engines have had a duty to ensure their services aren’t used for illegal activity or to promote illegal content, with particular protections for children.
The communications regulator, Ofcom, has been tasked with implementing the Act and has the legal powers to enforce compliance — with companies potentially facing fines of up to £18 million, or 10% of their global revenue (whichever is greater), if they fail in their new duties.
Ofcom has been pursuing a three-phase implementation, with measures under phase 1, covering illegal harms, already in place. But it is phase 2, which concerns child safety, pornography and the protection of women and girls, that has been of particular interest to campaigners, and which made the headlines this week.
There have long been calls to make the internet a safer place for children, and it is easy to understand why: children as young as 7 stumble across pornography online, rates of mental ill-health among teenagers (teenage girls in particular) are spiralling, and bullying and abusive content are rife. Coupled with harrowing stories such as that of Molly Russell, a teenager who took her own life after encountering content promoting suicide and self-harm online, it has long been clear that the internet — for all its benefits — can be incredibly destructive, especially to children.
Indeed, in recent years the issue has gathered a groundswell of support, after an earlier heyday in which social media was routinely celebrated as a societal good. Campaigns for smartphone-free schools and for raising the age limit for social media platforms have made it into the mainstream.
Social media is routinely criticised for spreading mis- and disinformation, increasing political polarisation, inciting violence and abuse, aiding radicalisation, destroying mental health, and generally seeing conversation and debate descend into a cesspit of lies, abuse, and hatred.
The introduction of the Online Safety Act in 2023, a landmark piece of legislation of international significance, offered hope of an online world that was less violent, degrading, and harmful than the one that exists — particularly for children.
What do the new codes do?
Given the context, there is much to be welcomed in the new codes issued by Ofcom. In theory, they should vastly improve children’s experience of the internet, with measures to improve age verification, prevent the promotion of harmful content, and stop children accessing pornography and content that encourages self-harm, eating disorders, and/or suicide. Likewise, children should be given age-appropriate protections from content which constitutes bullying, abuse or hate, or encourages serious violence and injury, amongst other things.
SecurityBrief reports, “Ofcom has instructed providers to produce safer content feeds, implement robust age-verification checks, and act swiftly to remove harmful material. Additionally, children must be given more autonomy and support, with clearer processes for reporting and resolving issues, and platforms are expected to demonstrate strong governance in applying these policies.”
Dame Melanie Dawes argued the new codes “will mean safer social media feeds with less harmful and dangerous content, protections from being contacted by strangers and effective age checks on adult content”. Former Facebook Safety Officer Professor Victoria Baines agreed, saying the move was “a step in the right direction” and suggesting “Big tech companies are really getting to grips with it, so they are putting money behind it, and more importantly they're putting people behind it.”
Likewise, Technology Secretary Peter Kyle suggested the new Ofcom codes would be a “watershed moment” that turned the tide on “toxic experiences on these platforms.”
Measures don’t go far enough
Yet campaigners for a child-friendly internet were left sceptical and disappointed.
Andy Burrows, CEO of the Molly Rose Foundation, branded the proposals a “whole series of missed opportunities” that were “giving far too much weight to industry – rather than focusing on how it builds measures or how it sets objectives that can actually tackle the problem”.
He was joined by Rachel De Souza, Children’s Commissioner for England, who argued the new codes were too weak: “I made it very clear last year that its proposals were not strong enough to protect children from the multitude of harms they are exposed to online every day. I am disappointed to see this code has not been significantly strengthened and seems to prioritise the business interests of technology companies over children’s safety.”
Indeed, CARE’s Senior Policy Officer, Tim Cairns, agreed, saying: “The new rules are great. In theory we could have the safest internet in the world. But that will only happen if Ofcom properly enforce the rules. Without robust, swift and tough enforcement, these rules will not be worth the paper they are written on. So the duty now is on Parliament to hold Ofcom’s feet to the fire and ensure they follow through with sanctions on Big Tech when and if required.”
The NSPCC drew attention to the role of private messaging platforms in its calls for the regulator to go further. Ofcom’s Child Protection Policy Director did recognise there was more work to do, but maintained that the new codes would be “transformational”.
Ofcom’s enforcement problem
Whilst it is right to acknowledge that there is good in these new measures, a brief glance at Ofcom’s enforcement record presents a discouraging picture of the likelihood of genuine change.
Since 2020, Ofcom has held regulatory and enforcement powers under the Video-Sharing Platform (VSP) regime, which covers sites such as OnlyFans, a social media platform rife with pornography and illegal content. Despite OnlyFans pledging to build “the safest social media platform in the world”, a Reuters investigation of the site found multiple cases of sexual slavery, child sexual abuse material and nonconsensual or “revenge” porn.
Yet Ofcom’s action against the site has been restricted to a fine for failing to disclose information about its age-checking measures. OnlyFans reportedly told Ofcom that its age-verification software required further proof of age if the individual looked younger than 23, when in reality the threshold had been set at 20. OnlyFans was subsequently fined £1.05 million, yet the site brings in $1.3 billion in revenue — hardly a ringing endorsement of robust and effective enforcement.
However, it should be said that this week Ofcom announced it was investigating possible breaches under the Online Safety Act, with Peter Kyle praising the regulator for wasting “no time in taking action. They've already started enforcement action against several companies. This is the kind of thing that I want to see.”
He continued, “There is no point in Ofcom having these powers if they are not used, if children are being subject to harmful content.” Without strict enforcement of the legislation, any laws — no matter how good they might be — will fail to have an impact and will leave children at the mercy of the online wild west.
The proof will be in the pudding
A robust and active Ofcom that is willing to challenge powerful and wealthy online platforms, businesses, and individuals where there is evidence of malpractice and illegality is essential if this piece of legislation is to be worth anything.
And yet worth something it must be.
The discrepancy between the protections we provide for children offline and those we provide online is as laughable as it is disastrous.
We rightly restrict movies and TV shows so that they are age-appropriate. We limit the sorts of video games children can play. We place limitations on individual freedoms, rightly recognising, for example, that children are too young to drink or gamble. Yet when it comes to the online world, children are left remarkably exposed.
Sure, many social media sites require users to be a certain age to hold an account, yet this is rarely robustly policed. Age-verification measures on even some of the most explicit sites are pathetic, a mere box-ticking exercise.
You don’t even have to go looking for horrendous content for it to find you: 60% of 11-13-year-olds who have seen pornography encountered it accidentally whilst online. And this is without even considering all the content promoting abuse, suicide, self-harm, eating disorders, radicalisation and the like.
It is true: online regulation involves some complicated questions, and the pace of technological change often means regulation is out of date before it’s even got going. Yet there is overwhelming public support for taking the issue more seriously.
A God who cares
And it is not just the British public who care. When we look at the Bible, we see that the wellbeing and safety of children is of great concern to God.
Despite children carrying low status at the time, Jesus welcomes them to Him and, in Mark 9 and Matthew 18 and 19, uses their faith as an example of what it means to believe.
Likewise, the Bible routinely talks of God’s concern for those who are voiceless, without power and without influence. God is one who cares for the fatherless and the orphan.
Psalm 72 speaks of how God’s promised king will “defend the afflicted among the people and save the children of the needy… he will deliver the needy who cry out, the afflicted who have no one to help.”
Proverbs 31 urges the king to “Speak up for those who cannot speak for themselves, for the rights of all who are destitute. Speak up and judge fairly; defend the rights of the poor and needy.”
And whilst it might sound like a pretty low bar to our ears, God’s law explicitly prohibited the sacrificing of children to idols, a widespread cultural practice at the time. Anyone who did so faced not only the death penalty but being cut off from God.
There is a clear Biblical call to take up the cause of children in our society, especially given their disempowered and inconsequential standing.
And so, when it comes to the safety of children online, there is a need to reckon with the costs of action and find the courage to take on the power and wealth of social media sites like OnlyFans, and of the wider Big Tech industry wherever impropriety is to be found.
This must also go beyond the concerning headlines about awful content, for the nature of these platforms can in itself be part of the problem.
Authors such as Samuel D. James have highlighted how the internet is changing the world we experience and the way we interact with one another. These platforms are designed to harvest vast amounts of data, to sell advertising space, and to retain engagement through outrage, addiction, and extreme material; it is not just their content that is concerning, but their very nature. In some ways their design makes such content inevitable.
The measures set out this week do “require tech companies to take a ‘safety-first’ approach in both the design and operation of their services.” This ought to be welcomed, and yet many view the latest codes as insufficient. Until the regulator is willing and brave enough to enforce such measures, the online world remains destructive for children.
We need to pray that these codes will do some good, that they will be strengthened as is needed, and that the regulator will have both the courage and the means necessary to take on vested interests with power and money — for the sake of those who have no influence.