CARE: Christian Action, Research and Education


Ofcom set to be given powers to enforce duty of care on social media

Online Safety
14 February 2020

The Government has published its initial response to the public consultation on its Online Harms White Paper and has revealed that it is ‘minded to appoint Ofcom as the new regulator’.

The White Paper on Online Harms was published in April 2019. After its publication, the Government said it would consult on the White Paper’s proposals to ensure stakeholders were able to give feedback. That consultation ran from 8 April 2019 until 1 July 2019, and after nearly a year, the Department for Digital, Culture, Media and Sport has published an initial response. This will be followed in the spring by a final response, which will in turn pave the way for legislation in the form of an Online Harms Bill. Reports suggest it will be tabled later this year.

CARE was one of a number of organisations that submitted responses to the consultation. We argued for the inclusion of age verification checks for websites and social media platforms as a key tool to help create a safer internet.

What is the Government proposing?

Recognition was given to the fears expressed in a number of responses to the consultation that regulation of the internet could endanger free speech. In response, the Government was keen to highlight protections for freedom of expression. Here are a number of relevant quotes from the response:

‘Safeguards for freedom of expression have been built in throughout the framework. Rather than requiring the removal of specific pieces of legal content, regulation will focus on the wider systems and processes that platforms have in place to deal with online harms, while maintaining a proportionate and risk-based approach.

‘To ensure protections for freedom of expression, regulation will establish differentiated expectations on companies for illegal content and activity, versus conduct that is not illegal but has the potential to cause harm. Regulation will therefore not force companies to remove specific pieces of legal content. The new regulatory framework will instead require companies, where relevant, to explicitly state what content and behaviour they deem to be acceptable on their sites and enforce this consistently and transparently.

‘Recognising concerns about freedom of expression, the regulator will not investigate or adjudicate on individual complaints. Companies will be able to decide what type of legal content or behaviour is acceptable on their services, but must take reasonable steps to protect children from harm. They will need to set this out in clear and accessible terms and conditions and enforce these effectively, consistently and transparently.’

One of the key things the Government wants to do to create a safer internet is to establish an independent regulator to enforce a statutory duty of care on social media companies. There had been some speculation that Ofcom would be the new regulator. In the response, the Government said:

‘We are minded to make Ofcom the new regulator, in preference to giving this function to a new body or to another existing organisation. This preference is based on its organisational experience, robustness, and experience of delivering challenging, high-profile remits across a range of sectors. Ofcom is a well-established and experienced regulator, recently assuming high profile roles such as regulation of the BBC.’

Note that at this stage, the Government is signalling that Ofcom will probably become the new regulator, but it is not set in stone. This will allow organisations and other stakeholders to respond in the meantime and suggest alternatives.

What’s missing?

From CARE’s perspective, the most obvious weakness in the Government’s approach is the failure to introduce age verification for online pornography. Children as young as seven report accessing porn online, and such material is not limited to websites but can also be found on social media. In this context, it is crucial that age checks are introduced, both on commercial porn websites and on social media platforms.

Part Three of the Digital Economy Act (DEA), which has been approved by MPs and Peers, makes provision for age verification to be introduced to better protect children and young people online.

However, in October last year, the Government suddenly did a U-turn and dropped its plans. At the time, it said it would come up with something better. Months later, it is still not clear what that ‘something better’ will be.

The Government is also subject to a legal challenge by four age verification companies, which claim the U-turn on age checks is an abuse of power. The companies involved aim to force the Government’s hand and ensure age checks are introduced.

CARE is supporting a new Bill by Baroness Howe which would require an implementation date for Part Three of the DEA to be set and then adhered to. This would achieve the same end as the legal challenge. Either way, our hope is that the Government will do the right thing and honour the previous commitment to implement age verification as soon as possible.

