CARE works in a number of ways to see the sanctity of the family and the innocence of childhood protected online, advocating the equipping, encouragement and education of families.
CARE is concerned principally about two online dangers:
- First, accessing inappropriate material. In some cases children will deliberately seek out inappropriate material. This is a particular issue for boys in relation to pornography. In other cases children will accidentally stumble on adult content; whether they find it intriguing or distressing, in either case it is not good for them.
- Second, online behavioural challenges which include cyber bullying, sexting, and grooming.
These two challenges need to be dealt with in different ways. CARE argues that the solution for dealing with inappropriate material is the introduction of an opt-in system. Internet service providers and mobile phone operators would be required to provide users with an internet free from adult content, such as pornography, while giving anyone the option to opt in to access that content, subject to a quick age verification process demonstrating that they are 18 or over.
Although filtering technology is improving all the time, it is not perfect, so it is vital that a robust opt-in system includes a mechanism for quickly dealing with the inappropriate blocking of sites. Anyone who believes their site has been blocked in error should be able to appeal to OFCOM, which should be required to adjudicate quickly and, where appropriate, order the site in question to be unblocked.

We argue that the solution for dealing with online behavioural challenges is better education. Both solutions are vital: blocking technology cannot deal with online behavioural problems, and better education cannot be relied upon to deal with the problem of children deliberately seeking out adult content like pornography.
A Growing Problem
In 2016, a major NSPCC survey discovered that young people were more likely to find explicit material online accidentally than to seek it out deliberately, and that this graphic content was shaping and seriously damaging young people's understanding of sex and relationships.
The survey, carried out by Middlesex University and jointly commissioned by the NSPCC and the Children's Commissioner for England, found that around 53 per cent of 11-16 year olds had seen graphic porn content online, and that 94 per cent of those who had seen it had done so by the time they were just 14 years old.
In total, 1,001 children aged 11 to 16 were questioned. The survey found that 65 per cent of 15-16 year olds had viewed porn, as had 28 per cent of 11-12 year olds. More than half of the boys surveyed (53 per cent) said they thought porn was a realistic portrayal of sex, as did 39 per cent of girls. More than a third of 13-14 year olds, and a fifth of 11-12 year old boys, also said they wanted to copy what they had seen.
Our recent work
Digital Economy Act 2017
Concerned by this growing and alarming trend, CARE campaigned for greater protection for children online, specifically through the introduction of age verification checks on adult websites. After many years of campaigning, the Government agreed that its Digital Economy Act must require people visiting pornographic websites to prove they are over 18. CARE worked closely with Claire Perry MP on an amendment to ensure this requirement is backed by proper powers of enforcement. The Government initially opposed this, but the amendment was ultimately included.
We also assisted Fiona Bruce MP with her amendment to close a loophole under which 18-rated content on 'video on demand' platforms would not have needed the same age verification checks as other pornographic websites.
CARE also submitted written evidence on the Digital Economy Bill during its committee stage.
Whilst we welcome much of the Digital Economy Act 2017, there are a number of ways in which the legislation could be improved still further.
CARE helped several Peers to oppose amendments to the Bill under which violent pornography and non-photographic images of child sex abuse, both of which are illegal offline, would not be blocked online so long as age verification checks are in place. These efforts were unsuccessful, and CARE is now supporting Baroness Howe, who has introduced the Online Safety Bill specifically to address this problem.
CARE’s polling in 2018 demonstrated that the vast majority of MPs do not think it right that the Digital Economy Act leaves non-photographic child sex abuse images available online. ComRes polling conducted on behalf of CARE found that 71 per cent of MPs are unhappy with this accommodation of animated child sex abuse images, a figure that rises to 76 per cent among female MPs.