Social media platforms now face fines for harmful content

As of this Monday, tech companies in the UK must take stronger action against illegal online content under the Online Safety Act, or face heavy fines. The law targets fraud, terrorism, child sexual abuse material, and other harmful content, requiring platforms to remove or block such material.

Companies that fail to comply could face fines of up to £18m or 10% of global revenue—which could mean billions for major platforms like Meta and Google. In extreme cases, services could be taken down.

Ofcom, the regulator overseeing the act, has set out codes of practice for platforms to follow, which include:

  • Protecting children by hiding their profiles and locations from strangers.
  • Allowing women to block harassers more easily.
  • Using technology to prevent the spread of illegal images and content.
  • Providing reporting channels for online fraud cases.

Last year, Ofcom warned that tech firms still had work to do to comply. On Monday, the regulator also announced it would scrutinise online storage services to ensure they prevent the distribution of child abuse material.

Legal experts say the new rules mark a major shift, requiring tech companies to be proactive in identifying and removing illegal content. However, the act has faced criticism from US politicians, including JD Vance, who argued it restricts free speech.

Technology Secretary Peter Kyle said in response to such comments: "Our online safety standards are not up for negotiation."

Kyle said that the introduction of the new Online Safety laws was "just the beginning". He continued, "In recent years, tech companies have treated safety as an afterthought. That changes today."
