Social media sites could get big fines under the UK’s new online safety laws
Social media platforms must protect users from harmful content like terrorist material and child sexual abuse under the Online Safety Act.
Platforms that fail to put strong safety measures in place in the UK could face large fines. Illegal content covered by the act includes fraud, terrorism, child sexual abuse, encouraging suicide, extreme pornography, and drug dealing.
Starting Monday, all websites and apps covered by the Online Safety Act — over 100,000 services including Facebook, Google, X (Twitter), Reddit, and OnlyFans — must take steps to block this harmful content or remove it quickly if it appears.
The technology secretary, Peter Kyle, said this action against illegal content is “just the beginning.”
“Tech companies have not focused enough on safety in recent years. But that changes now,” he said.
Companies that break the new law could face fines of up to £18 million or 10% of their global turnover, whichever is greater. For big companies like Facebook’s owner, Meta, or Google, this could mean billions of pounds. In the most serious cases, Ofcom can seek a court order to block their services in the UK.
Ofcom, the UK regulator in charge of the law, has published codes of practice setting out what tech companies must do to comply. The act lists 130 types of illegal content that companies must tackle first, making sure their systems can handle these issues.
Some of the rules include:
- Hiding children’s profiles and locations from strangers by default.
- Giving women tools to block or mute people who are harassing or stalking them.
- Setting up a way for organisations to report online fraud.
- Using special technology called “hash matching” to stop the spread of terrorist content and private intimate images shared without consent (often called “revenge porn”).
Last year, Ofcom said tech companies still had much work to do to comply with the Online Safety Act: they had not yet put in place all the measures needed to keep children and adults safe from harmful content. In December, Jon Higham of Ofcom told the Guardian that many large and risky platforms were not using the safety measures Ofcom recommends.
“We don’t think any of them are doing everything they should,” he said.
Mark Jones, a lawyer at Payne Hicks Beach, said the new rules on illegal harmful content were a big change. Now, tech companies must actively find and remove dangerous material, not just wait for someone to report it.
The Online Safety Act has its critics. US vice-president JD Vance, for example, said last month that free speech in the UK was “in retreat” because of it. But Kyle said the act will not be used as a bargaining chip in any talks with the Trump administration about possible tariffs on UK goods sold in the US.
“Our online safety rules are not up for discussion. They are law and will stay,” Kyle told LBC radio last week. The UK government, as Keir Starmer said in Washington, believes the act is about fighting crime, not stopping free debate.
Published: 17th March 2025