These changes have led to some high-profile comments from politicians, and even a petition to repeal the Act which has generated over 400,000 signatures. The Act itself passed in 2023; however, the rollout of the legislation has been staggered, hence the renewed public interest as new rules come into force.
The Act makes it the duty of social media companies and platform providers to protect children and adults from harm by making them responsible for their users' safety while on their platforms. Failure to comply with these rules can result in large fines, company executives being jailed, and even websites being banned in the UK.
But what are the specific changes the Act is implementing? How are they intended to protect children and young people from harm? And, most importantly, what is the evidence that these changes will actually protect mental health?
Removing illegal content
A rule which came into force in December 2024 requires all companies to take action against illegal content being shared on their platforms, or illegal transactions and activity taking place through their platforms and services.
The types of illegal content the Act outlines include images of sexual abuse, the sale of illegal drugs or weapons, the sharing of state-sponsored disinformation under the Foreign Interference Offence, and exploitative or coercive behaviour.
A survey conducted by Savanta for the Mental Health Foundation earlier this year found that 68% of young people (aged 16-21) reported having seen harmful or disturbing content online. Not only does this normalise harmful behaviours for impressionable children; viewing violent or abusive content can also cause psychological distress and even trauma, particularly in young people.
'Media-induced trauma' is well documented, particularly following high-profile traumatic events. In 2021, researchers at Boston University found that, following a school shooting, people who viewed extensive coverage or upsetting content, including graphic videos of the shooting itself, were more likely to have symptoms of PTSD and other mental disorders.
The new law makes it the responsibility of social media companies, tech firms and internet providers to carry out risk assessments of illegal content being shared on their platforms and to take appropriate steps to remove it on an ongoing basis.
Age verification
Putting the onus on companies to remove illegal content is not the only example of the new rules. The Act also requires platforms to protect children from viewing content that might not be illegal, but which could otherwise be harmful to developing minds. For example, pornography websites now have a duty to verify the ages of users before they can access explicit content.
Pornography, particularly extreme or hardcore pornography, can be damaging to children's mental health. It can lead to addiction, problems with emotional intimacy and forming relationships, and skewed understandings of sex and gender roles in relationships. Exposure to pornography at a young age makes children more likely to develop symptoms of anxiety, and more vulnerable to sexual exploitation, as sharing explicit images with young children is a common grooming tactic of predators. The average age at which children are first exposed to online pornography is 12 years old, with 15% of children being under the age of 10.
As of July 2025, websites hosting pornography in the UK must verify users' ages before allowing them to access explicit content. It is these age verification rules that have prompted some backlash. Critics of the new rules argue that, as users must now verify their age by presenting ID or using other tools such as facial recognition software, there are risks of data breaches or of users' privacy being threatened.
Social media use
In addition to pornography sites, there is a specific onus on social media sites to risk-assess the content on their platforms and, where it could be harmful to children, to implement appropriate age limits. Most children over the age of 10 have a presence on social media, with the most popular platforms being YouTube and WhatsApp.