ARTICLE
6 November 2025

European Parliamentary Committee Pushes For Tougher Rules To Make Online Services Safer For Children

Lewis Silkin

In a bid to curb the growing risks children face online, members of the European Parliament's Internal Market and Consumer Protection Committee (IMCO) have published a report urging the European Commission to rapidly enforce the EU's Digital Services Act (DSA) (fully applicable since 17 February 2024 – see our article on the DSA here). It wants the Commission to tackle designs that keep kids glued to screens — think endless scroll, autoplay, and gaming features that can feel a lot like gambling.

The Committee's message is blunt: major platforms are not doing enough. IMCO says children are too often exposed to addictive features, harmful content, and manipulative practices. It wants an EU-wide "digital minimum age" of 16 for access to social media, video-sharing platforms, and AI companions without parental consent.

How children use the internet is changing fast

Young people today don't go online the way previous generations did. A recent Eurobarometer survey shows that they're turning more and more to digital sources for news, entertainment, and social connection. The shift isn't just about screen time — it's about what they see and how they interact with content.

Ofcom (the UK's communications regulator) has found troubling trends, with children as young as 8 having accessed pornography online. Governments and regulators are moving to make platforms safer, especially for under-18s.

  • In the UK, the Online Safety Act sets new expectations for how platforms handle harmful content and protect users, with additional rules now emerging (see our articles discussing the Online Safety Act and proposed updates).
  • In the EU, the DSA already sets guardrails for large platforms, and a new proposal, the Digital Fairness Act, aims to go further. It focuses on issues like manipulative design, personalised experiences that can push users in unhealthy directions, and the protection of vulnerable groups, including children. The consultation and call for evidence closed on 24 October 2025, with more clarity and a draft law expected in 2026.

Raising the bar on age checks

IMCO backs the European Commission's work on privacy-preserving age assurance systems (i.e. tools that can verify a user's age without collecting unnecessary personal data). But it also makes a key point: age checks are no substitute for safer product design. It says that services should be safe by default and by design.

The report proposes a two-step model: (i) a universal minimum age of 13 for any access to social media, alongside (ii) a stricter EU-wide digital minimum age of 16 for social media, video platforms and AI companions unless parental authorisation is provided.

Enforcing the Digital Services Act

Words are one thing; enforcement is another. IMCO is urging the European Commission to fully deploy its enforcement toolkit under the DSA against services that endanger minors, including imposing significant fines and, if needed, suspending or banning non-compliant apps.

It also sets out priority actions it thinks that the Commission should consider:

  1. Personal liability for senior managers in cases of serious and repeated failures to protect minors (particularly failures in age verification).
  2. A ban on engagement-based recommender algorithms for minors, with the most addictive design features disabled by default.
  3. Strict limits on profiling minors, so recommender systems cannot present content based on behavioural tracking.
  4. A prohibition on gambling-like mechanisms such as loot boxes in games accessible to children, and a ban on monetising or incentivising "kidfluencing," where minors act as influencers.
  5. Targeted action on AI-powered nudity apps and their ethical and legal risks.
  6. Stronger enforcement under the AI Act against manipulative or deceptive chatbots.

Closing the gaps on digital fairness

IMCO also wants the forthcoming Digital Fairness Act to take square aim at persuasive techniques that exploit behavioural biases, especially when used on young people. That includes targeted advertising, influencer marketing, addictive design patterns, dark patterns, loot boxes, infinite scroll, autoplay, disappearing stories, and gamification mechanics that encourage compulsive engagement or spending. The goal is to prevent design choices that intentionally increase exposure to harmful content or push minors to overuse.

We do not yet know exactly what the Digital Fairness Act will propose or how it will interact with the DSA. But the direction of travel is clear: design choices will be held to a higher standard, and children's interests will come first.

What comes next

The European Parliament is due to vote on IMCO's recommendations at the 24–27 November plenary session. If approved, that vote will ratchet up pressure on the Commission to accelerate DSA enforcement against harmful design and content practices and to advance the Digital Fairness Act to close loopholes and align platform design with the best interests of minors.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
