Following consultation, the European Commission has published guidance on protecting children online under the Digital Services Act. Alongside the guidance, it has also issued a prototype of an age verification app.
The guidance aims to ensure that children benefit from high levels of privacy, safety and security on online platforms. Amongst other things, it includes recommendations on:
- Setting children's accounts to private by default so that their personal information and social media content are restricted to their connections, reducing the risk of unsolicited contact by strangers.
- Modifying the platforms' recommender systems to reduce the risk of children encountering harmful content or getting stuck in rabbit holes of specific content, including by giving children more control over their feeds.
- Enabling children to block and mute any user and ensuring they cannot be added to groups without their explicit consent, which could help to prevent cyberbullying.
- Prohibiting accounts from downloading or taking screenshots of content posted by children to prevent the unwanted distribution of sexualised or intimate content and sexual extortion.
- Disabling by default features that contribute to excessive use, such as communication "streaks," ephemeral content, "read receipts," autoplay and push notifications; removing persuasive design features aimed predominantly at engagement; and putting safeguards around AI chatbots integrated into online platforms.
- Ensuring that children's lack of commercial literacy is not exploited and that they are not exposed to commercial practices that may be manipulative, lead to unwanted spending or encourage addictive behaviours, including certain virtual currencies or paid loot boxes.
- Introducing measures to improve moderation and reporting tools, requiring prompt feedback on reports, and setting minimum requirements for parental control tools.
Like the DSA itself, the guidelines adopt a risk-based approach, recognising that online platforms may pose different types of risk to minors depending on their nature, size, purpose and user base. The guidance says that platforms should make sure that the measures they take are appropriate and do not disproportionately or unduly restrict children's rights.
Age verification
The guidance outlines when and how platforms should check the age of their users. It recommends age verification for adult content platforms and other platforms that pose high risks to the safety of minors, and specifies that age assurance methods should be accurate, reliable, robust, non-intrusive and non-discriminatory. The prototype verification app will be tested and further customised in collaboration with member states, online platforms and end-users.
Update from the UK
Ofcom is keeping busy when it comes to protecting children online as well. Last week it opened an enforcement programme to monitor whether providers are complying with their children's risk assessment and record-keeping duties under the Online Safety Act 2023. All user-to-user and search services were required to carry out, and keep a written record of, a children's access assessment by 16 April 2025 to establish whether their service is likely to be accessed by children. Where a service is likely to be accessed by children, the provider must also carry out a children's risk assessment by 24 July 2025, assessing the risks of the service and its associated content being harmful to children. Providers must also make and keep a written record, in an easily understandable form, of all aspects of every children's risk assessment, including details about how the assessment was carried out and its findings. Ofcom's rules for protecting children come into force on 25 July 2025.
The EU's guidance differs from the very detailed guidance provided by Ofcom, and it is likely that a provider that complies with the Ofcom guidance will comply with most requirements of the Digital Services Act as well as the Online Safety Act. However, the EU guidance covers some aspects that are not specifically addressed by the Online Safety Act, such as loot boxes. That said, last month Ofcom published three reports about the online lives of children in the UK. One covered financial concerns and mentioned loot boxes, dissociative features (such as bundled in-game currency packages), misleading features (such as drip pricing), social influence features (such as streak rewards) and impulse-activating features (such as time-limited mystery rewards). So, Ofcom is keeping an eye on how these affect children.
With more changes to the online safety regime on the way, platforms will need to stay on their toes.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.