On 14 July 2025, the European Commission (Commission) released its first guidelines under Article 28(4) of the Digital Services Act (DSA) (Guidelines). The Guidelines are relevant to all online platforms accessible to minors (with the exception of micro and small enterprises) and set a new benchmark for child safety online. This article highlights the key measures and recommendations of the Guidelines for all online platforms.
By way of recap, Article 28 of the DSA sets out four core provisions for online platform providers regarding the online protection of minors, namely that:
- appropriate and proportionate measures should be implemented where the platform is accessible to minors to protect their privacy, safety, and security;
- profiling-based advertising should not be used if the provider is aware that the user is a minor;
- online platform providers are not obliged to process additional personal data to determine a user's age; and
- under Article 28(4), the Commission may issue guidelines to assist online platforms in applying the above-listed obligations.
The latest Guidelines are the first issued under Article 28(4).
Although not legally binding, the Guidelines are intended to assist Digital Services Coordinators, such as Coimisiún na Meán in Ireland, in interpreting and applying Article 28 of the DSA, with the Commission describing them as a "significant and meaningful benchmark" for assessing compliance by online platforms accessible to minors. As such, providers would be ill-advised to disregard them. Notably, while the primary aim is to ensure a high level of privacy, safety, and security for minors, the Guidelines also encourage providers to adopt these protective measures more broadly, for the benefit of all users. Therefore, even platforms not specifically targeting minors should consider the Guidelines in light of this broader recommendation.
Principles to be embedded in any measures adopted
At the core of the Guidelines are four interconnected principles that the Commission recommends online platform providers should apply holistically when implementing measures under Article 28(1) of the DSA:
- First, any action must be proportionate and appropriate to the platform's specific risks, requiring a case-by-case assessment based on its nature, scale, and potential impact on children's rights.
- Second, all measures must take into account children's fundamental rights, including protection, privacy, non-discrimination, access to information, and participation, in line with the EU Charter of Fundamental Rights and the UN Convention on the Rights of the Child (UNCRC), with the best interests of the child as a guiding principle.
- Third, platforms should embed privacy, safety, and security into the design of their services from the outset.
- Finally, features should be tailored to children's developmental stages, ensuring accessibility and usability across age groups.
Main measures and recommendations
In addition to the core principles, the Guidelines set out a range of practical, risk-based measures and recommendations that the Commission expects providers to implement to ensure a high standard of privacy, safety, and security. Below is a summary of selected examples from the non-exhaustive list included in the Guidelines:
1. Risk review to be completed
Given the wide variation in risk profiles across online platforms, and in order to determine which measures are most suitable, providers are encouraged to carry out a comprehensive risk review addressing eight key areas. These include the likelihood of minors accessing the service, the actual or potential impact on their privacy, safety, and security, and the effectiveness of any existing safeguards.
The review should be conducted at least annually, or whenever a significant change to the platform's design or operating context could affect child safety. Providers should bear in mind that they are expected to share the outcomes of these reviews with relevant supervisory authorities and publish the findings before the next review cycle. The Guidelines also recommend that providers consider submitting their assessments to independent experts or relevant stakeholders for feedback.
While the Guidelines state that the Commission may issue further tools or guidance to support this process in the future, providers are encouraged to draw on existing standards and child rights impact assessment tools. As a practical first step, we recommend that online platform providers review their current risk assessment processes and evaluate how the steps outlined in the Guidelines can be integrated into their existing frameworks.
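By way of illustration only, the sketch below shows one way a provider might record the outcome of such a review and track when the next one falls due. All field and function names are hypothetical assumptions for the purposes of this example; the Guidelines list eight review areas in total, of which only the three named above are shown.

```python
# Illustrative sketch only: a minimal record of a child-safety risk review. Field names are
# hypothetical; the Guidelines identify eight review areas, three of which are shown here.
from dataclasses import dataclass, field
from datetime import date, timedelta


@dataclass
class ChildSafetyRiskReview:
    review_date: date
    likelihood_of_minor_access: str           # e.g. "high", "medium", "low"
    impact_on_privacy_safety_security: str    # actual or potential impact identified
    effectiveness_of_existing_safeguards: str
    findings_published: bool = False          # findings to be published before the next cycle
    shared_with_supervisory_authority: bool = False
    significant_changes_since_review: list[str] = field(default_factory=list)

    def next_review_due(self) -> date:
        # At least annually, or sooner where a significant change could affect child safety.
        if self.significant_changes_since_review:
            return date.today()
        return self.review_date + timedelta(days=365)


review = ChildSafetyRiskReview(
    review_date=date(2025, 9, 1),
    likelihood_of_minor_access="high",
    impact_on_privacy_safety_security="risk of unsolicited contact identified",
    effectiveness_of_existing_safeguards="default privacy settings partially effective",
)
print(review.next_review_due())
```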
2. Service-design related measures
In the Guidelines, the Commission presents several approaches to designing online platform services that align with the requirements of Article 28 of the DSA. For example:
2.1 Age Assurance
Before implementing any age assurance method, the Guidelines encourage providers to assess its necessity and proportionality to ensure high privacy, safety, and security standards for minors. This includes evaluating whether these objectives could be achieved through less intrusive means. The use of existing risk and child rights impact assessment tools is encouraged to support this analysis, and providers are encouraged to publish the assessment results, regardless of outcome.
The Guidelines identify three types of age assurance: self-declaration, age estimation and age verification. While these methods differ in accuracy, the Commission notes that accuracy alone does not determine their impact on children's rights. Depending on the context, more accurate methods may raise significant concerns, while less accurate ones may be less intrusive and more rights-preserving.
Notably, where age estimation involves the processing of personal data, providers should consult the European Data Protection Board's (EDPB) Statement 1/2025. It outlines ten principles for GDPR-compliant age assurance, such as data minimisation, purpose limitation, and proportionality. For providers unfamiliar with these principles, our article 'Navigating Age Assurance: Another Layer – Insights from the EDPB' offers a helpful overview and deeper insights into the EDPB's statement.
Crucially, the Commission emphasises that age assurance and access restrictions alone are insufficient. These must be implemented alongside the broader protective measures outlined throughout the Guidelines. Additionally, where only specific features of a service pose risks to minors, restrictions should be applied selectively to those elements, rather than to the entire platform.
The Guidelines also offer practical examples illustrating when different age assurance methods may be most suitable. Providers who have already implemented such measures should evaluate their current practices against the Commission's recommended use cases to ensure consistency and compliance. Likewise, for those starting from scratch, the use cases outlined in the Guidelines serve as a valuable reference point for determining the most appropriate approach in each context.
Looking ahead, providers should be aware that EU Member States are in the process of rolling out EU Digital Identity Wallets (the euID) to all citizens, residents, and businesses. These wallets, expected to be available by the end of 2026, will offer a secure, privacy-preserving means of electronic identification, enabling users to verify their identity and manage digital documents safely. In the interim, on 14 July 2025, the Commission announced a pilot standalone EU age verification solution, designed to meet the effectiveness criteria set out in the Guidelines. Once finalised, this tool will serve as a compliance benchmark and a reference standard for device-based age verification. Providers are encouraged to engage with the testing of early versions of this solution where possible.
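By way of illustration only, the sketch below shows how the proportionality principle might translate into a choice between the three age assurance methods identified in the Guidelines. The risk tiers and the mapping between them are hypothetical assumptions, not thresholds prescribed by the Commission.

```python
# Illustrative sketch only: selecting among the three age assurance methods named in the
# Guidelines in proportion to the risk an individual feature poses to minors. The risk
# tiers and the mapping are hypothetical, not prescribed by the Commission.
from enum import Enum


class AgeAssuranceMethod(Enum):
    SELF_DECLARATION = "self-declaration"
    AGE_ESTIMATION = "age estimation"
    AGE_VERIFICATION = "age verification"


def select_age_assurance(feature_risk: str) -> AgeAssuranceMethod:
    """Return the least intrusive method considered proportionate to the assessed risk."""
    mapping = {
        "low": AgeAssuranceMethod.SELF_DECLARATION,    # e.g. low-risk, general-audience features
        "medium": AgeAssuranceMethod.AGE_ESTIMATION,   # e.g. age-differentiated experiences
        "high": AgeAssuranceMethod.AGE_VERIFICATION,   # e.g. content or services restricted to adults
    }
    return mapping[feature_risk]


# Restrictions are applied selectively to the risky feature, not to the entire platform.
print(select_age_assurance("high"))
```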
2.2 Registration
The Guidelines recognise that registration and authentication are crucial for ensuring minors access online services safely and appropriately. The Commission sees these processes as a proportionate way to implement age assurance while upholding privacy, safety, and security. Where registration is not required, platforms should configure settings to protect unregistered users, especially from risks like adult impersonation. Where registration is required or offered, it should be accessible, transparent, and suited to minors' evolving capacities, without encouraging underage sign-ups. It should also serve as an opportunity to implement age assurance, highlight safety features, limit data collection to what is necessary, and obtain parental consent where required. Providers should review their registration processes to ensure alignment with the standards set out in the Guidelines.
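By way of illustration only, the sketch below shows a registration step that collects no more data than necessary, applies age assurance at sign-up, and flags where parental consent may be needed. The field names, the consent threshold and the logic are hypothetical assumptions; the applicable age of digital consent varies by Member State.

```python
# Illustrative sketch only: a registration flow that limits data collection to what is
# necessary, applies age assurance at sign-up and flags parental consent where required.
# All names are hypothetical; the age of digital consent varies by Member State.
from dataclasses import dataclass


@dataclass
class RegistrationRequest:
    username: str
    declared_birth_year: int   # collected only because it is needed for age assurance
    # Note: no address, phone number or interest data is collected at this stage.


def register(request: RegistrationRequest, current_year: int = 2025) -> dict:
    age = current_year - request.declared_birth_year
    return {
        "account_created": True,
        "protective_defaults_applied": age < 18,     # see the account settings section below
        "parental_consent_pending": age < 16,        # placeholder threshold only
        "safety_features_highlighted": True,         # registration used to surface safety tools
    }


print(register(RegistrationRequest(username="example_user", declared_birth_year=2012)))
```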
2.3 Account settings
The Guidelines emphasise that online platforms accessible to minors should be designed with protective default account settings and features that support informed decision-making. Since users often do not adjust default settings, providers are encouraged to configure them in ways that prioritise children's privacy, safety, and security. This includes:
- limiting interactions to approved contacts;
- restricting the visibility of posts and account information;
- disabling features such as geolocation, camera access, and contact syncing; and
- turning off the visibility of likes and user activity.
Platforms should also provide clear warnings when minors attempt to change these settings, along with regular reminders and simple options to restore the original configuration. In addition, the Guidelines recommend:
- avoiding persuasive design techniques (e.g. dark patterns) that encourage excessive use;
- offering accessible and customisable time management tools;
- ensuring that AI chatbots and filters are not enabled by default; and
- implementing safeguards that inform minors when they are interacting with AI rather than a human.
These safeguards should also alert users to the possibility that AI-generated content may be misleading or factually incorrect.
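By way of illustration only, the sketch below shows how protective defaults of this kind might be represented, together with a warning shown when a minor attempts to weaken them. The setting names are hypothetical assumptions and are not drawn from any particular platform.

```python
# Illustrative sketch only: protective default settings for accounts identified as belonging
# to minors, with a warning and an easy route back to the original configuration.
MINOR_DEFAULT_SETTINGS = {
    "interactions_limited_to_approved_contacts": True,
    "posts_visible_to": "approved_contacts_only",
    "account_info_visible_to": "approved_contacts_only",
    "geolocation_enabled": False,
    "camera_access_enabled": False,
    "contact_syncing_enabled": False,
    "likes_and_activity_visible": False,
    "ai_chatbots_and_filters_enabled": False,
    "time_management_tools_enabled": True,
}


def change_setting(settings: dict, key: str, value) -> dict:
    """Warn before a protective default is weakened and point to a simple way to restore it."""
    if settings.get(key) != value:
        print(f"Warning: changing '{key}' may reduce your privacy, safety or security.")
        print("You can restore the original settings at any time from the safety menu.")
    return dict(settings, **{key: value})


updated = change_setting(MINOR_DEFAULT_SETTINGS, "geolocation_enabled", True)
```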
2.4 Recommender systems and search features
Recommender systems influence how content and contacts are prioritised and presented to minors, shaping their online experience. If certain types of content are amplified at scale, these systems can pose serious risks to minors' privacy, safety, and security. The Guidelines advise providers to regularly test and adapt these systems for minors, in line with the risk review process outlined in Section 1 above. The Guidelines list several factors to consider when conducting reviews of such systems, including that providers should:
- prioritise explicit user-provided signals when determining recommendations;
- prevent the spread of illegal content involving minors; and
- limit exposure to potentially harmful content, especially when repeated, such as unrealistic beauty ideals or the glorification of mental health issues.
The Guidelines also state that this review process should include input from minors, guardians, and independent experts.
Additionally, providers should allow users to reset their feeds, present recommender system terms in a child-friendly format, explain why content is recommended, and offer accessible tools, ideally at account setup, for minors to customise their content preferences.
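By way of illustration only, the sketch below shows a ranking step that weights explicit, user-provided signals more heavily than inferred engagement signals, excludes content flagged as harmful to minors, and allows accumulated personalisation to be reset. The weights and field names are hypothetical assumptions, not values taken from the Guidelines.

```python
# Illustrative sketch only: ranking that prioritises explicit user-provided signals over
# implicit engagement signals, filters out content flagged as harmful to minors, and
# supports a feed reset. Weights and field names are hypothetical.
def rank_for_minor(items: list[dict], explicit_topics: set[str], implicit_topics: set[str]) -> list[dict]:
    def score(item: dict) -> float:
        value = 0.0
        if item["topic"] in explicit_topics:
            value += 2.0      # explicit, user-stated preference weighted most heavily
        if item["topic"] in implicit_topics:
            value += 0.5      # inferred interest contributes far less
        return value

    safe_items = [i for i in items if not i.get("flagged_harmful_to_minors", False)]
    return sorted(safe_items, key=score, reverse=True)


def reset_feed(implicit_topics: set[str]) -> set[str]:
    """Allow the user to clear inferred personalisation signals and start afresh."""
    implicit_topics.clear()
    return implicit_topics


items = [
    {"topic": "football", "flagged_harmful_to_minors": False},
    {"topic": "extreme dieting", "flagged_harmful_to_minors": True},
    {"topic": "music", "flagged_harmful_to_minors": False},
]
print(rank_for_minor(items, explicit_topics={"music"}, implicit_topics={"football"}))
```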
2.5 Commercial practices
The Guidelines set out several measures for online platforms to follow when initiating commercial practices involving minors. Providers should ensure that a minor's limited commercial literacy is not exploited and offer appropriate support. They must prevent exposure to harmful, unethical, or unlawful advertising and manipulative design techniques, such as scarcity tactics. Platforms should also avoid excessive volumes or frequency of commercial content that could lead to impulsive spending and ensure AI systems do not nudge minors toward purchases. Additionally, providers must avoid techniques that obscure the transparency of transactions, such as certain virtual currencies, which may mislead minors. This aligns with the European Consumer Protection Cooperation Network's 'Key Principles on In-game Virtual Currencies', published in March 2025.
This is merely a selection of the factors the Guidelines outline, and we strongly recommend that all online platforms consider the full text of the Guidelines and/or reach out to a member of our Technology team for comprehensive guidance on this issue.
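By way of illustration only, the sketch below shows one way a platform might present the real-money cost of an in-game purchase alongside the virtual currency price so that the transaction is not obscured. The bundle, exchange rate and wording are hypothetical assumptions rather than requirements quoted from the Guidelines or the CPC Network's principles.

```python
# Illustrative sketch only: showing the approximate real-money cost next to a virtual
# currency price, so the value of a transaction is transparent to a minor.
def describe_purchase(item_name: str, price_in_gems: int, gems_per_euro: float) -> str:
    real_cost_eur = price_in_gems / gems_per_euro
    return (
        f"{item_name}: {price_in_gems} gems "
        f"(approximately EUR {real_cost_eur:.2f} in real money)"
    )


print(describe_purchase("Season pass", price_in_gems=950, gems_per_euro=100.0))
```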
2.6 Moderation
Finally, the Guidelines highlight the importance of effective content moderation in protecting minors from harmful content and behaviour that could compromise their privacy, safety, and security. In addition to the obligations under Articles 14 to 17 of the DSA, the Guidelines recommend further steps for online platform providers. These include:
- clearly defining what constitutes harmful content for minors in consultation with children, civil society, and independent experts;
- establishing and regularly reviewing moderation policies;
- ensuring human oversight alongside automated tools;
- providing ongoing training and resources for moderation teams; and
- adopting effective technologies such as hash matching and URL detection to address known harmful or illegal content.
Online platform providers should review their moderation procedures and incorporate any additional elements from the Guidelines that are not yet reflected in their current practices.
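By way of illustration only, the sketch below shows how hash matching and URL detection against lists of known harmful or illegal material might fit into a moderation pipeline with human oversight. Exact SHA-256 matching is used purely for simplicity; in practice, providers typically rely on perceptual hashing and curated industry hash and URL lists, and all names here are hypothetical assumptions.

```python
# Illustrative sketch only: hash matching and URL detection against lists of known harmful
# or illegal material. Exact SHA-256 matching is used for simplicity; production systems
# typically use perceptual hashing and vetted industry lists, combined with human review.
import hashlib

KNOWN_HARMFUL_HASHES: set[str] = set()    # populated from a vetted hash list
BLOCKED_URLS: set[str] = set()            # populated from a vetted URL list


def matches_known_harmful_content(file_bytes: bytes) -> bool:
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HARMFUL_HASHES


def contains_blocked_url(text: str) -> bool:
    return any(url in text for url in BLOCKED_URLS)


def moderate_upload(file_bytes: bytes, caption: str) -> str:
    if matches_known_harmful_content(file_bytes) or contains_blocked_url(caption):
        return "removed_pending_human_review"   # automated tools operate alongside human oversight
    return "published"


print(moderate_upload(b"example upload", "hello"))
```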
3. Reporting, user support and tools for guardians
In addition to the DSA's core online platform obligations, the Guidelines recommend further measures to strengthen reporting, feedback, and complaints processes for minors. Providers should allow users to report inappropriate content, behaviour, accounts, or suspected underage users, and offer free access to internal complaint systems, particularly for age-related disputes. While the use of reporting categories is discouraged, any categories that are used should be adapted for young users.
Reports should default to being confidential and anonymous, with minors able to opt out of anonymity. For user support, platforms should provide clearly visible tools, limit reliance on AI-based assistance for children, and display warning messages with links to national helplines when minors search for or report harmful or illegal content.
Additionally, all platforms accessible to children should implement guardian control tools aligned with Article 28(1) of the DSA. These tools should promote communication and empowerment rather than control, be easy to activate without requiring guardian accounts, work across devices and operating systems, and notify minors in real time when monitoring features are enabled.
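By way of illustration only, the sketch below shows a report record that is confidential and anonymous by default (with the option to waive anonymity) and a guardian monitoring toggle that notifies the minor in real time when it is switched on. All field and function names are hypothetical assumptions.

```python
# Illustrative sketch only: reports default to confidential and anonymous, and enabling a
# guardian monitoring feature triggers a real-time notification to the minor.
from dataclasses import dataclass


@dataclass
class UserReport:
    reported_item_id: str
    reason: str                      # child-friendly wording, if categories are used at all
    confidential: bool = True        # default: not visible to the reported account
    anonymous: bool = True           # default: reporter identity withheld; minors may opt out


def notify_user(account_id: str, message: str) -> None:
    print(f"[notification to {account_id}] {message}")


def enable_guardian_monitoring(minor_account_id: str, feature: str) -> None:
    # Guardian tools aim at communication and empowerment rather than covert control.
    notify_user(minor_account_id, f"Guardian monitoring for '{feature}' has been turned on.")


report = UserReport(reported_item_id="post_123", reason="unwanted contact")
enable_guardian_monitoring("minor_456", "screen time overview")
```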
4. Governance
The Guidelines emphasise that strong platform governance is essential to properly prioritising minors' privacy, safety, and security. Providers are expected to adopt internal policies outlining how these protections will be upheld, assign responsibility to a dedicated team or individual with sufficient resources, and foster a culture that values child safety and participation in platform design.
In addition to the DSA's Article 14 obligations on terms and conditions, the Guidelines recommend that a platform's terms clearly explain the processes involved from account opening through to account closure, promote positive and inclusive community standards, and define harmful or illegal content affecting minors. These terms should also be easy to locate, searchable, regularly updated, and implemented in practice without unduly restricting children's rights.
Finally, from a governance perspective, transparency is a core principle, and platforms should make information about safety measures accessible to minors and, where appropriate, their guardians through user-friendly interfaces.
Concluding thoughts and next steps for online platform providers
These 64-page Guidelines represent a major step forward in the EU's efforts to protect minors online and offer valuable insight into how regulators expect Article 28, arguably one of the DSA's most significant provisions, to be applied in practice. As outlined in this article, the Guidelines go beyond the DSA's core obligations, introducing detailed recommendations that will likely vary in implementation depending on a platform's use of AI, recommender systems, and commercial practices. For some, aligning with these standards may require substantial investment. We strongly encourage all online platform providers whose services are accessible to children to assess the Guidelines against their current risk assessments, complaints procedures, age assurance methods, and recommender system practices to identify gaps. We are available to support providers with this analysis as needed.
Importantly, this is unlikely to be the final word on the matter. The Commission is expected to review the Guidelines within 12 months, or sooner if circumstances demand. As the digital landscape continues to evolve, regulatory expectations will also grow. Notably, Ireland's online safety code came into effect on 21 July 2025, following a nine-month grace period. Video-sharing platforms are now required to implement robust age verification systems, making the release of these Guidelines particularly timely. Therefore, providers must begin engaging with them now to stay ahead of emerging compliance obligations.
Contributed by Aishling Taite
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.