EU guidelines on keeping children safe online under the Digital Services Act

The European Commission has published new guidelines to help online platforms comply with the Digital Services Act (DSA) and better protect children and teenagers using digital services. Adopted in July 2025, these guidelines explain in practical terms what online services must do to ensure that minors can use the internet safely, with respect for their rights and well-being.
The DSA applies to all online platforms available in the European Union, regardless of where they are based. Its rules require platforms to provide a high level of privacy, safety, and security for minors. The new guidance focuses on how platforms should meet these obligations through age verification, safer design choices, clear labelling of commercial content, and stronger moderation systems.
Scope and principles
The guidelines apply to any service that is used, or likely to be used, by minors. This includes social media platforms, video-sharing sites, gaming environments, and community forums. The Commission underlines that it is not enough for a service to declare it is ‘for adults only’ if in practice young users can access it.
Three main principles guide the recommendations: children’s rights come first, safety by design, and understanding user needs. This means platforms must integrate safety and privacy protections into their services from the start, rather than reacting after harm occurs.
Assessing risks and verifying age
Platforms are required to identify potential risks for minors, such as cyberbullying, harmful or illegal content, and excessive screen time. They should then put in place effective measures to reduce these risks without unnecessarily limiting access to beneficial content and interactions.
A key component of the guidelines is age assurance, which refers to methods for determining whether users are old enough to access certain services. The guidelines distinguish between three approaches:
- Self-declaration (entering a date of birth or clicking a confirmation box) — considered unreliable;
- Age estimation (using tools such as facial analysis or behaviour patterns);
- Age verification (checking official IDs or trusted digital credentials, such as the upcoming EU Digital Identity Wallet, expected in 2026).
Platforms are encouraged to choose the least intrusive and most reliable method for their users, explaining the process in plain language and offering appeals when age assessments are inaccurate.
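As a rough illustration of this tiered approach, the sketch below (in Python, with entirely hypothetical names and risk levels) picks the least intrusive method that still matches the risk a service poses to minors; the guidelines themselves do not prescribe any particular implementation.

```python
from enum import Enum

class AgeAssuranceMethod(Enum):
    SELF_DECLARATION = "self_declaration"  # user-entered date of birth; easy to falsify
    AGE_ESTIMATION = "age_estimation"      # e.g. facial analysis or behavioural signals
    AGE_VERIFICATION = "age_verification"  # official ID or trusted digital credential

def choose_age_assurance(risk_level: str) -> AgeAssuranceMethod:
    """Pick the least intrusive method that is still reliable enough for the
    risk the service poses to minors (hypothetical policy, not from the guidelines)."""
    if risk_level == "high":    # e.g. adult-only or gambling-like content
        return AgeAssuranceMethod.AGE_VERIFICATION
    if risk_level == "medium":  # e.g. open social features, contact with strangers
        return AgeAssuranceMethod.AGE_ESTIMATION
    return AgeAssuranceMethod.SELF_DECLARATION  # low-risk services only
```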
Safer design and user settings
The guidelines emphasise that most users, including young ones, rarely change default settings. Platforms must therefore ensure that the default options are private and secure: they should limit who can contact minors, turn off geolocation and tracking features, prevent others from downloading content posted by minors, and disable notifications that encourage constant use.
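A minimal sketch of what "private and secure by default" could look like in practice is shown below; the field names and values are assumptions chosen for illustration, not settings mandated by the guidelines.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MinorDefaultSettings:
    """Illustrative defaults for an account identified as belonging to a minor."""
    who_can_contact: str = "approved_contacts_only"  # limit unsolicited contact
    geolocation_enabled: bool = False                # no location sharing or tracking
    allow_content_download: bool = False             # others cannot download the minor's posts
    notifications: str = "essential_only"            # no nudges toward constant use
    profile_visibility: str = "private"              # not discoverable by strangers

# New minor accounts simply receive these defaults; any later change should be
# explained in language a child can understand.
DEFAULTS_FOR_MINORS = MinorDefaultSettings()
```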
Design features that promote overuse, such as infinite scrolling, streaks, or daily rewards, should be avoided. Interfaces should be easy to navigate, accessible to all children (including those with disabilities), and provide clear warnings when interacting with AI tools.
Responsible algorithms and advertising
The guidance calls on platforms to make their recommender systems (the algorithms that suggest content) transparent and controllable. Children should be able to adjust what they see, reset their feed, and understand why specific content is recommended. Unsafe or inappropriate content must be filtered out.
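The sketch below shows one way such controls could sit on top of an existing ranking engine; `base_recommender`, `suitability_classifier`, and the item fields are all hypothetical, and the guidelines do not require any particular architecture.

```python
class MinorSafeRecommender:
    """Illustrative wrapper around an existing ranking engine (hypothetical API)."""

    def __init__(self, base_recommender, suitability_classifier):
        self.base = base_recommender               # existing recommender system
        self.is_suitable = suitability_classifier  # True if an item is age-appropriate

    def recommend(self, user, limit=20):
        items = []
        for item in self.base.rank(user):           # assumed method on the base engine
            if not self.is_suitable(item):
                continue                            # filter out unsafe or inappropriate content
            item.explanation = f"Shown because you follow {item.source}"  # transparency
            items.append(item)
            if len(items) == limit:
                break
        return items

    def reset_feed(self, user):
        """Let the user wipe the interest profile that drives their recommendations."""
        self.base.clear_profile(user)               # assumed method on the base engine
```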
When it comes to advertising, platforms must clearly distinguish between paid and organic content. All ads, product placements, and influencer promotions should be clearly labelled and suitable for minors. The guidelines discourage manipulative design practices such as countdown timers or “buy now” prompts, and call for bans on gambling-like elements such as loot boxes in games.
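As an illustration of explicit labelling, the following sketch marks commercial items and strips urgency prompts before they reach a minor's feed; all field names are assumptions, not part of the guidelines.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FeedItem:
    item_id: str
    body: str
    is_commercial: bool = False      # paid ad, product placement, or sponsored post
    sponsor: Optional[str] = None

def render_for_minor(item: FeedItem) -> dict:
    """Attach an explicit label to commercial content and drop pressure tactics."""
    view = {"id": item.item_id, "body": item.body}
    if item.is_commercial:
        view["label"] = f"Paid partnership with {item.sponsor or 'an advertiser'}"
        view["countdown_timer"] = None      # no urgency prompts for minors
        view["one_click_purchase"] = False  # no "buy now" shortcuts
    return view
```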
Moderation, reporting, and parental tools
The guidelines require platforms to maintain effective moderation systems that detect and remove harmful or illegal material promptly. Moderators should be trained to identify threats such as grooming or cyberbullying. Reporting tools should be easy to find and use, with quick responses and feedback for users who report problems.
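A simplified reporting flow consistent with these points might look like the sketch below; the categories, priority rules, and status values are assumptions rather than anything specified in the guidelines.

```python
import uuid
from datetime import datetime, timezone

REPORT_CATEGORIES = {"grooming", "cyberbullying", "illegal_content", "other"}
HIGH_PRIORITY = {"grooming", "cyberbullying", "illegal_content"}

def submit_report(content_id: str, category: str, reporter_id: str) -> dict:
    """Record a user report, acknowledge it immediately, and flag urgent cases
    for trained moderators (illustrative flow, not a real platform API)."""
    if category not in REPORT_CATEGORIES:
        category = "other"
    return {
        "report_id": str(uuid.uuid4()),
        "content_id": content_id,
        "category": category,
        "reporter_id": reporter_id,
        "priority": "high" if category in HIGH_PRIORITY else "normal",
        "received_at": datetime.now(timezone.utc).isoformat(),
        "status": "acknowledged",  # the reporter should get feedback at each status change
    }
```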
Parents and guardians should have access to supportive tools that foster communication rather than surveillance. These tools must respect children’s privacy and inform them when parental features are activated.
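A minimal sketch of that transparency requirement, assuming a hypothetical `notify` callable and a `parental_features` set on the child's account:

```python
def enable_parental_feature(child_account, feature_name: str, notify) -> None:
    """Turn on a supervision feature and tell the child about it (illustrative).

    `child_account.parental_features` is assumed to be a set, and `notify` a
    callable that delivers an in-app message to the child."""
    child_account.parental_features.add(feature_name)
    notify(
        child_account,
        f"A parent or guardian has turned on '{feature_name}' for your account. "
        "You can see what it does in your settings.",
    )
```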
Oversight and next steps
Enforcement of the DSA is shared between the European Commission and national Digital Services Coordinators (DSCs) in each EU country. DSCs can appoint expert organisations known as trusted flaggers, whose alerts about illegal or harmful content require prompt action from platforms.
Looking ahead, the Commission and member states plan to test and implement the EU age-verification app, develop an EU action plan against cyberbullying, and study the impact of social media on children’s mental health.