EU Commission issues guidelines to strengthen online protection for minors under the Digital Services Act

The European Commission published detailed guidelines to support the implementation of Article 28 of the Digital Services Act (DSA), aiming to enhance the online privacy, safety, and security of minors. These guidelines respond to growing concerns about the digital risks children face and offer a framework for online platforms to adopt more child-safe practices.

The guidelines propose a series of measures that platforms accessible to minors should consider. These measures target risks such as exposure to harmful or illegal content, grooming, cyberbullying, manipulative commercial practices, and the impact of addictive or excessive use. Special attention is given to algorithm- and AI-driven features, such as recommender systems and chatbots, which may amplify these risks.

Among the recommended safeguards are setting children’s accounts to private by default, restricting unsolicited contact, limiting the visibility of minors’ profiles and content, disabling auto-play and streak features, and improving parental control and reporting tools. Platforms are also encouraged to restrict high-risk content, such as pornography, gambling, or adult-oriented advertising, using robust age verification methods. These methods must be accurate, reliable, and non-intrusive, and should avoid collecting unnecessary personal data. Notably, the guidelines point to the forthcoming EU Digital Identity Wallet and an EU-wide age verification solution as reference models.
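To make the recommended defaults concrete, the sketch below shows what "safe by default" settings for a minor's account could look like in code. It is a minimal illustration only: the type and function names are our own assumptions and are not drawn from the guidelines or from any real platform's API.

```typescript
// Hypothetical illustration of "safe by default" settings for a minor's
// account; names and structure are assumptions, not from the guidelines.

interface AccountSettings {
  profileVisibility: "private" | "contacts" | "public";
  allowUnsolicitedContact: boolean;  // direct messages from unknown users
  discoverableInSearch: boolean;     // profile shown in search/suggestions
  autoPlayEnabled: boolean;          // continuous media playback
  streaksEnabled: boolean;           // engagement-driving streak features
  parentalControlsOffered: boolean;  // parental control tools surfaced

}

// Defaults applied when age assurance indicates the user is a minor:
// private account, no unsolicited contact, no auto-play or streaks.
function defaultSettingsForMinor(): AccountSettings {
  return {
    profileVisibility: "private",
    allowUnsolicitedContact: false,
    discoverableInSearch: false,
    autoPlayEnabled: false,
    streaksEnabled: false,
    parentalControlsOffered: true,
  };
}
```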

The Commission emphasises a ‘safety and privacy by design’ approach, urging platforms to integrate safeguards into service architecture from the outset. The guidance calls on providers to carry out periodic risk assessments and child rights impact reviews, take into account the evolving capacities of children, and involve youth voices in the design of safety tools. Importantly, the measures must be balanced to ensure that they do not unduly limit minors’ fundamental rights, including freedom of expression and access to information.
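One way to picture the 'by design' principle is shown in the sketch below: protective defaults are wired in at account creation based only on a coarse age band, so the platform never needs to store a date of birth or identity document. Again, every name here is a hypothetical assumption rather than an API described in the guidelines.

```typescript
// Hypothetical sketch: safeguards applied at account creation rather than
// offered as opt-in settings, using only a minimal age-band signal.

type AgeBand = "under13" | "13to17" | "18plus"; // coarse result of age assurance

interface Account {
  username: string;
  profilePrivate: boolean;
  autoPlayEnabled: boolean;
}

function createAccount(username: string, ageBand: AgeBand): Account {
  const isMinor = ageBand !== "18plus";
  return {
    username,
    // Data minimisation: the decision uses the age band only; no birth
    // date or ID document ever reaches the platform's data store.
    profilePrivate: isMinor,    // private by default for minors
    autoPlayEnabled: !isMinor,  // auto-play disabled for minors
  };
}
```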

While adherence to the guidelines is voluntary, they serve as a benchmark for assessing compliance with Article 28(1) of the DSA. The Commission, national Digital Services Coordinators, and other regulators may use these guidelines to guide enforcement and evaluate whether platforms have adopted adequate measures to protect minors online. The guidelines were developed following consultations with experts, regulators, civil society, and children themselves.

The publication of these guidelines reflects the EU’s broader commitment to building a safer digital environment for all, especially the youngest and most vulnerable users.
