Commission steps up DSA enforcement on child safety and platform design

The European Commission is increasing enforcement of the Digital Services Act, focusing on risks to minors, platform design features, and age-verification systems.

The European Commission has intensified enforcement actions under the Digital Services Act (DSA), focusing on how online platforms address risks to minors.

According to the Commission, recent actions target issues such as exposure to harmful or restricted content, the use of design features that may encourage prolonged use, and the effectiveness of age-verification systems.

Enforcement measures have involved several major platforms, including TikTok, Facebook, Instagram, Snapchat, and Shein. The Commission has raised concerns about features such as infinite scrolling, autoplay functions, and personalised recommendation systems, which may affect how minors interact with content.

Additional actions have been taken against online adult content platforms for not implementing effective age-verification mechanisms.

Henna Virkkunen, Executive Vice-President of the Commission, stated that these measures are intended to ensure platforms are accountable for how their services affect children and young users.

Alongside enforcement, the Commission is developing a digital age verification system. The tool is designed to allow users to prove their age without sharing unnecessary personal data, using privacy-preserving technologies such as zero-knowledge proofs.
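The privacy goal described above can be illustrated with a minimal sketch: an issuer who has seen the birthdate once attests only a boolean claim ("over 18"), and the platform verifies that attestation without ever receiving the birthdate. This is a conceptual illustration only; names and the shared-key HMAC are stand-ins, and a real deployment would rely on zero-knowledge proofs or selective-disclosure credentials rather than a key shared with verifiers.

```python
# Conceptual sketch (hypothetical, not the Commission's protocol): the
# issuer attests a boolean age claim so a platform can verify it without
# ever seeing the user's birthdate.
import hmac
import hashlib

ISSUER_KEY = b"demo-issuer-secret"  # stand-in for the issuer's signing key


def issue_age_token(birth_year: int, current_year: int) -> dict:
    """Issuer checks the birthdate once, then attests only a boolean."""
    over_18 = current_year - birth_year >= 18
    claim = b"age_over_18" if over_18 else b"age_under_18"
    tag = hmac.new(ISSUER_KEY, claim, hashlib.sha256).hexdigest()
    return {"claim": claim.decode(), "tag": tag}  # no birthdate included


def verify_age_token(token: dict) -> bool:
    """Platform verifies the attestation; it never learns the birthdate."""
    expected = hmac.new(ISSUER_KEY, token["claim"].encode(),
                        hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, token["tag"])
            and token["claim"] == "age_over_18")


token = issue_age_token(birth_year=2000, current_year=2025)
print(verify_age_token(token))  # True: claim verified, birthdate never shared
```

The design choice the sketch highlights is data minimisation: the verifier receives only the attested claim and a tag, which is the property the Commission's tool aims to provide with stronger, zero-knowledge cryptography.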

The system is currently being tested in several EU Member States and may be deployed either as a standalone application or integrated into national digital identity systems.

In parallel, the Commission is working on a coordination mechanism to align national approaches to age verification. The aim is to create a consistent framework across the EU, reducing fragmentation while maintaining data protection standards.
