EU Commission opens investigation into Snapchat over child safety concerns under DSA

The European Commission has launched formal proceedings to assess whether Snapchat is complying with EU rules on protecting minors online, focusing on age checks, harmful content, and platform design.

The European Commission has opened a formal investigation into Snapchat to assess whether the platform complies with the EU’s Digital Services Act (DSA) in protecting children online.

The probe examines whether Snapchat ensures a high level of safety, privacy, and security for minors. The Commission has identified several areas where the platform may fall short, including age verification, exposure to harmful interactions, and access to illegal or age-restricted content.

One key concern relates to age assurance. Snapchat requires users to be at least 13 years old, but the Commission suspects that relying on self-declared age is not sufficient. This approach may allow younger children to access the platform and does not reliably distinguish between younger teens and older users, which is necessary to provide age-appropriate protections.

The investigation also focuses on risks of grooming and criminal exploitation. The Commission suspects that Snapchat may not adequately prevent adults from posing as minors, which could expose children to harmful contact, including sexual exploitation or recruitment into criminal activities.

Default account settings are another area under scrutiny. According to the Commission, features such as automatic friend recommendations and notifications that are switched on by default may not provide sufficient safeguards for younger users. It also notes that users are not clearly guided on how to adjust privacy and safety settings when creating an account.

The Commission is further examining whether Snapchat effectively limits the spread of content related to illegal goods, such as drugs, or age-restricted products like alcohol and vaping products. Current moderation systems may not be sufficient to prevent such content from reaching users, including minors.

In addition, the Commission raises concerns about how users report illegal content. Reporting tools may be difficult to access or use, and users may not be clearly informed about how to challenge platform decisions.

The investigation follows earlier action by the Dutch Authority for Consumers and Markets, which examined the sale of vaping products to minors through the platform. The Commission has now taken over the case and will conduct a broader assessment.

As part of the proceedings, the Commission may request further information, carry out inspections, and, if necessary, impose interim measures or penalties. Snapchat may also propose changes to address the concerns identified.
