EU Commission finds Meta and TikTok may have breached transparency rules under the Digital Services Act
It is important to emphasise that these are preliminary findings. They do not amount to a final decision. The platforms in question now have the opportunity to review the investigation’s documents and submit written replies. The formal proceedings remain open.
The European Commission has issued preliminary findings indicating that Meta and TikTok may have violated key transparency and user rights obligations under the EU’s Digital Services Act (DSA). The assessment, published on 24 October 2025, follows months of investigation into whether both companies are providing adequate data access to researchers and effective tools for users to report illegal content or appeal moderation decisions.
The DSA, which entered into force in 2022, sets strict rules for very large online platforms, including Meta’s Facebook and Instagram, and TikTok. One of its central goals is to ensure transparency about how these platforms influence users and public discourse. According to the Commission, both companies appear to have created complex and restrictive procedures that limit researchers’ access to public data. Such barriers, it noted, leave researchers with incomplete or unreliable information, making it difficult to assess the spread of harmful or illegal content and its impact on users, especially minors.
The Commission stressed that data access for researchers is a core transparency obligation under the DSA. Allowing independent experts to study platform activity is meant to strengthen public oversight and help identify systemic risks—such as the promotion of misinformation or exposure to harmful material.
In Meta’s case, the Commission also examined whether Facebook and Instagram provide EU users with straightforward mechanisms to report illegal content. The preliminary conclusion is that these tools are neither user-friendly nor easily accessible, and that the company uses so-called ‘dark patterns’ – design techniques that confuse or discourage users from completing reports. Such practices could make it harder to remove illegal material quickly, which is a key requirement of the DSA.
Another issue concerns users’ ability to appeal content moderation decisions. The DSA gives every user in the EU the right to challenge a platform’s decision to remove their content or suspend their account. However, Meta’s current system reportedly does not allow users to explain why they disagree with the decision or to provide evidence supporting their case. The Commission said this undermines the purpose of the appeals process and limits users’ ability to defend their rights.
The findings are based on extensive cooperation with Coimisiún na Meán, Ireland’s Digital Services Coordinator, which oversees Meta’s compliance in the EU. Both Meta and TikTok now have the opportunity to respond to the Commission’s findings, examine the evidence, and propose corrective measures. These are not final conclusions; the platforms remain under investigation, and the Commission’s final decision will depend on their responses.
If the preliminary assessment is confirmed, the Commission could issue a formal non-compliance decision. This could result in fines of up to six percent of each company’s global annual turnover, or in periodic penalty payments until the companies meet their obligations.
The announcement also comes days before new DSA rules on researcher access to non-public data take effect on 29 October 2025. These rules will expand transparency obligations further, allowing approved researchers to access a broader range of platform data for accountability and risk-assessment purposes.
Why does it matter?
The Commission’s preliminary findings signal that major social media platforms face intensifying regulatory scrutiny in the EU, not only for the content they host but also for how they operate under the DSA: how transparent they are, how accessible their data is to independent researchers, and how usable their reporting and appeal mechanisms are for users. For Meta’s Facebook and Instagram and for TikTok, the focus has shifted from individual moderation decisions to the systems and interfaces that underpin transparency, reporting, and appeals.
For observers of digital regulation, this case is an early test of the DSA’s effectiveness: whether its obligations on paper can be translated into meaningful enforcement of process, data-access, and user-rights compliance. If the findings are upheld, the case may influence how platforms worldwide approach transparency and researcher engagement.
