EU rolls out revised chat control plan, dropping mandatory scanning but keeping key risks
The EU’s updated proposal for the Regulation to Prevent and Combat Child Sexual Abuse removes mandatory scanning of all user communications, shifting to a voluntary detection model for messaging platforms. Denmark’s revised text is being presented as a compromise that protects encryption, yet it introduces new identity-verification rules, broad risk-mitigation duties, and vague provisions that critics say could reopen the door to surveillance. As negotiations accelerate toward year’s end, the debate now centres on whether the reform meaningfully protects rights or merely repackages the previous risks.
The latest revision of the EU’s Regulation to Prevent and Combat Child Sexual Abuse (CSAR) marks a significant departure from earlier versions of the so-called ‘Chat Control’ proposal. Denmark’s new draft removes the obligation for messaging platforms to perform blanket scanning of all user content; instead, detection of child sexual abuse material would become voluntary. This shift is presented as a way to ease long-standing concerns that the regulation would enable mass surveillance and undermine encryption and privacy, while still allowing platforms to deploy detection tools if they choose to do so.
However, the proposal introduces new uncertainties. A central feature of the revised text is Article 4, which requires platforms classified as high-risk to implement all appropriate risk mitigation measures. The wording is open to interpretation, and digital-rights groups argue that it leaves room for a de facto return of scanning mandates in the future. The draft also contains a review clause that explicitly empowers the European Commission to reassess whether detection should become mandatory at a later stage and, if needed, propose new legislation. Although the current text removes mass scanning, it does not close the door to its reintroduction.
Targeting requirements remain another point of contention. The European Parliament’s position has been clear: scanning should only occur when authorised by a court and directed at specific individuals or groups suspected of child sexual abuse. This would provide a higher level of due process and minimise the risk of broad, indiscriminate monitoring. Denmark’s compromise proposal takes a different path. It allows detection orders to apply to “identifiable parts” of a service, such as user communities or channels, but stops short of requiring that those targets be under suspicion. Critics note that this approach risks sweeping in large segments of users with no link to wrongdoing.
The revised text introduces measures that could curb online anonymity. New registration rules may require users to verify their identity or even provide biometric data, such as facial images. These provisions are aimed at limiting abuse, but they raise substantial concerns for groups who rely on anonymity for safety. Journalists, whistle-blowers, LGBTQ+ youth, and political dissidents may face new barriers to secure communication if anonymous accounts are no longer possible.
Age-verification obligations add a further layer of complexity. Providers would be required to implement technologies to determine whether users are minors and, in some cases, to restrict access accordingly. Yet the proposal does not explain how these mechanisms should function without creating new privacy risks, storing sensitive data, or introducing barriers for adults who cannot or will not share identification. Digital-rights organisations have repeatedly warned that mandatory age checks can undermine users’ privacy and significantly reshape how Europeans access online platforms.
Taken as a whole, the revised proposal attempts to address objections to mass scanning while preserving tools for child-protection efforts. But its flexible language, potential for future expansion, and new identification requirements leave unresolved questions about surveillance, data protection, and the long-term direction of EU policy. With Denmark eager to close negotiations before the Cypriot presidency takes over in January, legislators face a compressed timeline to settle these issues and strike a balance between child safety and fundamental rights in Europe’s digital environment.
