EU Parliament Halts Law Permitting Tech Firms to Scan for Child Sexual Abuse
The European parliament has blocked the extension of a key law that allowed major technology companies to scan their platforms for child sexual abuse material, opening a legal void that child safety experts say will leave crimes undetected and cut reporting of abuse. The decision, sharply criticized by tech giants including Google, Meta, Snap, and Microsoft, follows long-running privacy objections from some lawmakers and leaves companies in an uncertain position, since they remain obliged under other regulations to remove illegal content.
Legal Gap Sparks Fears of Increased Abuse and Reduced Reporting
The law, adopted in 2021 as a temporary derogation from the EU's ePrivacy rules, permitted the use of automated detection technologies to scan messages for harms such as child sexual abuse material (CSAM), grooming, and sextortion. Its expiration on 3 April, without a parliamentary vote to extend it, has left a regulatory gap: scanning messages for these harms is now unlawful, yet companies remain obliged under the Digital Services Act to remove any illegal content hosted on their platforms, a contradiction that complicates enforcement efforts.
In a joint statement posted on a Google blog, the companies called the lapse "an irresponsible failure to reach an agreement to maintain established efforts to protect children online." They pledged to continue voluntary scanning for CSAM, but experts warn this may not be enough to prevent a sharp drop in abuse reports, pointing to a 58% decline observed during a similar legal hiatus in 2021.
Child Protection Advocates Warn of Dire Consequences
Child safety organizations have raised alarms about the potential impact of this lapse. John Shehan, vice-president at the National Center for Missing and Exploited Children (NCMEC), highlighted that "when detection tools are disrupted, we lose visibility that directly impacts our ability to find and protect child sexual abuse victims." He emphasized that abuse does not cease when detection capabilities are diminished, pointing to cross-border implications where perpetrators may exploit legal uncertainties to target minors in Europe.
In 2025, NCMEC received more than 21.3 million reports globally, containing over 61.8 million files of suspected child abuse material, with about 90% of reports originating outside the US. The EU parliament has not disclosed whether it assessed the consequences of letting the law expire, adding to concerns about preparedness and the prioritization of child safety measures.
Privacy Concerns and Technological Realities Clash in Tense Negotiations
For the past four years, negotiations over a permanent child sexual abuse regulation have been fraught with contention. Privacy advocates argue that scanning messages for abuse threatens fundamental rights, equating it to "chat control" that could enable mass surveillance and false positives. However, child protection experts counter that blocking CSAM does not infringe on privacy, as free speech does not encompass the sexual abuse of children.
Emily Slifer, director of policy at Thorn, a non-profit developing technology to detect online child abuse, explained that scanning systems use machine learning for pattern detection without storing data. Trained analysts review known CSAM from sources like police reports to generate unique digital fingerprints, or hash values, which platforms use to automatically block matching content. "The technology doesn't find babies in bathtubs and things like that," Slifer noted, clarifying that it distinguishes between abusive and consensual content based on distinct patterns.
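The hash-matching approach Slifer describes can be made concrete with a short sketch. The Python below is a minimal illustration under stated assumptions, not any platform's actual implementation: it uses an exact cryptographic hash (SHA-256) as a stand-in for the perceptual hashes, such as Microsoft's PhotoDNA, that platforms actually deploy, and the blocklist entry is a placeholder (the SHA-256 of an empty file).

```python
import hashlib

# Hypothetical blocklist of "digital fingerprints" (hash values) derived
# from known, analyst-verified abuse material. Production systems use
# perceptual hashes (e.g. Microsoft's PhotoDNA) that tolerate resizing
# and re-encoding; SHA-256 serves here only as an exact-match stand-in.
KNOWN_HASHES: set[str] = {
    # Placeholder entry; real lists are maintained by bodies like NCMEC.
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def fingerprint(data: bytes) -> str:
    """Return the hash value ("digital fingerprint") of an uploaded file."""
    return hashlib.sha256(data).hexdigest()

def should_block(upload: bytes) -> bool:
    """Flag an upload only if its fingerprint matches known material.

    Nothing about the file is stored, and no other content is inspected:
    the check is a single set-membership test on the hash.
    """
    return fingerprint(upload) in KNOWN_HASHES

if __name__ == "__main__":
    print(should_block(b""))       # True: the empty file's hash is listed above
    print(should_block(b"photo"))  # False: no fingerprint match
```

Real deployments prefer perceptual hashes over exact ones because trivially altered copies of an image, resized, cropped, or re-encoded, would otherwise slip past the match, while the basic mechanism remains the same: compare a fingerprint against a list of known material rather than inspecting or storing the content itself.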
Despite the block on scanning for child abuse, the EU has allowed voluntary scanning for terrorist content under 2021 legislation, a disparity that critics say undermines child protection efforts. Hannah Swirsky of the Internet Watch Foundation warned, "The EU is effectively risking open doors for predators," urging the establishment of a permanent legislative framework to safeguard children and enable detection.
As negotiations continue without a clear timeline, the lapse of the law underscores the difficult balance between privacy rights and the urgent need to combat online child exploitation, with ripple effects expected across global efforts to protect vulnerable children.