EU Launches Snapchat Safety Probe Over Child Exploitation Fears
The European Commission has initiated a formal investigation into Snapchat, driven by mounting concerns that the popular social messaging application is endangering children by exposing them to risks of grooming, sexual exploitation, and other criminal activities. This marks the first such case against Snapchat under the EU's Digital Services Act (DSA), a comprehensive regulatory framework designed to safeguard European society from a broad spectrum of online harms, including specific provisions to enhance child safety.
Regulatory Scrutiny and Allegations
Brussels regulators said they suspect Snapchat's services are being misused by adults who impersonate minors to lure children into dangerous situations. The app is also under scrutiny for potentially serving as a conduit for information about illicit drugs and age-restricted products such as alcohol and vaping devices. With Snapchat reporting 94.7 million monthly users in the EU, its widespread popularity among teenagers and young people amplifies these safety concerns.
Despite Snapchat's own terms requiring users to be at least 13 years old, EU authorities believe the company is failing to enforce this age limit effectively. They also allege that users receive insufficient guidance on privacy and safety features, and that mechanisms for reporting illegal content are not user-friendly. The investigation will involve a detailed examination, with regulators empowered to mandate preventive measures to protect children pending a final decision.
Snapchat's Response and Broader Context
In response to the probe, a Snapchat spokesperson emphasized that user safety and wellbeing are top priorities. "Snapchat is designed to help people communicate with close friends and family in a positive, trusted environment, with privacy and safety built in from the start – including additional protections for teens," the spokesperson stated. "As online risks evolve, we continuously review, strengthen, and invest in these safeguards."
This development follows a landmark ruling in a Los Angeles court, which found that social media giants Meta and YouTube had deliberately created addictive products harmful to young users. Concurrently, the EU is considering whether to emulate Australia by potentially banning social media access for individuals under 16 years old, reflecting a global trend towards stricter online safety regulations.
Parallel Action Against Adult Content Websites
In a separate announcement on Thursday, the European Commission also targeted four major pornographic websites—Pornhub, Stripchat, XNXX, and XVideos—accusing them of failing to prevent minors from accessing adult content. An investigation launched last May concluded that these platforms did not diligently identify or assess the risks they pose to children. The current self-declaration system, where users simply click a button claiming to be over 18, was deemed ineffective by EU regulators.
The companies now have the opportunity to review the findings and may close the investigation by implementing age verification methods approved by EU authorities. If the findings are upheld, they could face fines of up to 6% of their global annual turnover. Representatives for the parent companies of Pornhub and Stripchat, as well as a Brussels-based lawyer linked to XVideos and XNXX, were approached for comment but have not yet responded publicly.
These actions underscore the EU's intensified efforts to enforce the DSA, which has faced criticism from figures like Donald Trump since its implementation two years ago. As digital platforms continue to evolve, regulatory bodies are increasingly focused on balancing innovation with robust protections for vulnerable users, particularly children, in the online environment.