Ofcom Accelerates Decision on Tech Rules to Block Illegal Intimate Images Online

Ofcom, the UK communications regulator, is fast-tracking its decision on new rules that will compel technology companies to block illegal intimate images online. The regulator has cited an "urgent need for better online protections for women and girls" as the driving force behind this accelerated timeline.

Hash Matching Technology and Implementation Timeline

Previously, Ofcom had proposed that websites and apps implement "hash matching" technology to detect and remove non-consensual intimate content, including deepfakes. This technology works by creating digital fingerprints of known illegal images to prevent their upload and distribution across platforms.
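The fingerprinting idea can be sketched as a minimal exact-match filter. This is a hypothetical illustration only: production hash-matching systems use perceptual hashes (PhotoDNA-style) so that resized or re-encoded copies of an image still match, whereas a cryptographic hash like SHA-256 only catches byte-identical files. All function and variable names below are invented for the example.

```python
import hashlib


def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest acting as the image's digital fingerprint.

    Note: a cryptographic hash changes completely if even one byte of
    the file changes, so this sketch only detects exact copies.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def is_blocked(image_bytes: bytes, known_hashes: set[str]) -> bool:
    """Reject an upload whose fingerprint matches a known illegal image."""
    return fingerprint(image_bytes) in known_hashes


# Hypothetical database of fingerprints of previously reported images
known = {fingerprint(b"previously-reported-image-bytes")}

print(is_blocked(b"previously-reported-image-bytes", known))  # True
print(is_blocked(b"new-upload-bytes", known))                 # False
```

The design point is that platforms never need to store or share the illegal images themselves, only their fingerprints, which can be distributed across services to block re-uploads.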

Originally scheduled for later this year, Ofcom's final decision on these critical measures has now been brought forward to May. Any new measures established under the Illegal Harms Code are expected to come into effect during the summer months, providing swifter protection for vulnerable users.


Additional Online Safety Proposals

Decisions on other online safety proposals, such as how tech firms should respond rapidly to spikes in harmful content, will be made in the autumn. Another significant proposal aims to make livestreaming safer for children by blocking harmful interactions and preventing abuse in real-time.

Campaigner Response and Practical Concerns

Elena Michael, a campaigner from the advocacy group #NotYourPorn, described the announcement as "incredibly welcome" but emphasized the need to see how these measures will function in practice. She highlighted the limitations of current approaches, stating: "Up until this point we've had quite a singular focus on criminalising the first individual or perpetrator who creates this kind of harm. However, with the nature of the internet, once that image or video is created and shared, many other actors play a role in facilitating and proliferating the harm, which means it's shared and reshared multiple times."

Michael added: "So really going after an individual perpetrator is not enough. We need comprehensive systems that address the entire chain of distribution and resharing."

Government Legislation and Tech Firm Accountability

This regulatory push coincides with the government tabling an amendment to the Crime and Policing Bill that would subject technology firms to stringent new requirements. The proposed legislation will mandate the removal of non-consensual intimate images within 48 hours of being reported.

Failure to comply with these requirements could result in substantial fines or services being blocked in the UK. Sir Keir Starmer, commenting on these developments, described this as the latest step in the "21st century battle against violence against women and girls" online, vowing to put tech firms "on notice" regarding their responsibilities.

Context and Industry Developments

The announcement comes amid broader industry developments, including Elon Musk's X platform restricting its Grok photo editing features due to concerns about sexualised images. This highlights the growing pressure on tech companies to proactively address harmful content rather than reacting after it has spread.

Ofcom's accelerated timeline reflects increasing public and political demand for stronger online protections, particularly for women and girls who are disproportionately affected by non-consensual intimate imagery. The regulator's approach combines technological solutions like hash matching with legislative backing to create a more robust framework for online safety.
