Narinder Kaur's Deepfake Ordeal Highlights Need for Perpetrator Accountability

Broadcaster and campaigner Narinder Kaur has spoken out about the pervasive issue of deepfake sexual abuse, revealing that she has personally been targeted with non-consensual AI-generated images and videos. In a candid discussion, she praised Sir Keir Starmer's recent proposals to tackle the problem but emphasised that one critical element is missing to truly eradicate it: holding perpetrators accountable.

Starmer's Tech Crackdown: A Step Forward

This week, Sir Keir Starmer announced that tech companies must remove revenge porn and deepfake sexual images within 48 hours or risk being blocked in Britain. He labelled the situation a national emergency, highlighting how the online world has become fundamentally unsafe, especially for women and girls. Kaur supports this move, noting it shifts responsibility onto platforms that enable such abuse.

However, Kaur argues that removal alone is insufficient. She points out that without consequences for those creating and distributing deepfakes, the abuse will persist. The ease with which AI tools can generate harmful content—such as Grok on Elon Musk's platform X being used to create sexualised images of women—makes it imperative to address both platform liability and individual accountability.

Kaur's Personal Experience with Digital Violation

Kaur shared disturbing details of the deepfakes created of her without consent, including:

  • Being digitally unclothed in manipulated images.
  • Depictions of her in degrading scenarios, such as giving oral sex or being molested by Donald Trump.
  • A paparazzi photo altered to be more revealing and placed next to Jeffrey Epstein.
  • A recent image showing her dead and pushed in a wheelbarrow by Rupert Lowe, posted by an anonymous account based in Japan.

These incidents underscore the emotional and digital scars left by such abuse. Kaur described how seeing her face used in sexual or degrading contexts creates a permanent sense of vulnerability, stripping away safety and control.

The Challenge of Anonymous, Cross-Border Perpetrators

A major hurdle in combating deepfake abuse is the anonymity and international reach of perpetrators. Kaur highlighted that accounts often operate overseas, beyond easy legal reach, making it difficult for victims to secure justice. Law enforcement struggles to identify those responsible, and cases frequently hit dead ends with no charges brought.

This is not just a problem for public figures like Kaur. She stressed that the majority of victims are ordinary women and girls—teenagers, young women, and older individuals who may not even understand the technology used against them. They face immense challenges in fighting anonymous abusers across borders, with little protection or support.

Cultural Attitudes and Legal Gaps

Kaur criticised the culture that tries to legitimise this abuse, such as blaming women for being outspoken or posting photos online. She linked it to a wider trend of treating women's bodies as public property, exacerbated by AI making it easier to create convincing fake pornography without needing private images.

Shockingly, many dismiss digital abuse as not a real crime, arguing that reporting it wastes police time. Kaur countered that this mindset reveals why stronger laws are essential. Digital sexual abuse is a violation of dignity, safety, and autonomy, and the harm is as real as in other forms of abuse. The law must protect all victims equally, treating digital abuse with the seriousness it demands.

The Path Forward: Accountability and Consequences

While Starmer's proposals are an important first step in forcing tech companies to act, Kaur insists that accountability for perpetrators is crucial. Tech companies cannot hide behind neutrality while profiting from abusive content; they must face consequences for enabling harm. Similarly, legal systems need to adapt to hold individuals accountable, even across borders.

Digital sexual abuse is not harmless or trivial—the images may be virtual, but the damage is painfully real. Until both platforms and perpetrators are held fully accountable, women and girls will continue to pay the price. Kaur's message is clear: eradication requires a dual focus on removal and retribution to ensure justice and safety in the digital age.