AI Voice Cloning Supercharges Extremist Propaganda, Experts Warn
Extremists use AI voice cloning to spread propaganda

Extremist movements across the ideological spectrum are harnessing advanced artificial intelligence to supercharge their propaganda efforts, with experts warning that voice-cloning technology is becoming a potent tool for growth and recruitment.

Neo-Nazis and the AI Voice Revolution

On the far right, the adoption of AI voice-cloning software has been particularly prolific. According to research from the Global Network on Extremism and Technology (GNET), creators are using services like ElevenLabs to process archival Third Reich speeches, generating new versions that mimic Adolf Hitler's voice in English. These clips have garnered tens of millions of streams across platforms including X, Instagram, and TikTok.

The technology is also being used to revive more contemporary extremist texts. In late November, an audiobook version of Siege—an insurgency manual by proscribed American neo-Nazi terrorist James Mason—was created using a custom AI voice model of the author. A prominent neo-Nazi influencer on X and Telegram, who stitched the project together, praised the "startling accuracy" of hearing Mason's predictions from "pre-internet America" in a modern voice.

"Siege has a more notorious history due to its cultlike status among some in the online extreme right, promotion of lone actor violence, and being required reading by several neo-Nazi groups that openly endorse terrorism," said Joshua Fisher-Birch, a terrorism analyst at the Counter Extremism Project. The manual was instrumental for groups like the Base, whose members were swept up in a nationwide FBI counterterrorism probe in 2020.

Jihadist Groups and AI-Powered Translation

The utility of AI extends to jihadist terrorist organisations as well. Lucas Webber, a senior threat intelligence analyst at Tech Against Terrorism, notes that pro-Islamic State media outlets on encrypted networks are actively "using AI to create text-to-speech renditions of ideological content from official publications."

This transforms dense, text-based propaganda into engaging multimedia narratives, helping the group's messaging reach a wider audience. The technology also provides seamless, contextually accurate translations of extremist teachings from Arabic into multiple languages, a task that once required figures like the late al-Qaeda operative Anwar al-Awlaki to personally voice English-language lectures.

On Rocket.Chat, the Islamic State's preferred communications platform, a user posted a video in October with Japanese subtitles, remarking on how difficult translation had been before AI. Although the poster claimed only limited use of AI for the audio, the post underscored the group's awareness of these tools for broadening its appeal.

A Persistent Technological Arms Race

Across the board, extremist factions are leveraging freely available AI applications. Groups adjacent to the Base have used OpenAI's ChatGPT for creating imagery and, as early as 2023, acknowledged its use for streamlining planning and research.

This development continues a persistent pattern in which counterterrorism authorities play catch-up with groups exploiting emergent technologies. Extremists have previously leveraged cryptocurrencies for anonymous fundraising and circulated design files for 3D-printed firearms.

"The adoption of AI-enabled translation by terrorists and extremists marks a significant evolution in digital propaganda strategies," warned Webber. He explained that earlier methods were limited by language fidelity, but advanced generative AI now preserves "tone, emotion, and ideological intensity" across languages, presenting a formidable new challenge for regulators and security services worldwide.