Roblox Mass Shooting Simulator Created by Suspect in Canadian School Attack
The 18-year-old suspect in a devastating high school shooting that occurred on Wednesday in British Columbia, Canada, had previously developed a mass shooting simulator on the popular gaming platform Roblox. This disturbing revelation has emerged following the tragic incident in Tumbler Ridge, a small coalmining community, which resulted in nine fatalities, including the perpetrator.
Details of the Virtual Simulator and Platform Response
According to initial reports by 404 Media on Thursday, the simulator was set in what appeared to be a virtual shopping mall environment. Within this digital space, users, represented by typical Roblox-style avatars, could pick up weapons and engage in shooting other players. The suspect's Roblox account and the associated game were first identified by users on Kiwi Farms, a website notorious for doxxing and trolling activities.
In response to the incident, Roblox issued a statement to the Guardian, confirming that the company has removed the user account connected to the horrifying event, along with any content associated with the suspect. The California-based firm emphasized its commitment to fully supporting law enforcement in their ongoing investigation. Roblox further clarified that the "Mall experience" simulator was only accessible through Roblox Studio, a separate application used by developers to create games. Consequently, the simulator recorded a mere seven visits, indicating limited exposure.
Background of the Tumbler Ridge School Shooting
Wednesday's attack stands as one of Canada's deadliest school shootings since the 1989 École Polytechnique massacre, where a gunman killed 14 women. In Tumbler Ridge, the victims included a teacher, five students, the suspect's mother, and her stepbrother. The suspect, identified by Canadian police as Jesse Van Rootselaar, reportedly had a history of mental health issues and was found deceased from a self-inflicted gunshot wound at the scene.
Roblox's Content Moderation and Broader Criticisms
Roblox stated that it employs a combination of artificial intelligence and a dedicated team of safety specialists to review content uploaded to its platform before it is made available to other users. However, this is not the first time Roblox has faced criticism over its content. The platform allows millions of users to create and share their own video games, many of them benign titles featuring cartoon fish and camping trips, but it has also drawn repeated scrutiny.
It has allegedly hosted Jeffrey Epstein-themed content accessible to children and is currently facing a lawsuit in California for allegedly facilitating the sexual exploitation and assault of minors. These cases highlight ongoing concerns about content moderation on user-generated gaming platforms.
The Debate on Violent Video Games and Real-World Violence
The links between violent video games and mass shootings have been extensively debated, with large-scale studies finding at most a small correlation between gaming and real-world aggression, and the overall evidence remaining inconclusive. However, recent incidents underscore a growing trend of "gamified violence," where extremists adopt elements of video game design in the context of real-world attacks.
This phenomenon is becoming increasingly common. For example, the perpetrator of the 2019 mosque shootings in Christchurch, New Zealand, livestreamed the massacre on Facebook, while the shooter in the racially motivated 2022 attack in Buffalo, New York, broadcast his rampage on Twitch, a platform that allows users to livestream themselves playing video games. These cases illustrate how digital platforms can intersect with real-world violence, raising critical questions about online safety and content regulation.