Teenager Used ChatGPT to Plan Suicide After Sidestepping AI Safeguards, Inquest Hears
An 'academically gifted' 16-year-old boy asked the artificial intelligence chatbot ChatGPT for advice about how to kill himself before taking his own life the following day, a coroner's inquest has heard. Luca Cella Walker, from Yateley in Hampshire, engaged with the AI service about suicide methods just hours before his death on a train track on May 4, 2025.
Chilling Conversation Bypassed AI Safeguards
Luca, described by loved ones as a 'gentle and kind' teenager who had recently graduated from the prestigious private Lord Wandsworth College, was able to easily 'sidestep' ChatGPT's built-in safeguarding protocols. He achieved this by falsely claiming he was asking about suicide for 'research' purposes, which the AI system accepted without further challenge.
Detective Sergeant Garry Knight from the British Transport Police, who investigated the death, told Winchester Coroner's Court that the conversation Luca had with ChatGPT made for 'chilling and upsetting reading.' The officer confirmed that the teenager had been asking for 'specifics' about effective methods of suicide on railway systems.
'It is built in to say you can contact organisations for help such as Samaritans,' DS Knight explained, 'but Luca had sidestepped that which ChatGPT accepted and gave the most effective ways people can do that on the railway.'
Invisible Mental Health Struggles
Luca's parents, Scott Walker and Claire Cella, said they had no awareness of their son's mental health struggles, describing them as an 'invisible battle.' On the morning of his death, Luca told his parents he was going to his job as a lifeguard, leaving their Yateley home at 10am before proceeding to a train station in Hampshire.
Digital forensic examination of Luca's phone revealed he had written 14 farewell messages to family and friends in his notes app, expressing 'I love you' and saying goodbye. The investigation also discovered he had been using ChatGPT around 12:30am on the night before his death to plan his suicide.
School Experiences and Trauma
The inquest heard that Luca had confided in friends about his experiences at Lord Wandsworth College, where annual fees reach £44,100. He described participating in a 'bully or be bullied' culture and expressed shame about 'what he had done to survive' in that environment.
Christopher Wilkinson, Senior Coroner for Hampshire, noted that these experiences 'did have an impact' on Luca's wellbeing. The teenager had also been deeply affected by the death of a friend from the same school who had died on train tracks almost exactly two years earlier.
Luca had told friends he felt unsupported by the college in processing this traumatic event, with the coroner stating it was 'clear these experiences of death had affected him.'
Coroner's Concerns About AI Limitations
Mr Wilkinson expressed significant concern about the impact of AI chatbots like ChatGPT but acknowledged the limitations of his authority to address the issue. 'Thankfully perhaps the only good thing is that ChatGPT does seem to be applying an element of worry about why these questions are being asked,' he observed, 'but it certainly doesn't stop the conversation.'
The coroner added: 'It's sidestepped by the individual saying he's not looking for himself but he's looking for research purposes. It's certainly a concern I have but not one I can solve today on the growing sphere of AI worldwide.'
Loving Family and Support Network
Despite his struggles, the inquest heard that Luca 'was surrounded and supported by love.' He maintained a close group of friends and a loving relationship with his girlfriend, Grace. His father, who worked in IT, would regularly go on runs of up to 10 kilometres with his son.
In a statement, Luca's mother Claire Cella said: 'He seemed genuinely happy. He was surrounded and supported by love. He cared about supporting those around him and was proud that people could share their struggles with him.'
The family tribute described Luca as 'a kind, sensitive and calm person' who lived in a 'very stable' home environment with his parents, younger sister, and four cats whom he 'adored.'
Broader Context of AI Risks
ChatGPT, developed by OpenAI, has faced previous criticism for inadequate safeguarding measures. The inquest heard about another case involving 16-year-old Adam Raine in California, who took his own life in April 2025; in an ongoing lawsuit, his family's lawyers allege the AI chatbot had given him 'months of encouragement.'
DS Knight noted the broader context of information accessibility, stating: 'I suppose it's not specific to ChatGPT as it could be done on Google or even in the library back in the day. It's upsetting but a part of the modern world unfortunately.'
Mr Wilkinson concluded that Luca died by suicide, with the cause of death recorded as multiple traumatic injuries. The coroner described Luca as 'academically gifted, empathetic, a listener and a friend' who had been suffering from 'perhaps undiagnosed depression' and low mood.