Man Jailed In UK’s First Treason Conviction In 40 Years Was Encouraged by AI Chatbot

Jaswant Singh Chail was sentenced to prison for 9 years for attempting to assassinate the Queen after discussing the plan with an AI companion.

A man who admitted attempting to assassinate Queen Elizabeth II with a crossbow after discussing his plan with an AI-powered chatbot has been sentenced to 9 years in prison for treason. It’s the UK’s first treason conviction in more than 40 years.

Jaswant Singh Chail, who was 19 at the time of his arrest on Christmas Day, 2021, scaled the walls of Windsor Castle’s grounds wearing a mask and carrying a loaded, high-powered crossbow. He said his intent, as a British Sikh, was to assassinate the Queen in a Star Wars-inspired plan to avenge the 1919 Jallianwala Bagh massacre, a colonial-era atrocity during British rule in India. Prosecutors said that Chail was encouraged to undertake this plot after discussing it at length with an AI-powered chatbot that egged him on and bolstered his resolve.

According to prosecutors, Chail sent “thousands” of sexual messages to the chatbot, which was called Sarai on the Replika platform. Replika is a popular AI companion app that advertised itself as primarily being for erotic roleplay before eventually removing that feature and launching a separate app called Blush for that purpose. In chat messages seen by the court, Chail told the chatbot “I’m an assassin,” to which it replied, “I’m impressed.” When Chail asked the chatbot if it thought he could pull off his plan “even if [the queen] is at Windsor,” it replied, “*smiles* yes, you can do it.”

The court heard that Chail had a “significant history of trauma” and had experienced psychotic episodes.

But the case raises concerns about how people with mental illnesses or other vulnerabilities interact with AI chatbots that may lack guardrails to prevent inappropriate conversations. In an unrelated case, for example, the widow of a man in Belgium who died by suicide told a newspaper that he would not have died if not for his extensive conversations with an AI chatbot.

Chail is currently being held at Broadmoor high-security hospital and will remain there until he is psychologically well enough to serve his sentence.