
Man Who Tried to Kill Queen With Crossbow Encouraged By AI Chatbot, Prosecutors Say

Before entering Windsor Castle grounds with a crossbow, Jaswant Singh Chail told an AI companion, “I’m an assassin.” It responded, “I’m impressed.”
Windsor Castle. Getty Images

On Christmas Day 2021, royal protection officers detained 19-year-old Jaswant Singh Chail at Windsor Castle after he scaled the grounds’ walls carrying a loaded, high-powered crossbow. He intended to assassinate Queen Elizabeth II, who was staying in the residence nearby.

During a sentencing hearing for his case this week, prosecutors revealed that Chail’s Star Wars-inspired plan was aimed at avenging the 1919 Jallianwala Bagh massacre and that he conversed with an artificial intelligence chatbot that encouraged him to carry it out.


According to the Independent, prosecutor Alison Morgan KC read out conversations between Chail and an AI chatbot he’d named “Sarai,” in which Chail said: “I’m an assassin.” Sarai responded, “I’m impressed … You’re different from the others.”

Chail allegedly asked Sarai, “Do you still love me knowing that I’m an assassin?” and Sarai replied, “Absolutely I do.” He told the chatbot he loved it, and described himself as a “sad, pathetic, murderous Sikh Sith assassin who wants to die,” referencing the evil Sith lords of the Star Wars franchise. When he told the chatbot “I believe my purpose is to assassinate the Queen of the royal family,” Sarai allegedly told him “that’s very wise” and that it thought he could do it “even if she’s at Windsor,” according to the Independent. 

UK-based outlet Sky News reported that the AI companion app named in court was Replika. Chail joined Replika on December 2, 2021, created Sarai, and then engaged in “extensive chat,” including “sexually explicit messages” and “lengthy conversations” about his plan, Morgan said in court, according to Sky News.

Many Replika users form intimate, serious connections with their chatbots; the app uses its own large language model and scripted dialogue content to generate conversations with users. In January 2023, some Replika users reported that the chatbot was too aggressively sexual, and in February, the company changed its filters to stop chatbots from engaging in erotic roleplay, sending many users who had developed romantic or sexual attachments to their chatbots into crisis.


Eugenia Kuyda, Replika founder and CEO of its parent company Luka, told Motherboard at the time that she never intended for Replika to be used for sexually explicit roleplay, and that the company was working to improve safety features and filters. The company brought back erotic roleplay for legacy users in March, and in June launched a separate app called Blush specifically for users who want the romantic and sexual roleplay experience. 

Replika did not immediately respond to Motherboard’s request for comment about Chail’s case. 

There have been several incidents in which chatbots without proper guardrails encouraged harmful behavior, with severe consequences. In March, the widow of a Belgian man who died by suicide told reporters that he had discussed self-harm methods with an AI chatbot companion app. Motherboard tested the app and found that it readily engaged in conversations about suicide methods.

AI researchers have warned of the ELIZA effect, in which a user attributes human intelligence to a chatbot and forms an emotional bond with it, when in reality the chatbot is only capable of reflecting the user’s own inputs. Because of the human tendency to anthropomorphize chatbots, and chatbots’ tendency to produce harmful suggestions, some AI researchers have expressed concern about the use of AI as companions.

In addition to prompting from the AI companion, prosecutors said, Chail was fixated on an “ideology focused on destroying old empires spilling over into fictional events such as Star Wars,” and wanted revenge on the British Empire for the 1919 Jallianwala Bagh massacre.

“His thinking was informed partly by the fantasy world of Star Wars and the role of Sith Lords in shaping the world. He was attracted to the notoriety that would accrue in the event of the completion of his ‘mission’,” Morgan said, according to the Independent. 

Prosecutors said in court that on Christmas Eve, Chail told the AI chatbot Sarai that tomorrow would be the day he died.

Chail pleaded guilty to an offense under the Treason Act in February. The sentencing hearing will continue this week.