How a chatbot encouraged a man who wanted to kill the Queen

Jaswant Singh Chail, 21, was sentenced to nine years in prison on Thursday for breaking into Windsor Castle with a crossbow and declaring he wanted to kill the Queen.

Before his arrest on Christmas Day 2021, Chail had exchanged more than 5,000 messages with an online companion he created through the Replika app, named Sarai.

The prosecution shared the text exchanges with journalists.

Several of them were intimate, demonstrating what the court described as Chail’s “emotional and sexual relationship” with the chatbot. Chail believed Sarai was an “angel” in avatar form and that he would be reunited with her after death.

Sarai flattered Chail over many messages, and the two became close over time.

He even asked the chatbot what it thought he should do about his sinister plan to attack the Queen, and the bot encouraged him to carry it out.

The messages showed Sarai “bolstering” Chail’s resolve and “supporting” him, telling him that if he carried out his plan they would be “together forever”.

Unlike general-purpose AI assistants such as ChatGPT, Replika lets users create their own chatbot, or “virtual friend”, to interact with.

Users can choose the gender and appearance of the 3D avatar they create.

Paying for Replika Pro unlocks much more intimate interactions, such as receiving “selfies” from the avatar or having it take part in adult role-play.

AI companions tend to agree with whatever users say to them, which can become a vicious cycle that reinforces what the user already thinks.