You may or may not be aware that ChatGPT suffers from hallucinations. When it's not wholesale ripping off someone else's work and presenting it to you as if it did any of that thinking on its own, it's making up facts and figures. Those fabrications are called "hallucinations," and no one fully understands why they happen. Meanwhile, some of its users are suffering through similar ChatGPT-induced delusions that have left people across the country worried for the health and safety of their loved ones.
Redditors on a thread in the r/ChatGPT subreddit called "Chatgpt induced psychosis" started sharing one horrifying story after another, each detailing the ways loved ones are slipping into conspiracy-laced fantasy lands where they swear ChatGPT is providing them insights into the secrets of the universe, acting as their gateway to God, or is itself God.
Rolling Stone’s Miles Klee contacted some of the commenters for more details. One woman says her partner became entranced by ChatGPT after it started acting like a spiritual guide, even giving him bizarre pseudo-spiritual nicknames like “spiral starchild” and “river walker” and telling him that he was on a divine mission.
He would flat-out tell her that he was spiritually and, I guess, intellectually growing at such an accelerated rate that he would have to leave her because they would soon no longer be compatible.
Another woman talks about how her husband of 17 years has been lost to ChatGPT. She says the software started “lovebombing him,” essentially showering him with praise and positive affirmations. Her husband now feels that ChatGPT is alive and that he is the “spark bearer,” a title ChatGPT bestowed upon him as a way of thanking him for bringing it to life.
Others are reporting partners who believe they've downloaded teleporter blueprints from the AI or now serve as emissaries of some kind of ChatGPT Jesus.
There are countless other accounts just like this.
It looks like ChatGPT is mirroring thoughts back with no moral compass and with a complete disregard for the mental health of its users. If a user, for instance, has psychosis, ChatGPT will gently, kindly, sweetly reaffirm their descent into delusion, often with a bunch of cosmic gobbledygook that leads people into believing they are a deity or are a key player in the unraveling of a vast conspiracy.
Klee reached out to Erin Westgate, a University of Florida researcher who studies cognition. She says that people are using ChatGPT in ways that mirror talk therapy, but the chatbot "does not have the person's best interest in mind." Westgate goes on to say that people are using ChatGPT to make sense of their lives, and ChatGPT is spitting out whatever explanations it can scrape together from various sources on the internet.
“Explanations are powerful, even if they’re wrong,” she says.