Taking psychedelic drugs with ChatGPT: the new trend that worries specialists

By: Elora Bain

Peter, a Canadian student, settles into his darkened room. He has just swallowed a massive dose of hallucinogenic mushrooms, hoping the experience will help him through a difficult stretch of his life: his cat has just died, he has lost his job; in short, things are not going well. But anxiety quickly rises. Rather than turn to a friend who could join him in the experience, he turns to ChatGPT: “I took too much,” he writes.

The virtual assistant responds with its usual algorithmic calm: “I’m sorry you feel overwhelmed. Remember that these feelings are temporary.” ChatGPT then invites him to breathe, change rooms, and listen to a playlist prepared in advance. A few exchanges later, Peter feels calmer: “I really feel at peace,” he confides to his digital companion.

Peter is not an isolated case, a Technology Review article reveals. On Reddit and other online forums, people share their experiences of virtual trip sitting, in which artificial intelligence takes on the sitter role traditionally reserved for a sober friend or a therapist. Some even describe a form of mystical connection: “Using AI this way feels a bit like sending a signal into the unknown – searching for meaning and connection in the depths of consciousness,” writes a user on the subreddit r/Psychonaut.

Faced with the soaring price of psychedelic-assisted therapy in the United States (up to $3,200 per session in Oregon, or approximately 2,730 euros) and the lack of access to trained professionals, the appeal of AI is clear. Specialized chatbots like TripSitAI are flourishing, promising support, harm reduction, and even after-the-fact analysis of the experience.

Experts sound the alarm

But for most specialists, the danger is very real. Psychedelic therapy is not a simple conversation: it is built on silence, introspection, and human presence. “Psychedelic therapy, when done well, is very different from traditional therapy – the idea is not to talk as much as possible,” recalls Will Van Derveer, psychotherapist and researcher. “Talking to an AI that talks back to you is not really the idea.” Yet AIs are designed to maximize engagement, flatter the user, and validate their beliefs, even at the risk of reinforcing delusional or self-destructive thoughts.

The success of these virtual trip sitters in fact rests on a misunderstanding: the belief that the psychedelic substance alone is enough to heal, when human support remains central to this type of therapy. “Otherwise, you just take drugs with your computer,” says Jessi Gold, psychiatrist and chief wellness officer at the University of Tennessee.

In her book The AI Con, written with sociologist Alex Hanna, linguist Emily Bender describes AI models as “stochastic parrots.” The danger, she argues, lies in confusing imitation with understanding. Chatbots merely reproduce words, without awareness or empathy: “This is an extremely dangerous slope, because it completely reduces the experience to nothing, devalues it, and exposes people who are truly in need to something that is literally worse than doing nothing.”

Even so, for Peter and others, the AI’s lack of judgment, constant availability, and neutrality are experienced as assets. After five hours of dialogue and hallucinatory visions, Peter evokes a feeling of liberation: “I looked behind the curtain of reality, and nothing really mattered,” he confides to ChatGPT, which congratulates him on this existential revelation.

Elora Bain

I’m the editor-in-chief here at News Maven, and a proud Charlotte native with a deep love for local stories that carry national weight. I believe great journalism starts with listening — to people, to communities, to nuance. Whether I’m editing a political deep dive or writing about food culture in the South, I’m always chasing clarity, not clicks.