When ChatGPT helps manage your anxiety, mental health specialists worry

By: Elora Bain

It is 2 in the morning, panic overwhelms you, and it is impossible to reach your therapist. In situations like this, many people are turning to chatbots, conversational programs based on artificial intelligence. These digital assistants can sometimes suggest relaxation techniques or ease the effects of a crisis. For a growing number of people, they represent an emergency solution accessible at any time.

According to Scientific American, this phenomenon is growing because of limited access to the mental health system. “We lack professionals, and some do not take insurance, which reduces access to care,” explains Jeffrey Scott Hall, professor of psychology at the University of Kansas (United States). In his view, this technology can help fill these gaps, provided it is regulated safely and responsibly from a legal standpoint.

Toward strict, well-defined regulation

Another advantage of chatbots is that they allow users to work on their social interactions. This feature can be particularly beneficial for adolescents. “You can practice approaching new friends or managing complex social situations,” the professor explains. This virtual preparation is thought to build self-confidence and help transfer certain skills into real life.

But the less a chatbot is preprogrammed, leaving room for spontaneous answers, the harder its responses become to control, and the risk of error rises sharply. “There is a tension between the engagement these tools offer and the safety of the advice provided,” warns Jeffrey Scott Hall. Even so, chatbots generate a level of interaction unmatched by conventional well-being apps, which users often tire of.

To make these tools safer, experts are calling for federal legislation. Such a law could protect personal data, limit advertising, prohibit addictive design techniques, and require regular reporting on detected suicidal crises. “It is essential to prevent any misleading representation, such as presenting a chatbot as if it were a psychologist,” the professor stresses.

Safer digital alternatives already exist. Scientific American cites the example of Therabot, a chatbot designed by American university researchers to support its users' mental health. Its results show that a safe and effective mental health chatbot is possible, even if it remains marginal compared with the commercial offerings. Ultimately, these digital companions could complement existing traditional care, provided they are supervised so they remain genuinely reliable mental health tools.

Elora Bain

I'm the editor-in-chief here at News Maven, and a proud Charlotte native with a deep love for local stories that carry national weight. I believe great journalism starts with listening — to people, to communities, to nuance. Whether I'm editing a political deep dive or writing about food culture in the South, I'm always chasing clarity, not clicks.