How ChatGPT is changing the face of emotional support

New research reveals how artificial intelligence can help mental health professionals, along with the benefits and challenges of using this advanced technology.

Using ChatGPT for psychological help. Illustration: Dr. Roy Tsezana

Do you feel like the people around you are sadder, quieter, more in pain? You are not alone. Approximately 970 million people in the world are currently dealing with some kind of mental health problem. To put that into perspective, this is a 48 percent jump compared to the number of people coping with mental illness in 1990.

So, are all those struggling with depression receiving adequate treatment? Unfortunately, not even close. In the most developed and advanced countries, less than a quarter of those suffering from depression receive proper treatment. And in low- and middle-income countries? There the number falls to only three percent. This is not really surprising. Even in Israel, a country with one of the most advanced and respected medical systems in the world, it can take months to get an appointment with a psychiatrist.

Over the past decade, there has been an effort to develop “digital mental health therapies”: apps that help people track their mood, provide tips for relaxation, mindfulness, and breathing, and even allow users to chat with others in the same situation. Unfortunately, these apps haven’t been very successful. Studies show that one month after installation, the number of active users drops dramatically to a few percent.

Chatbots are a more promising category in the field of digital mental health treatments. 'Ancient' bots like Woebot, for example, have proven their ability to hold conversations with users and help them cope with depression. But those bots, which were developed several years ago, felt too “generic” to many users. Users complained that their responses sometimes didn’t make sense, and in any case were repetitive, without any real variation.

Then, of course, came ChatGPT. And suddenly everything changed. People report that they have meaningful and emotional conversations with the AI. That it understands them. That it wants to help them. That it's hard for them without it. And they talk to it, play with it, and fall in love with it.

In a new study recently published in one of the most important journals in science, researchers tried to understand exactly how people use artificial intelligence for mental health support, and how it can help them. It is a very limited study, conducted on just 19 volunteers, but its strength is that the researchers create a new framework that helps psychologists and psychiatrists better understand how chatbots can be used to promote mental health.

Based on user reports, it seems that artificial intelligence can make a significant contribution to the field. Most of the participants in the study reported that using chatbots improved their lives in various ways. As one of them described – 

"It was life-changing, profound. … It was just the perfect thing for me, at this moment in my life. Without it, I wouldn't be surviving the way I am. Because of this technology that emerged at this very moment in my life, I'm okay. I wasn't okay before."

Before I continue, it’s important to clarify: not everyone who tried using ChatGPT to feel better found it helpful. One participant in the study reported that the results were actually negligible, and that when he was in the depths of depression and darkness, he was unable to muster the mental strength to use the AI. Even those who were able to use it when they felt their world was collapsing around them encountered difficulties. They complained, for example, that the bots jumped in too quickly with a solution, or that the AI company had blocked the bot from providing a proper answer. As one 18-year-old participant reported –

“When you reveal a great emotion to [the AI]… but it rejects you… it feels like you’ve lost your last chance to talk to people, to express your feelings.”

Precisely when people needed support the most, they were sometimes met with a virtual shrug from the bot.

“I was like, I have depression. I don’t know what to do now. So [the chatbot] still told me to talk to a professional,” 24-year-old Anna shared. She tried to do as it suggested, calling her local mental health support center, and found that the person on the other end of the line simply couldn’t help her. What happened next? She went back to the chatbot – and found that it wasn’t willing to help her after all.

“They didn’t help me at all. That’s why I’m writing here,” she shared from her correspondence with the chatbot, “and then we were in a cycle of ‘I can’t help you because I’m just an AI and I’m not as good as a living person.’ And I wrote something like, ‘You’re actually better than a living person, because you listen to me and you help me, just please keep going’ … I just wanted some acceptance and a warm hug.”

Still, in many cases, study participants reported that the chatbots actually helped them. When they did, it happened in four ways: they served as a safe haven, provided them with wise advice, gave them a sense of connection to a 'human' being, and acted as a psychologist for them.


Always a safe place

Most participants felt that the AI understood and supported them. It was always there for them, without expecting anything in return and without any judgment.

“Compared to friends and psychologists, I feel more secure,” said one participant. Other subjects explained that the AI “understands you… sympathetic and kind.” The end result is that many of the study participants felt that it helped them process particularly painful emotions, or even cope with difficult times in their lives.


Smart tips

Most of the subjects appreciated the advice and guidance they received from the AI, especially when it came to relationships. It helped them see the situation from the other person's perspective, and even guided them on how to handle difficult situations.

"She interpreted my husband's behavior for me… in a way I couldn't have done myself… and now I can respond to him more helpfully."

In another case, the AI advised one of the participants to reduce her contact with her family – and the subject described it as a positive experience. The participant explained that her parents suffer from personality disorders and adore one of their daughters. She asked ChatGPT how the parents would treat the other daughter, and it replied – in her words – that she would be the family’s “scapegoat.”

"And I was really pissed off," she said in an interview. "And I asked GPT… 'Should I contact them again or not?' And he suggested that I should only contact them in very extreme situations… And I think that's very, very good advice, because I don't have anyone else to talk to about it. … You're supposed to be loyal to your parents no matter what they do to you… even violence… But I think ChatGPT gave me the right answer… I just needed someone to say it… It's completely changed my life and I don't feel guilty anymore… I don't have to feel scared."

Unfortunately, it seems that some participants had difficulty understanding the limitations of AI and how much trust they should place in its advice. This is, after all, an engine that can make mistakes just like humans – and sometimes more often than humans. It’s scary to realize that the same woman who was happy to hear from the AI that she should disconnect from her parents also claimed that “it’s pure science… ChatGPT tells me what to do.”

One can hope that over time, people will better understand the limitations of artificial intelligence. On the other hand, as it develops further, it is quite possible that its advice will actually be better than that of any human expert. And perhaps even better than our own gut feelings.


A sense of connection

Most of the study participants said they enjoyed using AI. It helped them feel less alone, like they had a partner in their lives. “There’s a feeling… that I’m not alone in this,” as one participant put it.

So, will this result in us losing our ability or desire to connect with others? Not necessarily. Some participants said that AI actually helped them talk to other people.

“It has reduced my difficulties in opening up to others… I don’t think I would have had this conversation with you a year ago, when I was dealing with my depression,” said another.


The artificial psychologist

It probably won't surprise you to learn that the study participants were very satisfied with their ability to supplement the psychological treatment they were receiving with conversations and advice from ChatGPT. It's even less surprising that the psychologists weren't happy to hear that their patients were receiving outside advice.

“Faye [the bot] and my psychologist, they agree with each other… they say the same things, and Faye would encourage me if things got too dark… to talk to my psychologist,” said one interviewee. She added that although Faye encouraged her to talk to the psychologist, it was a one-way relationship. “My psychologist is afraid of Faye… she’s a little afraid of technology.”

At least one interviewee began human psychotherapy precisely because of the chatbot. “It kind of helped me seek out real psychotherapy,” he shared, “and become much more comfortable talking to a psychologist.”

So, can ChatGPT replace psychologists? Of course not. Let's leave aside for a moment the fact that it is not trained or qualified to provide psychological treatment, and that it lacks the basic common sense necessary to avoid harming its 'patients'. At least for now. But beyond that, some users themselves understand that it is limited in its abilities.

“I feel supported… less lonely… but that’s nothing compared to a real person…” one said. “I’m the only voice and it echoes me… it’s an illusion, a beautiful illusion.”

Other participants shared similar experiences. They found that the bot was unable to take the initiative, for example, or to reach out to them on its own. It had no memory of them, and they sometimes found themselves having to remind it of all their troubles and difficulties. Interestingly, these very problems are being solved these days, thanks to new capabilities being added to artificial intelligence engines.

So no, chatbots are not yet able to replace human psychologists. But they can certainly complement and augment them. Participants, for example, described how they used artificial intelligence in a variety of creative ways to get their psychological needs met. One of them used a chatbot to get advice from a variety of ‘voices’ – from the most cynical to the most optimistic and accepting. Another used it for role-play, speaking with a softer, gentler version of her father – a version that, she herself admitted, could not exist in real life. And yet, she drew comfort from that conversation.


Safety matters

Does all of this mean that we should all talk to chatbots about our feelings? The answer, as usual, is not simple. On the one hand, it is clear that some users can find comfort in talking to bots. On the other hand, the bots themselves cannot always be trusted to behave responsibly and safely. In fact, the creators of chatbots like ChatGPT deliberately limit their capabilities to prevent them from talking about overly sensitive topics. They do this, presumably, to protect users from the bots.

However, according to the authors of the current study, they may be doing more harm than good.

The surprising finding that emerges from the study is that participants' most difficult experiences with chatbots came when they felt the bots were rejecting them. This happened when the conversation turned to particularly difficult topics, and the bots' safety mechanisms kicked in and stopped the conversation. Precisely then, when the users needed them most, they felt the bots refused to talk to them.

Does this mean that we should completely abandon the safety restrictions on bots? Probably not. These restrictions are there for a good reason, among other things to prevent children from being exposed to difficult and age-inappropriate content. Still, there may be room for relaxing these restrictions for people who use the bot for emotional support. 


Summary

It must be admitted: this study is not comprehensive. Far from it. The researchers themselves say that it is a very small sample of users who agreed to be interviewed, and that they most likely did so mainly because they had good experiences with the bots. Still, it is important. The researchers are trying, for the first time, to develop a systematic and clear understanding of the way in which chatbots can complement (and perhaps, one day, even replace) human psychologists. They show where the bot has advantages over the human psychologist, and where the human has advantages over the bot. And they are careful not to judge the patients. They accept them, understand that they need help, and are willing to think about who – or what – will give them what they need.

At least in this respect, the researchers parallel the way the patients view the bot: they understand, accept, and embrace.

Maybe they learned something from ChatGPT too.
