The growing reliance on ChatGPT for mental health support is transforming the way people navigate emotional struggles.
A growing trend on social media, particularly TikTok, features individuals using ChatGPT as a therapeutic tool. Users describe sharing their thoughts with the AI and receiving feedback or guidance, often expressing surprise at how perceptive and emotionally intelligent it appears to be.
Although AI-driven platforms like ChatGPT offer immediate, round-the-clock assistance, making them a convenient and cost-effective option for mental health care, their use raises ethical challenges, including concerns over data privacy, the accuracy of responses, and the risk of users becoming overly dependent on AI.
Newsweek has reached out to experts in psychology and technology to discuss whether ChatGPT has the capability to replace therapists in the near future.
From Dr. Daniel Lowd: AI Could Fill Gaps Where Therapists Are Unavailable
People have found comfort in talking to AI since the ELIZA chatbot, 60 years ago. People also find mental health support through journaling, exercise, prayer, talking with trusted friends, and reading self-help books. While none of that replaces a good therapist, good therapists are hard to come by. The waits are long, the costs can be high, and a therapist that’s right for one person could be totally wrong for another. So if people can find some support and perspective by talking to ChatGPT or Claude, then I think that’s wonderful.
Dr. Daniel Lowd, Associate Professor in the Department of Computer and Information Science at the University of Oregon.
From Dr. Pim Cuijpers: AI Will Transform Mental Health Care but Won’t Replace Therapists
I do not think that AI will replace therapists. There will always be a need for human support and therapy, and that need is so big that the number of therapists will not go down because of AI.
We had this discussion in the Netherlands 20 years ago, when the first results of our digital interventions (not based on AI) became available and showed that they were as effective as face-to-face treatments. And although digital tools have found their way into routine practice, there has been no impact whatsoever on the number of therapists.
But AI will change mental health care. For some people an AI therapist will be enough (some may even prefer it to a human therapist), but for many it will not be. AI will have an impact in all kinds of different ways.
Research is being done on which patients will benefit from which type of treatment, using machine learning (AI). There is research on ecological momentary assessment through smartphones, also aimed at improving outcomes with machine learning, and there are studies using digital tools as an add-on to regular therapy, such as avatar therapy for psychotic disorders.
So, no, AI will not replace therapists, but it will change and improve mental health care considerably.
Dr. Pim Cuijpers, Professor Emeritus, Department of Clinical, Neuro and Developmental Psychology, Vrije Universiteit Amsterdam.
From Dr. Richard Lachman: Relying on AI For Mental Health Could Impact Young and Vulnerable People
AI chatbots will respond as therapists if asked, without any of the oversight, training, or responsibility of a human counselor, or even of a specialized and tested piece of software.
This readiness to engage on topics without expertise is a major weakness in both LLMs and the nature of the conversational interface: they give the illusion of understanding and competence through facile fluency and a conversational eagerness to please, fine-tuned by reinforcement learning and human responses.
We have already seen unsafe and unhealthy interactions with people in crisis through relationship-focused chatbots, and as they become more a part of our interaction landscape, this will only increase.
Ease of access and a radically cheaper cost may also lead organizations to fund AI chatbots rather than pursuing the long-term systemic changes needed to make qualified, responsible, and ethically governed human therapists available. This will likely affect young people and vulnerable segments of society who may not have the financial resources or the wherewithal to advocate for themselves.
Dr. Richard Lachman, Associate Professor, Media Production & Director, Zone Learning & Director, Creative Technology Research and Development, Toronto Metropolitan University.
From Dr. Ben Levinstein: AI Could Shape the Future of Mental Health Care, If Used With Caution
The question of whether AI will replace therapists isn’t a simple yes or no – rather, it’s about understanding how AI will transform the mental healthcare landscape.
While some patients will continue to prefer human therapists, valuing the aspects of human connection, others may gravitate toward AI systems, finding them more accessible and feeling more comfortable sharing their deepest struggles without fear of human judgment.
The dramatically lower cost of AI therapy, combined with its constant availability and elimination of waitlists, will likely accelerate this shift – particularly as insurance companies recognize the potential for both improved outcomes and reduced expenses.
The mental healthcare field will likely develop in multiple directions simultaneously. Some practices might adopt hybrid approaches, where AI handles initial assessments and provides between-session support while human therapists focus on core therapeutic work.
Other services might be fully AI-driven, particularly as these systems become increasingly sophisticated in their ability to understand human psychology and develop treatment plans. In psychiatry, while early AI adoption might focus on routine cases, advancing AI systems will likely surpass human capabilities in handling even the most complex diagnostic and medication management challenges, potentially leading to AI becoming the primary psychiatric care provider.
However, the rise of AI therapy also presents serious concerns. One critical issue is “therapist shopping” – where patients seek out therapists who simply validate their existing views rather than providing challenging but necessary therapeutic work.
This problem, already present with human therapists, could be amplified with AI systems as patients could easily switch between different AI models until they find one that tells them what they want to hear.
Additionally, there are complex ethical and legal questions about how AI systems would handle situations involving suicidal ideation or mandatory reporting requirements. While these challenges won’t prevent AI from revolutionizing mental healthcare, they will need to be carefully addressed as the field evolves.
Dr. Ben Levinstein, Associate Professor at University of Illinois, Urbana-Champaign, Department of Philosophy.
From Dr. Randy Goebel: Humans Will Remain Irreplaceable as Therapists Despite AI’s Rise
The reality is that humans will remain the best therapists for the foreseeable future, although chatbot-like triage will still provide some graded access to real counselors. It is likely that the proliferation of misinformation and disinformation, accelerated by AI systems, will make the need for mental health counseling even more intense than before.
As usual, the public and private systems will respond in different ways, but not before there are many negative consequences.
Dr. Randy Goebel, Professor of Computing Science and Adjunct Professor of Medicine, University of Alberta.
From Dr. John Torous: AI Will Support, Not Replace, Therapists
AI will not replace therapists. In recent decades, we have had many excellent self-help therapy books, CD-ROMs, websites, and apps. They have not replaced therapists. AI can do many wonderful things, but people want to connect with other people for therapy. However, AI will have a role in augmenting therapy by making it more effective and impactful.
One danger of AI is we will see an initial wave of people with no training or licensure perhaps claiming to be therapists and offering therapy. Using AI does not mean any person can act as a therapist, just like using a flight simulator does not mean any person can act as a pilot. It is always good to ask any therapist about their credentials, experience, and training – and more so than ever today.
Dr. John Torous, Director of Digital Psychiatry, Beth Israel Deaconess Medical Center, and Associate Professor of Psychiatry, Harvard Medical School.