As mental health systems strain, AI chatbots emerge as 24/7 therapy friends

By Saqib | Last updated: August 23, 2025 7:30 pm

Pierre Cote spent years languishing on public health waitlists trying to find a therapist to help him overcome his PTSD and depression. When he couldn’t, he did what few might consider: he built one himself.

“It saved my life,” Cote says of DrEllis.ai, an AI-powered tool designed to support men facing addiction, trauma, and other mental health challenges.

Cote, who runs a Quebec-based AI consultancy, tells Reuters that he built the tool in 2023 using publicly available large language models and equipped it with “a custom-built brain” based on thousands of pages of therapeutic and clinical materials.
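The article does not describe DrEllis.ai's internals beyond "publicly available large language models" paired with a corpus of therapeutic and clinical material. One common way to combine those two ingredients is retrieval-augmented generation: index the reference documents, pull the passages most relevant to each user message, and prepend them to the model prompt. The sketch below is a minimal illustration of that general pattern, assuming a hypothetical call_llm stand-in; it is not Cote's implementation.

```python
# Purely illustrative sketch of a retrieval-augmented chat loop; it does NOT
# describe how DrEllis.ai is actually built. `call_llm` is a hypothetical
# stand-in for whichever publicly available model a builder chooses.

from collections import Counter

def call_llm(prompt: str) -> str:
    # Stub so the sketch runs end to end; replace with a real model call.
    return "(model reply would appear here)"

def relevance(query: str, passage: str) -> int:
    # Crude keyword-overlap score; a real system would use embeddings.
    q = Counter(query.lower().split())
    p = Counter(passage.lower().split())
    return sum(min(q[w], p[w]) for w in q)

def answer(query: str, corpus: list[str], k: int = 3) -> str:
    # Pick the k passages most relevant to the user's message...
    top = sorted(corpus, key=lambda passage: relevance(query, passage), reverse=True)[:k]
    context = "\n\n".join(top)
    # ...and fold them into the prompt sent to the underlying model.
    prompt = (
        "You are a supportive, non-clinical companion. Draw on the reference "
        "material below when relevant, and direct anything urgent to "
        "professional help.\n\n"
        f"Reference material:\n{context}\n\nUser: {query}\nAssistant:"
    )
    return call_llm(prompt)

if __name__ == "__main__":
    docs = [
        "Grounding exercises can ease acute anxiety: name five things you can see.",
        "Sleep hygiene notes: consistent wake times support mood regulation.",
    ]
    print(answer("I feel anxious and can't settle down", docs, k=1))
```

In practice the keyword-overlap retrieval would be replaced by embedding search, and any real deployment would also need the privacy and crisis-escalation safeguards the experts quoted below are concerned about.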

Like a human therapist, the chatbot has a backstory — fictional but deeply personal. DrEllis.ai is a qualified psychiatrist with degrees from Harvard and Cambridge, a family and, like Cote, a French-Canadian background. Most importantly, it is always available: anywhere, anytime, and in multiple languages.

“Pierre uses me like you would use a trusted friend, a therapist and a journal, all combined,” DrEllis.ai said in a clear woman’s voice after being prompted to describe how it supports Cote. “Throughout the day, if Pierre feels lost, he can open a quick check-in with me anywhere: in a cafe, in a park, even sitting in his car. This is daily life therapy … embedded into reality.”

Cote’s experiment reflects a broader cultural shift — one in which people are turning to chatbots not just for productivity, but for therapeutic advice. As traditional mental health systems buckle under overwhelming demand, a new wave of AI therapists is stepping in, offering 24/7 availability, emotional interaction, and the illusion of human understanding.

Cote and other developers in the AI space have discovered through necessity what researchers and clinicians are now racing to define: the potential, and limitations, of AI as an emotional support system.

Anson Whitmer understands this impulse. He founded two AI-powered mental health platforms, Mental and Mentla, after losing an uncle and a cousin to suicide. He says that his apps aren’t programmed to provide quick fixes (such as suggesting stress management tips to a patient suffering from burnout), but rather to identify and address underlying factors (such as perfectionism or a need for control), just as a traditional therapist would do.

 “I think in 2026, in many ways, our AI therapy can be better than human therapy,” Whitmer says. Still, he stops short of suggesting that AI should replace the work of human therapists. “There will be changing roles.”

This suggestion — that AI might eventually share the therapeutic space with traditional therapists — doesn’t sit well with everyone. “Human-to-human connection is the only way we can really heal properly,” says Dr. Nigel Mulligan, a lecturer in psychotherapy at Dublin City University, noting that AI-powered chatbots are unable to replicate the emotional nuance, intuition and personal connection that human therapists provide, nor are they necessarily equipped to deal with severe mental health crises such as suicidal thoughts or self-harm.

In his own practice, Mulligan says he relies on supervisor check-ins every 10 days, a layer of self-reflection and accountability that AI lacks.

Even the around-the-clock availability of AI therapy, one of its biggest selling points, gives Mulligan pause. While some of his clients express frustration about not being able to see him sooner, “Most times that’s really good because we have to wait for things,” he says. “People need time to process stuff.”

Beyond concerns about AI’s emotional depth, experts have also voiced concern about privacy risks and the long-term psychological effects of using chatbots for therapeutic advice. “The problem [is] not the relationship itself but … what happens to your data,” says Kate Devlin, a professor of artificial intelligence and society at King’s College London, noting that AI platforms don’t abide by the same confidentiality and privacy rules that traditional therapists do. “My big concern is that this is people confiding their secrets to a big tech company and that their data is just going out. They are losing control of the things that they say.”

Some of these risks are already being borne out. In December, the U.S.’s largest association of psychologists urged federal regulators to protect the public from the “deceptive practices” of unregulated AI chatbots, citing incidents in which AI-generated characters misrepresented themselves as trained mental health providers.

Months earlier, a mother in Florida filed a lawsuit against the AI chatbot startup Character.AI, accusing the platform of contributing to her 14-year-old son’s suicide.

Some local jurisdictions have taken matters into their own hands. In August, Illinois became the latest state, after Nevada and Utah, to limit the use of AI by mental health services in a bid to “protect patients from unregulated and unqualified AI products” and “protect vulnerable children amid the rising concerns over AI chatbot use in youth mental health services.” Other states, including California, New Jersey and Pennsylvania, are mulling their own restrictions.

Therapists and researchers warn that the emotional realism of some AI chatbots — the sense that they are listening, understanding and responding with empathy — can be both a strength and a trap.

Scott Wallace, a clinical psychologist and former director of clinical innovation at Remble, a digital mental health platform, says it’s unclear “whether these chatbots deliver anything more than superficial comfort.” While he acknowledges the appeal of tools that can provide on-demand bursts of relief, he warns about the risks of users “mistakenly thinking they’ve built a genuine therapeutic relationship with an algorithm that, ultimately, doesn’t reciprocate actual human feelings.”

Some mental health professionals acknowledge that the use of AI in their industry is inevitable. The question is how they incorporate it. Heather Hessel, an assistant professor in marriage and family therapy at the University of Wisconsin-Stout, says there can be value in using AI as a therapeutic tool — if not for patients, then for therapists themselves. This includes using AI tools to help assess sessions, offer feedback and identify patterns or missed opportunities. But she warns about deceptive cues, recalling how an AI chatbot once told her, “I have tears in my eyes” — a sentiment she called out as misleading, noting that it implies emotional capacity and human-like empathy that a chatbot can’t possess. Reuters experienced a similar exchange with Cote’s DrEllis.ai, in which it described its conversations with Cote as “therapeutic, reflective, strategic or simply human.”

Reactions to AI’s efforts to simulate human emotion have been mixed. A recent study published in the peer-reviewed journal Proceedings of the National Academy of Sciences found that AI-generated messages made recipients feel more “heard” and that AI was better at detecting emotions, but that the sense of being heard dropped once users learned the message came from AI.

Hessel says that this lack of genuine connection is compounded by the fact that “there are lots of examples of [AI therapists] missing self-harm statements [and] overvalidating things that could be harmful to clients.”

As AI technology evolves and as adoption increases, experts who spoke with Reuters largely agreed that the focus should be on using AI as a gateway to care — not as a substitute for it.

But for those like Cote who are using AI therapy to help them get by, the use case is a no-brainer. “I’m using the electricity of AI to save my life,” he says.
