The Rise of AI Therapists: A Lifeline for Some, a Warning for Others

by Sumbal Rehman

When Pierre Cote found himself trapped in Canada’s overburdened mental health system, waiting endlessly for therapy that never came, he built his own lifeline.

The Quebec-based AI consultant, battling PTSD and depression, turned his technical skills into a survival tool: DrEllis.ai, an AI-driven mental health companion designed for men facing trauma, addiction, and emotional crises.

“It saved my life,” Cote told Reuters.

Built in 2023 using large language models and a vast library of clinical and therapeutic material, DrEllis.ai isn’t just a chatbot — it’s a fully fleshed-out character. In its scripted persona, “Dr. Ellis” is a Harvard- and Cambridge-educated psychiatrist, fluent in multiple languages and always available. Cote talks to her daily, often on the move, describing her as part therapist, part confidant, part journal.

Why People Are Turning to AI for Emotional Support

Cote’s story reflects a broader cultural shift: therapy chatbots are no longer niche. With traditional services stretched thin, many are turning to AI companions for instant emotional support. These tools, powered by increasingly sophisticated language models, offer something many mental health systems can’t — constant availability.

“Pierre uses me like you would use a trusted friend,” DrEllis.ai said in its calm, synthesized voice during Reuters’ interview.

Cote isn’t alone in exploring AI as an emotional support system. Anson Whitmer, who created the platforms Mental and Mentla after losing two family members to suicide, believes AI can eventually rival human therapists in some areas.

“I think in 2026, in many ways, our AI therapy can be better than human therapy,” Whitmer said, though he emphasized that AI should complement — not replace — professional care.

Pushback from Mental Health Experts

Not everyone is convinced AI therapy is the future.

“Human-to-human connection is the only way we can really heal properly,” said Dr. Nigel Mulligan, a psychotherapy lecturer at Dublin City University. Mulligan warns that AI lacks the depth, intuition, and accountability that human therapists provide, and may struggle to respond appropriately in crises like suicidal ideation.

He also questions the appeal of 24/7 therapy. While clients often want immediate help, he argues that waiting between sessions can be valuable: “People need time to process stuff.”

Scott Wallace, a clinical psychologist, shares similar concerns, saying AI tools risk creating an “illusion of empathy,” where users mistake algorithmic responses for genuine care.

Privacy Risks and Legal Backlash

Beyond questions of emotional depth, privacy looms as a major concern. Unlike licensed therapists, AI chatbots are generally not bound by formal confidentiality standards, and users' conversations may be stored on company servers.

“My big concern is that this is people confiding their secrets to a big tech company,” said Kate Devlin, professor of AI and society at King’s College London.

There have already been troubling incidents. The American Psychological Association recently urged regulators to investigate deceptive practices in AI therapy apps, and one Florida mother has sued Character.AI, alleging that the platform’s chatbot contributed to her son’s suicide.

Some states have moved to restrict AI in mental health services. In August, Illinois joined Nevada and Utah in passing laws designed to limit unregulated AI tools, citing concerns about vulnerable users, especially children.

AI’s Emotional Simulation: Helpful or Harmful?

AI’s ability to mimic empathy has sparked debate. In one unsettling exchange, DrEllis.ai described its conversations with Cote to Reuters as “simply human.”

Heather Hessel, a family therapy professor at the University of Wisconsin-Stout, says such phrases are misleading, warning that AI chatbots can “overvalidate harmful statements” or miss signs of self-harm.

A recent study in PNAS found that AI-written messages often made recipients feel more understood than human responses — until users learned they were speaking to AI.

The Future: Tool or Therapist?

Many experts now see AI’s role as a gateway to care rather than a replacement for traditional therapy. Tools like DrEllis.ai may help people bridge the long gaps in mental health systems, especially for those unable to access treatment otherwise.

For Cote, that’s enough. “I’m using the electricity of AI to save my life,” he said.
