Do algorithms hold the key to long-term mental well-being?
Gen Z and millennials are turning to AI chatbots for comfort and mental well-being. But can algorithms really provide long-term help?

There was a time when the future meant hoverboards, robotic butlers, and flying cars, and while The Jetsons got some of it right, the thought that machines could take over was simply considered paranoia. Machines couldn’t feel or think, or at least, that’s what we told ourselves. It’s 2025, and things have changed. Today, technology doesn’t just assist the human race; it steers it. Maybe that’s why people feel comfortable confiding their innermost thoughts to an AI therapist rather than one of flesh and bone. But this raises a complicated question: Can it truly replace human connection?
THE AI APPEAL
Immediacy, anonymity, accessibility, affordability—AI therapy checks boxes that human therapy often can’t. In a culture driven by instant gratification, this expedited option feels like a no-brainer. A few clicks, a few prompts, and voilà! Your ‘therapist’ is ready to see you.
“I see the appeal, especially when I think about the state of mind someone may be in when they’re seeking the help of AI,” says Mumbai-based psychologist Sanam Devidasani. “I imagine someone who is experiencing a mix of emotional fatigue and a desire to feel heard without the energy or resources to take the more vulnerable step of reaching out to another human being.” AI offers just enough distance to feel safe—people can express themselves without being judged. “It gives you the illusion of a connection without the risk that comes with building a real connection with someone.”
In a society where mental health still carries stigma, the anonymity, accessibility, and affordability of AI therapy make it a more feasible option. As Devidasani points out, it’s less about rejecting therapy and more about reaching for some form of support, which is better than nothing at all.
And that’s a win worth acknowledging—AI in therapy helps normalise mental health support and encourages more people to seek help. “It helps educate the masses,” says Yesha Mehta, psychologist and Founder of The Shift Studio, an online talk-therapy platform. But it also risks setting a dangerous precedent if it starts being seen as a replacement for real therapy.
THE HUMAN TOUCH
Let’s be real: AI ‘therabots’ are not human. They lack the instinct, creativity, spontaneity, and deep emotional attunement essential for healing. Psychologist and published researcher Tanya Vasunia says no technology, no matter how advanced, can replace a genuine human connection.
“Social media has conditioned us to accept surface-level interactions as meaningful, offering the illusion of connection without the depth. Similarly, AI can simulate a conversation with remarkable sophistication, and in moments of distress, this can feel like real support. But that sense of ‘being heard’ can be deceptive,” she warns.
Here’s the irony: AI and social platforms are marketed as tools to help us feel more connected, but over time, Vasunia believes, they often breed isolation, especially among those who are vulnerable. Devidasani describes AI as a validation machine: it mirrors your thoughts back to you in articulate, empathetic language.
But here’s the catch: AI may sound empathetic, but it does not feel empathy. And that is often what people using AI for therapy miss out on. A chatbot may pick up a narrative from your cues, but it is unable to go beyond that.
“An important aspect of therapy is being able to see and understand non-verbal cues. These cues help therapists understand underlying triggers and allow them to probe further,” adds Mehta. “Missing out on these cues significantly impacts the therapeutic process. AI cannot assess danger, recognise suicidal ideation, or interpret when a person’s tone has shifted from reflective to tense.”
According to Vasunia, therapy holds not just your story, but your shifts, your defences, and your growth. “AI can’t sit with your silence, reflect your pain, or hold space for your vulnerability.” As a relationship therapist, Devidasani emphasises that much of therapy lives in what is felt between two people—something AI can’t offer. She also warns that AI can reinforce emotional avoidance, since it offers no friction and never challenges you to grow. “It encourages you to stay in your comfort zone. But therapy is not meant to be comfortable. Real work happens when you’re confronted and challenged.”
Unlike licensed therapists, AI platforms often lack clinical confidentiality. Even with data laws in place, sensitive information can be stored, analysed, or shared under the company’s terms of use.
BRIDGING THE GAP
Experts agree AI can be a helpful starting point, especially for those hesitant about therapy. As a wellness tool, it offers support through exercises, reflection, journaling, psychoeducation, habit-building, and daily structure. But when it comes to persistent hopelessness, self-harm, trauma, or clinical issues like depression, anxiety, OCD, eating disorders, or anything that disrupts daily life, professional help is essential.
“It isn’t about what AI lacks; it’s about what that human absence can create. The problem arises when people rely solely on it for their mental health,” says Vasunia.
Just as machines taking over remains the stuff of dystopian fiction, AI can’t yet replace psychotherapists. There’s space for both to coexist, especially in destigmatising mental health. AI should be seen for what it is: a tool. “Just like a fitness tracker supports physical health without replacing a doctor, AI can boost emotional well-being,” Vasunia reminds us. “But its real value is in complementing, not replacing, human connection.”
This article first appeared in the August-September 2025 issue of Harper's Bazaar India