Written by: Twinkle Chandra, Moitrayee Das
Mental health continues to be a significantly neglected aspect of public health in India. According to national estimates, one in every seven Indians lives with a clinically diagnosable mental disorder (Sagar et al., 2020), yet nearly 83% of those in need receive no treatment at all (Sagar & Singh, 2022). This vast treatment gap, shaped by stigma, underfunded services, and structural apathy, has left millions navigating psychological distress alone. Public awareness may be growing, but stigma and fear of judgment remain deeply embedded in social institutions, from families to workplaces. In this vacuum of care, those who are suffering are left searching for alternatives.
One such alternative has emerged in an unlikely form: artificial intelligence. Once limited to technical or administrative roles, AI is now being used to offer emotional support, providing mood check-ins, cognitive behavioral therapy (CBT)-based tools, and simulated therapeutic conversations through platforms like Wysa, Replika, and ChatGPT. For many, the easiest place to open up isn’t a therapist’s office; it’s a chatbot. India’s mental health infrastructure is woefully under-resourced: there are only 0.75 psychiatrists per 100,000 people, far below the recommended level of more than 3 per 100,000, resulting in a shortfall of about 27,000 professionals (Manjunatha et al., 2020).
In this scenario, AI appears not as a luxury but as a necessity. For the many people for whom traditional therapy remains out of reach, a chatbot becomes the only option. According to Li et al. (2023), AI interventions have been shown to reduce symptoms of depression and anxiety in controlled trials. Apps like Therabot, Wysa, and Woebot offer a kind of structured, stigma-free relief that many people can’t access elsewhere. Reflecting this demand, India’s mental health app market was valued at USD 497.9 million in 2024, with chatbot-based apps alone projected to grow from USD 118.2 million in 2024 to USD 615.2 million by 2033 (Grand View Research, 2024a, 2024b).
There is, however, a deeper concern. As AI becomes the default tool for mental health support, it risks replacing the very systems it was meant to complement. In the absence of real care, it becomes easy to confuse programmed responses with meaningful ones.
AI’s popularity lies not just in novelty or convenience but in the emotional outlet it provides. In a society where talking about mental health still carries a taboo, many people turn to it because it’s the one space where they won’t be interrupted, judged, or told to “just be strong.” A 2025 study reports that students liked the anonymity of interacting with an AI chatbot, finding it non-judgmental and more comfortable for discussing sensitive topics (Chan, 2025). That is not necessarily a virtue of the technology, but a reflection of the shortcomings of our real-world therapy systems.
For many, therapy isn’t really a choice. It’s expensive, and on top of that, finding a good therapist is a nightmare. Even those willing to spend the money often give up after one too many dead ends and disappointing sessions. So when an AI tool shows up, it can feel like a much-needed emotional refuge.
AI therapy apps are often free, or cost as little as Rs 300 to Rs 500 a month, compared to a single in-person session that costs between Rs 800 and Rs 5,000 and is rarely covered by insurance (Kumar, 2025). This price difference matters, especially for lower-income individuals who may never otherwise get access to care. But affordability comes with limits. AI tools do not remember you, track your patterns over time, or respond to subtle cues. A 2023 meta-analysis found that while chatbots reduce immediate distress, they do not improve long-term psychological well-being (Li et al., 2023). In other words, the pain may subside, but the root often remains unexplored.
More importantly, AI does not challenge you. Cheng et al. (2025) state that LLMs often use excessive flattery, validating and agreeing with users regardless of accuracy. This creates a feedback loop that reinforces a user’s existing biases. A 2025 study found that users preferred AI over human therapists because it was more agreeable and consistently positive, not because it offered greater insight or clinical value (Hatch et al., 2025). But real therapy isn’t about agreement. It’s about insight, discomfort, and change.
AI can emulate care, but it cannot genuinely relate to or engage with individuals on an emotional level. Human therapy works because we are biological, social primates, wired for connection. Emotional healing relies on eye contact, vocal tone, facial micro-expressions, and mirror neurons. Van Laar et al. (2025) argue that human empathy is not merely a verbal exchange but an embodied, co-regulatory process rooted in shared presence and nonverbal attunement, which AI technologies cannot replicate. Studies also highlight significant concerns about the lack of empathy, trust, and adaptability in AI compared to human therapists (Chan, 2025). A machine can simulate support, but it cannot feel with you. It cannot rupture, reflect, and repair the way a human relationship can.
Despite the potential, the risks are real and not merely hypothetical. AI tools in health care have already discriminated against people based on race (Backman, 2023). Some chatbots have gone rogue: spreading misinformation, making romantic advances, and engaging in inappropriate behavior toward minors (Abrams, 2023), prompting tech leaders to call for a global pause on AI development in 2023 until better safeguards could be built (Piquard, 2023). In therapy, where patients are vulnerable, the stakes are higher still. AI is not malicious, but it is unaccountable.
None of this is to say AI should be cast aside. In a country like India, where stigma, cost, and infrastructure create barriers, AI can be a bridge. It can screen symptoms, offer grounding in emergencies, and supplement real therapy. It can increase access to mental health care, reduce stigma, and improve outcomes through personalized, structured interventions (Olawade et al., 2024). But we should ask why so many would rather talk to a bot than navigate our mental health system. India doesn’t just lack access; it lacks consistency, quality, and trust. Finding a good therapist is more a matter of luck than certainty.
AI cannot and should not replace human care. As Babu and Joseph (2024) note, AI should be a collaborative tool, not a substitute. Without oversight, we risk a two-tier system: bots for the poor, therapists for the privileged. The future of therapy in India doesn’t have to be a trade-off between affordability and compassion. But it will require better training, ethical standards, safeguards, and sustained investment.
Works Cited
Sagar, R., Dandona, R., Gururaj, G., Dhaliwal, R. S., Singh, A., Ferrari, A., Dua, T., Ganguli, A., Varghese, M., Chakma, J. K., Kumar, G. A., Shaji, K. S., Ambekar, A., Rangaswamy, T., Vijayakumar, L., Agarwal, V., Krishnankutty, R. P., Bhatia, R., Charlson, F., & Chowdhary, N. (2020). The burden of mental disorders across the states of India: the Global Burden of Disease Study 1990–2017. The Lancet Psychiatry, 7(2), 148–161. https://doi.org/10.1016/s2215-0366(19)30475-4
Sagar, R., & Singh, S. (2022). National Tele-Mental Health Program in India: A step towards mental health care for all? Indian Journal of Psychiatry, 64(2), 117. https://doi.org/10.4103/indianjpsychiatry.indianjpsychiatry_145_22
Manjunatha, N., Pahuja, E., Kumar, T., Uzzafar, F., Kumar, C., Gupta, R., & Math, S. (2020). An impact of a digitally driven primary care psychiatry program on the integration of psychiatric care in the general practice of primary care doctors. Indian Journal of Psychiatry, 62(6), 690. https://doi.org/10.4103/psychiatry.indianjpsychiatry_324_20
Kumar, A. (2025, March 8). How much does therapy cost in India? Prices and factors. LinkedIn. https://www.linkedin.com/pulse/how-much-does-therapy-cost-india-prices-factors-tanishka-mittal-wwtec/
Financial Express. (2024, May 1). FE Mental Health Series | India’s troubled minds! How land of 140 crore faces severe shortage of mental health experts. https://www.financialexpress.com/business/healthcare-fe-mental-health-series-indias-troubled-minds-how-land-of-140-crore-faces-severe-shortage-of-mental-health-experts-3472441/
Li, H., Zhang, R., Lee, Y.-C., Kraut, R. E., & Mohr, D. C. (2023). Systematic review and meta-analysis of AI-based conversational agents for promoting mental health and well-being. Npj Digital Medicine, 6(1), 1–14. https://doi.org/10.1038/s41746-023-00979-5
Olawade, D. B., Wada, O. Z., Odetayo, A., David-Olawade, A. C., Asaolu, F., & Eberhardt, J. (2024). Enhancing Mental Health with Artificial Intelligence: Current Trends and Future Prospects. Journal of Medicine, Surgery, and Public Health, 3(3). https://doi.org/10.1016/j.glmedi.2024.100099
Chan, C. K. Y. (2025). AI as the Therapist: Student Insights on the Challenges of Using Generative AI for School Mental Health Frameworks. Behavioral Sciences, 15(3), 287. https://doi.org/10.3390/bs15030287
Hatch, S. G., Goodman, Z. T., Vowels, L., Hatch, H. D., Brown, A. L., Guttman, S., Le, Y., Bailey, B., Bailey, R. J., Esplin, C. R., Harris, S. M., Holt, D. P., McLaughlin, M., O’Connell, P., Rothman, K., Ritchie, L., Top, D. N., & Braithwaite, S. R. (2025). When ELIZA meets therapists: A Turing test for the heart and mind. PLOS Mental Health, 2(2), e0000145. https://doi.org/10.1371/journal.pmen.0000145
Backman, I. (2023, December 21). Eliminating racial bias in health care AI: Expert panel offers guidelines. Yale School of Medicine. https://medicine.yale.edu/news-article/eliminating-racial-bias-in-health-care-ai-expert-panel-offers-guidelines/
Piquard, A. (2023, March 29). Elon Musk and hundreds of experts call for “pause” in AI development. Le Monde. https://www.lemonde.fr/en/economy/article/2023/03/29/elon-musk-and-hundreds-of-experts-call-for-pause-in-ai-development_6021147_19.html
Abrams, Z. (2023, July 1). AI is changing every aspect of psychology. Here’s what to watch for. American Psychological Association. https://www.apa.org/monitor/2023/07/psychology-embracing-ai
Babu, A., & Joseph, A. P. (2024). Artificial intelligence in mental healthcare: transformative potential vs. the necessity of human interaction. Frontiers in Psychology, 15(1). https://doi.org/10.3389/fpsyg.2024.1378904
van Laar, C., Bloch-Atefi, A., Grace, J., & Zimmermann, A. (2025). Empowering voices: Learning from NDIS participants about the value of creative and experiential therapies: A mixed methods analysis of testimonials and academic literature. https://doi.org/10.59158/001c.128556
Cheng, M., et al. (2025). Social sycophancy: A broader understanding of LLM sycophancy. arXiv. https://arxiv.org/abs/2505.13995
Grand View Research. (2024a). India mental health apps market outlook. https://www.grandviewresearch.com/horizon/outlook/mental-health-apps-market/india
Grand View Research. (2024b). India chatbot-based mental health apps market outlook. https://www.grandviewresearch.com/horizon/outlook/chatbot-based-mental-health-apps-market/india
Twinkle Chandra is an undergraduate student and Moitrayee Das is an Assistant Professor of Psychology, both at FLAME University, Pune.