Written by: Muskaan Grover, Moitrayee Das
In the past two years, the role that AI plays in my life has changed dramatically. ChatGPT, the first of many generative AI platforms I used, was my key to finding citations in college. Now, as a working professional, it plays multiple roles: an editor, a proofreader, a friend I bounce pitch ideas off, and a coach who helps me plan my future.
AI chatbots have existed for some time, bringing their own set of issues, such as false positives, regulatory gaps, data privacy, and consent. Recently, however, amid a stressful evening filled with anxiety, I turned to the ChatGPT app on my phone. It felt like a non-judgmental outlet that could potentially give me actionable advice or research-backed reasons for my feelings: essentially, a logic-based response that could snap me out of my state without bias.
While it did exactly that, to my surprise, the suggestions were coupled with words of affirmation I have often heard therapists use. It is not shocking that popular phrases used to express empathy are readily available online. However, it raises the question: if AI can mimic advice received from mental health professionals and (to a degree) provide comfort, what would this mean for the industry?
There is no denying that many still carry the weight of stigma when it comes to traditional therapy. For those reluctant to visit a counselor or therapist, whether due to fear of societal judgment or the discomfort of sharing struggles with another person, an accessible digital companion can feel like a solution. Studies indicate that AI chatbots can act as a safe (temporary) solution (Fitzpatrick et al., 2017; Vaidyam et al., 2019). In addition, research has shown that before reaching out to human professionals, individuals may find these digital interactions with conversational agents comforting (Provoost et al., 2017).
Early studies suggest that digital interventions can provide relief from symptoms of anxiety by employing techniques similar to cognitive-behavioral therapy (Fitzpatrick et al., 2017; Fulmer et al., 2018). Chatbots can deliver structured, albeit algorithmic, responses that offer a semblance of empathy and understanding (Fulmer et al., 2018; Vaidyam et al., 2019). However, these interventions are not without challenges. The advice provided by AI can often be too generic or out of context, failing to capture the nuanced, personal nature of an individual's emotional distress (Provoost et al., 2017; Hoermann et al., 2017).
However, in the current age of digital evolution, AI-powered mental health support is not confined solely to specialized chatbots. Platforms such as ChatGPT and Claude on our smartphones serve as quick tools, offering comfort and solutions in the palm of our hand. Whether one is navigating a difficult conversation at work or feeling overwhelmed by personal challenges, these AI channels have the increasing potential to be adopted as immediate, on-demand stress busters. Even before the advent of large language models, conversational assistants like Siri and Cortana were used for similar purposes (Miner et al., 2016).
Despite the growing number of AI tools, it must be remembered that mental health concerns are complex, with each situation being unique and shaped by an individual's history, emotions, and experiences, which no algorithm can fully decipher. While AI chatbots and conversational platforms can give us a space to express our stress, worry, or sadness, and may provide temporary relief in the moment, they cannot comprehend or rightly address deeper mental health concerns such as psychosis, manic or depressive episodes, or any other condition truly plaguing an individual's life. Thus, while AI might offer a temporary fix, it cannot replace the human touch and the lens of genuine empathy and expertise essential for understanding and resolving the intricacies of our inner dialogue. To put it simply, the human condition is best understood by another human (at least that is what we know for now).
Another pressing concern in the digital age is data privacy. Research has noted that the rapid evolution of technology is outpacing the rate at which ethical guidelines are developed to protect our data (Torous & Nebeker, 2017). Additionally, while the implications of language models for data privacy are still being understood, an analysis of GPT-3 highlighted the need for robust ethical guidelines and regulatory frameworks to govern the deployment and further development of these AI systems, ensuring that their benefits do not come at an unacceptable ethical cost (Floridi & Chiriatti, 2020).
Looking ahead, as AI continues to evolve and integrate more seamlessly into our daily routines, its potential seems boundless. While no one can know how it will revolutionize our lives, in the context of mental health, I believe it will always be a step behind the nuanced care offered by a mental health practitioner.
References
Fitzpatrick, K. K., Darcy, A., & Vierhile, M. (2017). Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): A randomized controlled trial. JMIR Mental Health, 4(2), e19.
Floridi, L., & Chiriatti, M. (2020). GPT-3: Its nature, scope, limits, and consequences. Minds and Machines, 30(4), 681–694.
Fulmer, R., Joerin, A., Gentile, B., Lakerink, L., & Rauws, M. (2018). Using psychological artificial intelligence (Tess) to relieve symptoms of depression and anxiety: Randomized controlled trial. JMIR Mental Health, 5(4), e64.
Hoermann, S., McCabe, K. L., Milne, D. N., & Calvo, R. A. (2017). Application of synchronous text-based dialogue systems in mental health interventions: Systematic review. Journal of Medical Internet Research, 19(8), e267.
Huckvale, K., Torous, J., & Larsen, M. E. (2019). Assessment of the data sharing and privacy practices of smartphone apps for depression and smoking cessation. JAMA Network Open, 2(4), e192542.
Inkster, B., Sarda, S., & Subramanian, V. (2018). An empathy-driven, emotionally intelligent chatbot: Examining the potential of AI in mental health interventions. Frontiers in Psychiatry, 9, 561.
Miner, A. S., Milstein, A., & Schueller, S. M. (2016). Smartphone-based conversational agents and responses to questions about mental health, interpersonal violence, and physical health. JAMA Internal Medicine, 176(5), 619–625.
Provoost, S., Lau, H. M., Ruwaard, J., & Riper, H. (2017). Embodied conversational agents in clinical psychology: A scoping review. Journal of Medical Internet Research, 19(5), e151.
Torous, J., & Nebeker, C. (2017). Navigating ethics in the digital age: Introducing connected and digital mental health. NPJ Digital Medicine, 1, 1–7.
Vaidyam, A. N., Wisniewski, H., Halamka, J. D., & Torous, J. (2019). Chatbots and conversational agents in mental health: A review of the psychiatric landscape. Canadian Journal of Psychiatry, 64(7), 456–464.
Muskaan Grover is an Associate HR Consultant with an undergraduate degree in Psychology from FLAME University, Pune. Moitrayee Das is an Assistant Professor of Psychology at FLAME University, Pune.