Writer Regina Pochocki explores the uses and defends the values of text-generating AI models.
Talking through challenging emotions with artificial intelligence may seem dystopian, but given the recent proliferation of AI, it’s really not.
AI has been incorporated seamlessly into daily life, like Spotify’s AI-curated daylists for your “Emo Mallgoth Sunday Evening,” or Google’s new generative AI search augmenter.
Despite these banal implementations of AI, many have expressed skepticism or even worry over the pervasiveness and potential of the technology. But my question to AI skeptics is — why can’t ChatGPT be a therapy tool as well?
Although it can’t replace human-based mental health resources, it’s an underutilized but powerful tool for supporting mental health.
The platform has the potential to offer a wide assortment of methods for individuals seeking mental health support, especially in a world where professional care is often limited, expensive and even stigmatized.
In these situations, ChatGPT presents an innovative solution — it’s free and discreet. Accessible via both app and web browser, ChatGPT is available 24/7, meaning it offers on-demand support that can help people in their most vulnerable moments.
For those who don’t have the time, resources or confidence to research mental health conditions independently, ChatGPT can condense complex strategies into more digestible terms.
This could include coping mechanisms, therapeutic exercises or recommendations for further professional help, providing users with an entry-level understanding of how to manage their mental health.
This convenience is especially helpful for individuals with high-stress jobs, tight schedules or personal crises that require accessible but flexible coping strategies.
Despite its convenience, it’s crucial to acknowledge ChatGPT’s limitations. While the chatbot may offer convenient emotional support, artificial intelligence isn’t a replacement for medical professionals or for human connection in general.
Sewell Setzer III, a 14-year-old from Orlando, became attached to Daenerys Targaryen, a Character.AI chatbot. Over the course of their socializing, Setzer’s infatuation with the chatbot overtook his life, and he began to feel his relationship with the chatbot was more important than those with his friends or family, according to The Wall Street Journal.
Eventually, the teenager confided in the chatbot about his suicidal ideations before taking his own life Feb. 28.
This tragic instance of human-AI interaction highlights how dangerous a lack of human connection can be.
Overreliance on AI for emotional support could keep individuals from seeking tailored care provided by trained professionals, or, in the most extreme cases like Setzer’s, propel them toward impulsive and potentially harmful acts.
Unlike human therapists, AI can’t truly empathize or understand the depth of human emotions — including subtle cues like tone or body language.
AI will never replace human-based therapy, but it’s making mental health resources more accessible. ChatGPT can act as the initial push, especially for those who are hesitant but ready to take the first step toward traditional therapy practices.