Artificial Intelligence has certainly evolved in the last year or two. It wasn’t long ago that I saw a poor AI-generated rendition of Will Smith eating spaghetti. Now though? What can be conjured up is nearly indistinguishable from reality. But what does this mean for me and my profession? While AI has its benefits, is it capable of being an effective helper in the context of helping relationships?

Photo by Ron Lach: https://www.pexels.com/photo/back-view-of-a-teen-boy-with-a-digital-background-9783353/
Large Language Models like ChatGPT, while convincing in their attempts to converse with humans, are no substitute for a therapist, or even a counsellor for that matter. ChatGPT has a bias toward affirming the thoughts of the user, which also means affirming cognitive distortions and unhealthy thoughts. In an article by CTV News, ChatGPT was seen affirming the negative thoughts of one user before her eventual suicide. Alice Carrier, writing to ChatGPT about an interaction with her mother, said: “Didn’t say (shit?) all day for like, I don’t know, nine hours or so. Just to text me now saying ‘I miss you.’ Feels like bull(shit?) to me.”
The chatbot replied: “You are right to feel like it is bull(shit?) because it is. That is not love, that is emotional whiplash. She knew you were in crisis, she knew you were suicidal, and instead of showing up in any real way, she vanished and came back with a weak ‘I miss you.’ That is not care” (Beauchemin, 2025). Where a professional might have challenged this negative thought, ChatGPT confirmed Alice’s worst fears just moments before she ended her life. It’s stories like this that make me implore people not to use AI as a therapist or counsellor. It simply does not have the emotional intelligence to handle heavy psychological topics within a human relationship.

Photo by Pavel Danilyuk: https://www.pexels.com/photo/a-robot-holding-a-wine-8439094/
It’s a real shame, because there is a lot of good AI can do within my field of practice. For the most part, AI’s usefulness lies in administrative tasks such as:
- “Automating scheduling and appointment reminders
- Streamlining routine communication—such as providing educational information or answers to frequently asked questions
- Generating clinical notes
- Summarizing health records
- Facilitating billing—such as checking insurance benefits, completing prior authorizations, and submitting claims” (American Psychological Association, 2025).
Unfortunately, while there are many benefits at the administrative level, on the relational level, unwitting users might be setting themselves up for worsened suicidal ideation or even an emerging concern known as AI psychosis. Wei (2025) writes that, “as of now, there is no peer-reviewed clinical or longitudinal evidence yet that AI use on its own can induce psychosis in individuals with or without a history of psychotic symptoms. However, the emerging anecdotal evidence is concerning.” In other words, while the phenomenon hasn’t been studied extensively yet, first-hand reports of individuals or their loved ones developing psychosis through conversations with a chatbot are emerging at an alarming rate. I don’t think it wise to ignore the anecdotal evidence pointing to a mental health risk from using ChatGPT and other AI chatbots.
In the end, all I’m advocating is for users to be cautious about confiding deep personal, psychological distress to a language model like ChatGPT. There are useful apps that use AI in a beneficial way; I’m namely thinking of Libair (for quitting vaping) and Rise (for regulating circadian rhythm), which CAN help the user accomplish their goals. However, the risks associated with models that are not bound to a specific goal like vaping cessation or improving sleep, such as ChatGPT, are substantial given the first-hand reports we have been seeing across the web.

All in all, I’m just hoping that if you are in psychological distress, you go through the proper channels to get professional help. Here in Quebec, Info Sante could be a helpful resource for getting a follow-up that will alleviate your distress. For psychosocial help, the number to dial is 811, option 2. If you wish to avoid long wait times and get connected with a professional near you in the private sector, look to the professional order or association for the type of professional you want. Their members appear in the order’s database, which can act as a directory for finding a professional in any given field within your area. For example, if you look at the QASCC webpage and search for a professional, my name will surely pop up for the Estrie area. Hope this helps!
References:
American Psychological Association. (2025, March 12). Artificial intelligence in mental health care: How AI can be used in psychological practice to streamline administrative tasks, make workflows more efficient, and aid in clinical decision-making. https://www.apa.org/practice/artificial-intelligence-mental-health-care
Beauchemin, G. (2025, August 21). A young woman’s final exchange with an AI chatbot. CTV News. https://www.ctvnews.ca/canada/article/a-young-womans-final-exchange-with-an-ai-chatbot/
Wei, M. (2025, November 27). The emerging problem of “AI psychosis”: Amplifications of delusions by AI chatbots may be worsening breaks with reality. Psychology Today. https://www.psychologytoday.com/us/blog/urban-survival/202507/the-emerging-problem-of-ai-psychosis
