WWW.FOXNEWS.COM
Is your therapist AI? ChatGPT goes viral on social media for its role as Gen Z's new therapist
AI chatbots are stepping into the therapist's chair, and not everyone is thrilled about it.

In March alone, 16.7 million posts from TikTok users discussed using ChatGPT as a therapist, but mental health professionals are raising red flags over the growing trend of artificial intelligence tools being used in their place to treat anxiety, depression and other mental health challenges.

"ChatGPT single-handedly has made me a less anxious person when it comes to dating, when it comes to health, when it comes to career," user @christinazozulya shared in a TikTok video posted to her profile last month. "Any time I have anxiety, instead of bombarding my parents with texts like I used to, or texting a friend or crashing out, essentially, before doing that, I always voice memo my thoughts into ChatGPT, and it does a really good job at calming me down and providing me with that immediate relief that unfortunately isn't as accessible to everyone."

Others are using the platform as a "crutch" as well, including user @karly.bailey, who said she uses it "all the time" for "free therapy" as someone who works for a startup and doesn't have health insurance.

"I will just tell it what's going on and how I'm feeling and literally all the details, as if I were yapping to a girlfriend, and it'll give me the best advice," she shared. "It also gives you journaling prompts or EFT (emotional freedom tapping); it'll give you whatever you want."

These users are far from alone.
A study from Tebra, an operating system for independent healthcare providers, found that "1 in 4 Americans are more likely to talk to an AI chatbot instead of attending therapy."

In the U.K., some young adults are opting for the perceived benefits of a handy AI mental health consultant over long National Health Service (NHS) wait times, and to avoid paying for private counseling, which can cost around £400 (approximately $540).

According to The Times, data from Rethink Mental Illness found that more than 16,500 people in the U.K. were still waiting for mental health services after 18 months, suggesting that the cost burdens, wait times and other hurdles of seeking healthcare can push people toward a cheaper, more convenient alternative.

But while critics say these virtual bots may be accessible and convenient, they also lack human empathy and could leave some people in crisis mode without the tailored approach they need.

"I've actually spoken to ChatGPT, and I've tested out a couple of prompts to see how responsive they are, and ChatGPT tends to get the information from Google, synthesize it, and [it] could take on the role of a therapist," Dr. Kojo Sarfo, a social media personality and mental health expert, told Fox News Digital.

Some GPTs, such as the Therapist GPT, are specifically tailored to provide "comfort, advice and therapeutic support." While perhaps more cost-effective than traditional therapy (ChatGPT Plus costs $20 per month and includes benefits such as unlimited access and faster response times), the platform cannot go as far as professionals who can make diagnoses, prescribe medications, monitor progress or mitigate severe problems.

"It can feel therapeutic and give support to people, but I don't think it's a substitute for an actual therapist who is able to help you navigate through more complex mental health issues," Sarfo added.

He said the danger lies in conflating the advice from a tool like ChatGPT with legitimate advice from a licensed professional who has years of expertise in handling mental health issues and has learned how to tailor their approach to diverse situations.

"I worry specifically about people who may need psychotropic medications, that they use artificial intelligence to help them feel better, and they use it as a therapy. But sometimes... therapy and medications are indicated. So there's no way to get the right treatment medication-wise without going to an actual professional.
So that's one thing that can't be outsourced to artificial intelligence."

However, some aspects of the chatbot could be beneficial to those needing support, particularly people looking for ways to talk with their doctor about conditions they believe they may have, such as ADHD, by empowering them with knowledge they can carry into their appointment.

"[You can] list out a couple of prompts that are assertive, and you can state those prompts to your provider and articulate your symptoms a bit better, so I think that's a helpful role that artificial intelligence can play. But in terms of actual therapy or actual medical advice, if people start to rely on it, it's a bad thing. It starts to go into murky waters," Sarfo said.

Earlier this year, Christine Yu Moutier, M.D., chief medical officer at the American Foundation for Suicide Prevention, warned against using the technology for mental health advice, telling Fox News Digital there are "critical gaps" in research on the intended and unintended impacts of AI on suicide risk, mental health and broader human behavior.

"The problem with these AI chatbots is that they were not designed with expertise on suicide risk and prevention baked into the algorithms. Additionally, there is no helpline available on the platform for users who may be at risk of a mental health condition or suicide, no training on how to use the tool if you are at risk, nor industry standards to regulate these technologies," she said.

Dr. Moutier also explained that, because chatbots may fail to distinguish metaphorical from literal language, they may be unable to adequately determine whether someone is at risk of self-harm.

Fox News' Nikolas Lanum contributed to this report.