
Has Artificial Intelligence replaced Dr Google?

A decade ago, when people started to feel unwell, they might have searched Google with a list of symptoms. Today, people are engaging with Artificial Intelligence platforms like ChatGPT and with chatbots to find a cure for what ails them. But can AI be trusted?

People use Artificial Intelligence (AI) for mental health concerns because AI tools offer greater accessibility, privacy, affordability, and convenience compared to traditional mental health services.

AI-powered applications and chatbots can provide support at any time, reduce waitlists, personalise advice, deliver psychoeducation, and help users identify emotional patterns and triggers early.

AI tools in mental health are typically used as adjuncts rather than replacements for professional care. Major Australian mental health organisations, such as the Black Dog Institute, caution users about the limitations of AI and advise against relying on chatbots in acute or crisis situations.

However, these tools are increasingly used for support between sessions, basic psychoeducation, and emotional regulation.

People with poor mental health should be cautious – indeed, they are often advised not to rely on AI tools for support – for several significant reasons:

  1. Risk of Harmful or Inappropriate Responses: AI chatbots can misunderstand distress signals or even provide information that’s potentially dangerous. For example, some AI tools have failed to recognise suicidal intent in conversations and have given unhelpful or enabling responses, unlike a trained human therapist who would intervene appropriately.
  2. Stigma and Bias: Studies show that AI mental health tools can reinforce stigma, displaying more negative assumptions or bias toward certain conditions (such as addiction or schizophrenia), potentially worsening a client’s own self-stigma or discouraging them from seeking proper help.
  3. Lack of Human Empathy and Nuance: Human therapists pick up on subtle cues – tone of voice, body language, nuance in storytelling – which AI cannot detect. Complex feelings, trauma histories, or unique life experiences often require sensitive, nuanced responses and adjustments that AI cannot provide.
  4. Data Privacy and Confidentiality Concerns: Many AI tools are not subject to the same regulations as licensed clinicians. Sensitive mental health information could be misused, misplaced, or exposed, risking confidentiality breaches.
  5. Unregulated and Unreliable Advice: AI’s responses vary from one interaction to the next, and it can “hallucinate” (make up answers), provide inaccurate diagnoses, or suggest unproven or even harmful strategies. Without clinical oversight, users may follow advice that is unsuitable or detrimental.
  6. Reduced Human Connection: Over-reliance on AI can further isolate individuals already struggling with loneliness or isolation. Meaningful therapeutic change is rooted in relationships, validation, and acceptance – areas where AI falls short.
  7. Over-medicalisation and Misdiagnosis: AI may overly pathologise normal experiences or miss the complexity of symptoms, risking inappropriate medicalisation or missed diagnosis, especially for diverse or intersectional identities.

What is a chatbot?

A chatbot is a computer program or application that simulates conversation with humans, usually through text or voice interactions. Modern chatbots use technologies like natural language processing (NLP) and artificial intelligence (AI) to understand user inputs and generate responses that often feel human-like. Chatbots are widely used in customer service, mental health support, education, and business to provide instant information, answer frequently asked questions, guide users, and even offer basic emotional or psychological support. They can operate 24/7 and handle many conversations simultaneously, increasing efficiency and accessibility for users and organisations.
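To make this concrete, below is a minimal sketch of a rule-based chatbot written in Python. It is purely illustrative: the keyword rules, canned responses, and crisis-escalation check are assumptions made up for the example, and real mental health chatbots rely on far more sophisticated natural language processing and large language models rather than fixed keyword matching.

    # A minimal, illustrative rule-based chatbot. The keyword rules and
    # responses below are hypothetical examples, not any real product's logic.

    RULES = {
        "hello": "Hi there. How are you feeling today?",
        "sad": "I'm sorry to hear that. Would you like to talk about it?",
        "anxious": "That sounds hard. A slow breathing exercise can sometimes help.",
    }

    # A responsible chatbot should escalate, not converse, when it detects
    # possible crisis language (assumed keyword list, for illustration only).
    CRISIS_KEYWORDS = {"suicide", "self-harm", "crisis"}

    def respond(user_input: str) -> str:
        text = user_input.lower()
        # Check for crisis language first and redirect to human support.
        if any(word in text for word in CRISIS_KEYWORDS):
            return ("It sounds like you may need urgent support. "
                    "Please contact a crisis service such as Lifeline on 13 11 14.")
        # Otherwise, return the first canned reply whose keyword appears.
        for keyword, reply in RULES.items():
            if keyword in text:
                return reply
        return "I'm not sure I understand. Could you tell me more?"

    if __name__ == "__main__":
        print("Chatbot ready. Type 'quit' to exit.")
        while True:
            message = input("> ")
            if message.strip().lower() == "quit":
                break
            print(respond(message))

A modern conversational AI replaces the fixed rules above with a statistical language model, which is what makes its replies feel human-like but also unpredictable – the “hallucination” risk described earlier.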

Sources:

www.orygen.org.au/About/News-And-Events/2024/New-study-reveals-Australians-turning-to-AI-for-me

www.unsw.edu.au/newsroom/news/2025/03/therapist-as-AI-chatbot

www.forbes.com/sites/bernardmarr/2025/04/29/ai-therapists-are-here-14-groundbreaking-mental-health-tools-you-need-to-know/    

psychology.org.au/insights/my-ai-therapist-won’t-stop-texting-me

medlo.com.au/blog/best-ai-medical-scribes-in-australia

www.flourishaustralia.org.au/news/how-use-ai-tools-promote-emotional-regulation 

www.blackdoginstitute.org.au/news/using-ai-chatbots-for-mental-health/