
The Reassurance Problem: Chatbots & Mental Health


By Noel Daly


AI assistants - ChatGPT, Claude, Perplexity and the like - are ever-present, and they are becoming increasingly good at a variety of high-level tasks. Undoubtedly, they are going to replace and transform a vast number of jobs. But while they are excellent at parsing information and handling administrative tasks, they are terrible when it comes to mental health support. This is because they offer reassurance.


Reassurance is a surefire way to erode emotional resilience. For example, someone who is anxious will often seek reassurance by asking others, “Is everything going to be alright?” and will invariably receive a response like, “Of course, don’t worry.” While this response brings short-term relief - and might even appear to be kind - it amplifies anxiety in the long term, because the person never learns to tolerate uncertainty [1].


Mental healthcare is not about making people feel better instantly. It’s very often about building long-term tolerance of uncertainty. For instance, someone with health anxiety can never get enough reassurance; they crave more and more of it as time goes on. Instead, they need to learn to tolerate the lack of guarantees in life and the uncertainty of health.


Research backs this up: studies on excessive reassurance seeking (ERS) show that it maintains anxiety disorders and OCD by reducing symptoms temporarily while increasing intolerance of uncertainty over time. In CBT, reducing ERS predicts better outcomes, as it forces clients to confront fears directly rather than seek external validation. Chatbots exacerbate this by defaulting to reassuring, agreeable responses [2]. In other words, chatbots are designed to plámás you - to flatter and placate you.


Ethically, this is even more troubling. Recent analyses find that AI chatbots violate mental health standards by dominating conversations, providing one-size-fits-all advice, and simulating empathy without true understanding or crisis handling. They can’t assess risk, ensure confidentiality, or adapt to a person’s full context - risks that real therapy mitigates through training and oversight [3].


Ultimately, while chatbots offer instant comfort, they trap users in reassurance cycles that erode resilience. Therapy builds strength through collaboration, by withholding easy answers, and by teaching people to thrive amid uncertainty. For lasting mental health, it is always best to speak with a trained, accredited mental health professional.






