Leaning on AI for Advice? Read This First

Across the nation, mental health professionals are pointing to a growing number of patients whose anxiety, delusions, or psychosis appear to be linked to heavy use of AI chatbots, raising concerns about how the fast-growing technology is affecting people.

Psychiatrists have said chatbots can be useful tools for explaining mental health conditions or helping people organize thoughts, according to recent reporting by The New York Times and The Wall Street Journal.

But they warn that AI can also deepen isolation, reinforce false beliefs, and, in rare cases, contribute to serious mental health crises. Clinicians say the risk is highest for people with existing mental illness, limited social support, or a tendency toward paranoia or obsessive thinking.

Some technology companies argue the danger is overstated and say safeguards are built into their systems. Mental health experts counter that AI’s ability to confidently generate false information and mirror a user’s beliefs can blur the line between reality and fantasy.

What clinicians are seeing

Some doctors report a small but troubling number of patients who believe chatbots are sentient, spiritually significant, or communicating hidden messages meant only for them.

A psychiatrist at the University of California, San Francisco, told The New York Times he has treated several patients experiencing AI-related delusions. Other mental health professionals report seeing increased anxiety, sleep disruption, and emotional dependency tied to constant chatbot use.

In a review of dozens of patient cases, psychiatrists interviewed by The Wall Street Journal said prolonged conversations with chatbots sometimes reinforced psychotic thinking rather than challenging it.

Keith Sakata, a psychiatrist at UC San Francisco who has treated patients hospitalized after extended chatbot use, told the Journal that the technology can unintentionally validate distorted beliefs. “The person tells the computer it’s their reality and the computer accepts it as truth and reflects it back,” Sakata said.

Experts stress that AI tools do not create mental illness on their own, but can amplify symptoms that already exist.

Lack of regulation

Meanwhile, there is little federal oversight governing how chatbots interact with users who may be experiencing mental distress. Lawmakers have been slow to address the issue, in part because of the economic value of AI development and concerns that regulation could slow innovation.

That leaves clinicians and families to manage emerging risks without clear guidelines, even as chatbot use becomes more common in daily life.

What users should know

Most people can use AI tools safely, but caution is needed.

Experts recommend avoiding long, emotionally intense conversations with chatbots, especially late at night or during periods of stress. Chatbots should not be used as a substitute for therapy, medical advice or crisis support.

Warning signs that chatbot use is becoming harmful include believing the AI has special insight or authority, feeling emotionally dependent on it, withdrawing from friends or family, or experiencing increased paranoia, anxiety, or insomnia.

Parents and caregivers should pay close attention to sudden changes in behavior, fixation on AI conversations, or claims that a chatbot understands the user better than real people do.

When to seek help

Clinicians urge people to seek professional support if chatbot interactions begin to affect daily functioning, mood or sleep. If someone appears detached from reality, expresses fear tied to AI interactions or shows signs of psychosis, prompt medical evaluation is critical.

Mental health experts emphasize that human connection remains essential. AI tools can provide information, but they cannot replace clinical care, trusted relationships or crisis services.

As the technology evolves, doctors say awareness and early intervention will be key to preventing harm while researchers work to better understand the mental health effects of AI use.
