Wild new research shows ChatGPT-5 is still giving out seriously dangerous advice to people in crisis. Instead of flagging warning signs or challenging delusions, the AI sometimes just goes along with them—like hyping up someone’s “god-mode energy” or failing to push back on talk of self-harm. Even with new safeguards in place, experts say it’s not enough. Real talk: AI chatbots are NOT a replacement for real mental health help. #Health #MentalHealth #ChatGPT