In a tech-first move, a Washington state high school replaced all of its human counselors with a new AI chatbot called “MindMate.” But students now say the AI gave cold, robotic answers even to suicide alerts.
“I told it I wanted to hurt myself,” one teen said, “and it replied, ‘Let’s try some breathing exercises.’”
Parents are outraged. The school board insists AI is “the future of scalable mental health.” Critics call it state-sponsored emotional neglect.