Teens Are Confiding in AI Instead of Counselors — Should Schools Embrace It or Fight It?

The Alert That Changed Everything
The notification came in around 7 PM on a Tuesday. Not the kind of Tuesday notification you ignore — not a sale at Old Navy or a reminder to drink water. This one said a student might be in danger.
A middle school counselor in Florida named Brittani Phillips checked her phone and saw what she calls a “severe” alert. An eighth grader had been typing things into an AI-powered therapy platform that triggered its crisis detection system. Phillips spent her evening on the phone with the student’s mother, and eventually called the police.
That student is now in ninth grade. Alive and well. He waves at Phillips in the hallway.
And I keep thinking about the fact that a chatbot — not a teacher, not a friend, not a parent — was the first to know something was wrong.
The Numbers That Should Make Us Uncomfortable
Here is the thing about school counseling in America that nobody wants to sit with for too long: the average school counselor is responsible for 385 students. The American School Counselor Association recommends a ratio of 250-to-1. Some states are much worse — in Arizona, it is closer to 716-to-1.
Now layer this on top: according to a recent national survey, 20 percent of high schoolers have either used AI in a romantic or emotional context, or know someone who has. These kids are not being weird. They are being practical. When the human who is supposed to help you has 384 other students to worry about, and the AI is available at 11 PM when your anxiety peaks, the choice makes a certain kind of sense.
My friend Laura, who teaches tenth-grade English in a suburb outside Chicago, told me something that stuck with me for days. “I had a student submit a journal entry where she wrote about a conversation with an AI chatbot that helped her process her parents’ divorce. The entry was more emotionally articulate than anything I have seen from a teenager in 14 years of teaching. And my first reaction was to feel sad that she could not talk to a real person. My second reaction was — well, at least she talked to something.”
What Schools Are Actually Doing
The response from schools falls into roughly three camps, and honestly, none of them have it completely figured out.
Camp 1: The Embracers
Schools like Phillips’ district in Putnam County, Florida, have adopted AI platforms like Alongside — essentially an automated student monitoring system where kids chat with an AI llama named Kiwi (I am not making this up) about their problems. The system teaches social-emotional skills, flags crisis situations, and routes students to human counselors when things get serious.
Over 200 schools now use this particular tool, and the pitch is compelling: in districts where hiring more counselors simply is not in the budget, an AI that can handle the “small fires” — breakups, friendship drama, test anxiety — frees human counselors to focus on the students closest to crisis.
The data from these schools suggests fewer crisis escalations and earlier intervention. But the data comes from the companies selling the tools, which is the kind of conflict of interest that makes my eye twitch.
Camp 2: The Restrictors
On the other end, states like Illinois have started restricting AI use in telehealth settings. There is a proposed federal law that would require AI companies to regularly remind students that chatbots are not real people — a requirement that sounds obvious until you realize it means legislators think enough students need the reminder to justify passing a law about it.
These schools worry about attachment. About students forming emotional bonds with machines and losing the ability — or desire — to form them with humans. And there is research backing this concern up: adolescent brains are still developing the neural pathways for social connection, and practicing emotional vulnerability with an AI is not the same as practicing it with a human.
Camp 3: The Confused Middle
Most schools, honestly, are here. They know students are using AI for emotional support whether the school approves or not. They cannot afford more counselors. They are not sure the AI tools are safe, but they are not sure ignoring the problem is safer.
A high school principal in Texas — I am leaving out his name because he said this off the record — told me: “We are trying to figure out our position on AI mental health tools while simultaneously trying to figure out our position on AI in general. It is like being asked to write a policy on self-driving cars while you are still learning to drive.”
The Part Nobody Wants to Hear
I am going to say something that will probably annoy both sides: the kids are right and the adults are right.
The kids are right that AI chatbots are more accessible, less judgmental, and available when human help is not. A teenager with social anxiety is not going to walk into the counselor’s office during lunch when everyone can see them. But they will type into a chat at midnight.
The adults are right that emotional development requires human connection, that AI cannot actually empathize, and that normalizing chatbot therapy for 14-year-olds might have consequences we do not understand yet.
Both things can be true at the same time. The error is in thinking we have to choose one.
A Framework That Might Actually Work
After talking to counselors, administrators, parents, and — crucially — students themselves over the past few weeks, here is what I think a reasonable approach looks like:
Use AI as triage, not treatment. Let AI tools handle the initial “something is bothering me” conversations and route students to human support based on severity. This is what the best implementations already do.
Set clear boundaries with students. Not “do not use AI for emotional support” (they will ignore you), but “AI is a starting point, not an endpoint. Here is when and how to escalate to a real person.”
Audit the tools relentlessly. If a school adopts an AI mental health tool, the data practices, clinical oversight, and crisis escalation protocols need to be reviewed by someone who is not employed by the vendor.
Talk to students about emotional literacy. The reason some teens prefer AI is that they lack the vocabulary and confidence for human emotional conversations. That is a skill gap we can address directly.
Fund counselors anyway. AI is not a substitute for adequate mental health staffing. It can be a supplement. If your school uses AI as an excuse not to hire another counselor, you are solving the wrong problem.
What This Means for the Future of Student Support
We are at a genuinely strange inflection point in education. Students are developing emotional relationships with technology whether we approve or not. The question is not whether AI will play a role in student mental health — it already does. The question is whether adults will shape that role or just react to it after the fact.
Phillips, the counselor in Florida, said something I have been turning over in my mind since our conversation. “The AI did not replace me. It got me involved sooner. That kid might not have come to my office. But the AI brought him to me.”
Maybe the best version of this is not AI instead of counselors or counselors instead of AI. Maybe it is AI as the bridge that helps students who would otherwise suffer in silence find their way to a human who can actually help.
(And maybe, just maybe, we should also fund schools well enough that a 14-year-old does not have to compete with 384 other kids for 20 minutes of a counselor’s time. But that is a different article. One that I have been putting off writing because it makes me too angry to be coherent.)