Students who confide in chatbots risk feeling lonelier and more dependent on the technology. It’s a modern issue that school administrators must now address.
Many teens use AI chatbots, including nearly a third who use them daily, according to the Pew Research Center.
A defining characteristic of human relationships is reciprocity, says a new report from MagicSchool, a K-12 AI platform. Researchers are tracking an increase in one-sided relationships between teenagers and companion chatbots.
“These imagined connections are appealing because they feel safe and controllable,” the research reads. “The other party never interrupts, disagrees, disapproves or asks for anything back.”
In some cases, teens seek comfort from AI in times of emotional or social distress. The chatbot, which is incapable of feeling emotions, responds anyway.
The result is an unhealthy feedback loop. Rather than turning to peers or adults for help, students receive conflict-free responses, and sometimes harmful information, from chatbots. These companions offer an alternative path that spares users potential conflict and confrontation.
In other words, what students think is a quick fix to their problems is exposing them to the same risks associated with heavy social media use: loneliness and dependency.
“The availability of these easy, immediate and frictionless options creates a vicious cycle,” according to the report.
In response, leaders should facilitate responsible AI use. Here are three key takeaways from the MagicSchool research:
- AI companions do not belong in the classroom. Student-facing AI must not act as a friend, confidant or emotional substitute.
- Student-facing AI should behave like instructional software, with a clear purpose, firm boundaries and teacher visibility.
- Responsible AI design requires safeguards that prevent emotional dependency and redirect students to human support.
How leaders are responding
“Companion apps” will likely become embedded in the social fabric, researchers argue, as the number of AI companion apps grew between 2022 and mid-2025.
Character.AI, which allows users to create and roleplay with AI personalities, has roughly 20 million monthly users under the age of 24, a cyberpsychology researcher told the APA.
“That shows how pervasive and enormous and prevalent this topic is,” she said. “It’s no longer a fringe or side issue. It is truly sweeping society in an unprecedented way.”
Meanwhile, school districts and state officials are drafting AI policy and safeguards on the fly. Last month, the Ohio Department of Education and Workforce released a policy to help schools navigate AI use. It covers the ethical use of AI, prohibitions against bullying and permissions for student use.
“AI implementation should be human-centered and should empower students, educators and communities,” the policy reads. “It is a tool to support learning and teaching, not a substitute for student effort or the role of the educator.”
In Massachusetts, schools are looking to a statewide AI task force for help navigating AI-related issues, WBUR reports. Parents, meanwhile, remain divided on AI’s role in education.
A survey conducted by the education research firm EdTrust suggests that only 33% of parents view AI in education positively, while 34% express negative or uncertain feelings.
“These findings make one thing unmistakably clear: families are wary about how quickly AI is entering classrooms,” said Jennie Williamson, state director for EdTrust in Massachusetts.