    Schools are using AI counselors to track students’ mental health. Is it safe? | AI (artificial intelligence)

By onlyplanz_80y6mt | March 3, 2026 | 9 Mins Read
    ‘You can’t replace human connection, human judgment,’ warns Sarah Caliboso-Soto, a licensed clinical social worker. Illustration: Derek Abella/The Guardian

    The alert came around 7pm.

    Brittani Phillips checked her phone. A middle school counselor in Putnam county, Florida, Phillips receives messages from an artificial intelligence-enabled therapy platform that students use during nonschool hours. It flags when a student may be at risk for harming themself or others based on what the student types into a chat.

    Phillips saw that this was a “severe” alert for an eighth-grader.

So, Phillips spent her evening on the phone with the student’s mom, asking questions to figure out what was going on and how vulnerable the student was. Phillips also called the police, she says, noting that she tells students the chats are confidential until they can’t be.

    That was last school year, in the spring.

    “He’s alive and well. He’s in ninth grade this year,” Phillips says. She believes that the interaction built trust between her and the family. When the student passes her in the hall now, he makes a point to greet her, she adds.

    Navigating budget shortfalls and limited mental health staff, Interlachen Jr-Sr high school, where Phillips works, is using an AI platform to vet students’ mental health needs.

    Phillips’s district has used Alongside, an automated student monitoring system, for three years. It’s an example of the growing category of tools that are marketed to K-12 schools for similar purposes, with at least nine companies getting funding deals since 2022.

Alongside says its tool is used by more than 200 schools around the US and argues that its platform offers better services than typical telehealth options because it has a social and emotional skill-building chat tool – where students chat about their life problems with a llama called Kiwi that tries to teach them to build resilience – and its AI-generated content is monitored by clinicians. The system offers resource-strapped schools, especially in rural areas, access to critical mental health resources, company representatives say.

    AI is a main component of the Trump administration’s national education agenda. Yet, some parents, educators and, increasingly, lawmakers, are wary of increasing teens’ time in front of screens. States have also started restricting the use of AI in telehealth.

    Many experts and families also worry that students attach to AI too strongly. Even as a recent national survey found that 20% of high schoolers have used AI romantically or know someone who has, there’s significant interest in keeping students from emotionally connecting with bots. That even includes a proposed federal law that would force AI companies to remind students that chatbots aren’t real people.

    Still, in her job, Phillips says the tool her school uses is exceptional at putting out the “small fires”. With about 360 middle schoolers to support, having this tool to hand-hold them through the breakups and other routine problems they face allows her to focus her time with students nearing crisis. Plus, students sometimes find it easier to turn to AI for dealing with emotional problems, she says.

    On the digital couch

    Student nervousness plays into why they are comfortable confiding in these technologies, school counselors say.

Speaking with a mental health professional can be intimidating, especially for adolescents, says Sarah Caliboso-Soto, a licensed clinical social worker who serves as the assistant director of clinical programs at the University of Southern California Suzanne Dworak-Peck School of Social Work and the clinical director for the school’s trauma recovery center and telebehavioral health online clinic.

    There’s a generational component as well. For students who’ve grown up encountering chat interfaces through social media and websites, AI interfaces can feel familiar. And kids today find that it’s easier to text than call someone on the phone, says Linda Charmaraman, director of the Youth, Media & Wellbeing Research Lab at Wellesley Centers for Women.

    Using AI to work through emotions also allows students to avoid watching facial expressions, which they may worry will carry judgment, she adds. Also, chatbots are available at times when a human might not be, without the hassle of having to make an appointment, Charmaraman says.

    “It’s almost more natural than interacting with another human being,” Caliboso-Soto says.


In her work with a telehealth clinic, Caliboso-Soto has seen a rise in the use of crisis text and chat lines. The clinic doesn’t use AI of any kind, she says, but it often gets approached by companies looking to get AI into therapy sessions as notetakers.

    It’s not necessarily bad in Caliboso-Soto’s opinion. For resource-strapped schools, AI can be used “as a first line of defense”, regularly checking in with students and pointing them in the right direction when they need more help, she says.

    The starting price for a school to use Alongside’s services is about $10 per student a year, according to the company. Larger districts usually receive volume-based discounts.

But Caliboso-Soto worries about using AI as a substitute counselor. It lacks the discernment that clinicians bring when interacting with students, she notes. While large language models can be trained to notice symptoms in text, they cannot see or hear what a human clinician can when interacting with a student: the inflections of a voice, the movements of the body. Nor can they reliably catch subtle observations or behaviors. “You can’t replace human connection, human judgment,” she adds.

While AI can speed up the diagnostic process or free up time for school counselors, it’s crucial not to over-rely on it for mental health, says Charmaraman. The technology can miss some of the nuances that a human counselor would catch, and it can give students unrealistic positive reinforcement. Schools need to adopt a holistic approach that includes families and caregivers, she argues.

    Plus, if a school is increasingly using AI intervention to filter serious cases, it’s worth paying attention to whether students are having less frequent contact with clinically trained humans, Caliboso-Soto says.

    For its part, Alongside representatives say that the platform is not meant as a replacement for human therapy. The app is a stepping stone to seeking help from adults, says Ava Shropshire, a junior at Washington University who serves as a youth adviser for Alongside. She argues that the app makes mental health and social-emotional learning feel more normal for students and can lead them to seek out human help.

    Still, some students think it’s at best a Band-Aid.

    Social accountability

    “Can you think of another time in history when people have been so lonely, when our communities have been so weak?” asks Sam Hiner, executive director of Young People’s Alliance, a North Carolina-based organization that lobbies for more youth participation in politics and policymaking.

    During a time of economic upheaval, technology and social media have manipulated and isolated students from one another, and that’s led to a deep yearning for community and belonging, Hiner says.

    Students will get it wherever they can, even if that’s through ChatGPT, he adds.

    Young People’s Alliance released a framework for regulating AI that allows for some therapeutic uses of the technology.

    But in general, the organization is striving to rebuild the human community and is set against use of AI when it threatens to replace human companionship, Hiner says. “That’s a critical aspect of therapy and of living a fulfilled life and having social connection and having mental wellbeing,” he adds.

So for Hiner, the main concern is what’s called a “parasocial relationship”, when students develop a one-sided emotional attachment, especially when the technology enters schools for therapeutic purposes. It might be valuable to have an AI that can provide feedback or conduct analysis, even related to mental health, but Hiner says that the AI should not hint or convey that it has its own emotional state – for instance, by saying “I’m proud of you” to a student user – because that encourages attachment.

    Even though platforms often claim to decrease loneliness, they don’t really measure whether people are more connected and are more set up to live fulfilled, connected, happy lives in the long term, says Hiner: “All [tech platforms are] measuring is whether this bot is serving as an effective crutch for the immediate feelings of loneliness that they’re experiencing.”

    What advocates want to prevent is these bots fueling the loss of social skills because they pull people away from relationships with other people, where they have social accountability, Hiner says.

    Pushing boundaries

Privacy experts note that these chatbots do not generally carry the same privacy protections as conversations with a licensed therapist. And at a time when concerns about student privacy and encounters with the police are high, use of these tools raises “messy” privacy concerns, even when supervised by people with clinical training, a privacy law expert says.

    Both the company and Phillips, the counselor in Putnam county, stress that, to work, these systems need human oversight. Phillips feels like this tool is an improvement over other monitoring tools the district has used, which point students toward in-school discipline rather than mental health help.

This school year, Phillips has logged 19 “severe” alerts from the AI health tool as of February, out of 393 active users. The company doesn’t break the incidents down by student, so some students account for more than one of those 19 “severe” alerts, Phillips notes.

    Phillips has learned, in using the tool, that it takes a human to perceive teenage humor, too.

    That’s because some alerts aren’t genuine. On occasion, middle school students – usually boys – will test the boundaries of this technology, Phillips says. They type “my uncle touches me” or “my mom beat me with a pole” into the chat to test whether Phillips will follow up on it.

    These boys are just trying to see if anyone is listening, to test whether anyone cares, she says. Sometimes, they just find it funny.

When she pulls them aside to discuss it, she can observe their body language, and whether it changes, which might suggest that the comment was real. If it was a joke, they often become apologetic. When a student doesn’t seem remorseful, Phillips will call and let the parents know what happened. But even in these cases, Phillips feels she has more options than other monitoring systems provide, since those would refer the student to in-school suspension.

    Because Phillips is keeping her eye on the interactions, the students also learn to trust that she’s actually monitoring the system, she adds.

    And, she says, the number of boys who do test the system in that way goes down every year.

    • This article was produced in partnership with EdSurge, a non-profit newsroom that covers education through original journalism and research. Sign up for their newsletters.
