    Education

    What to Know About AI and Campus Mental Health (opinion)

By Jessi Gold | April 3, 2026 | 8 min read

    I regularly meet with a group of students from across the state, representing all five campuses in the University of Tennessee system. I like to use these conversations for a pulse check to understand what’s on their minds and what they’re experiencing on campus in real time.

Recently, we talked about mental health and AI. Many students shared broad concerns about AI, such as ethical issues and fears about its environmental impact, but a few comments stood out in ways that genuinely surprised me.

    One student told me that ChatGPT was “better” than any therapist they had ever seen: more supportive, more validating and more comforting. Several students described friends who were in what they called “romantic relationships” with AI, something I’d previously assumed was just fodder for sensational headlines. They also estimated that 30 to 40 percent of their peers use AI for companionship—sometimes as their only source of companionship.

Taken together, and paired with reports about AI and suicidality, these comments left me increasingly concerned. Recent surveys show that the use of AI for mental health support is not rare and is in fact growing quickly. For example, one survey found that more than 13 percent of adolescents and young adults aged 12 to 21 have already used generative AI for mental health advice, with rates exceeding 22 percent among those aged 18 to 21. Most users also reported seeking advice regularly (monthly or more) and overwhelmingly found the advice somewhat or very helpful (92.7 percent).

    At the same time, research from Common Sense Media paints a troubling picture: Major chatbots routinely miss warning signs of mental health distress and foster misplaced trust, including through use of an empathetic tone. They prioritize engagement over safety, and safety guardrails were found to fail most dramatically in the kinds of extended conversations teens and young adults actually have.

To me, this conversation feels eerily familiar, echoing what we’ve witnessed with the evolution of social media and mental health. At first, we excitedly embraced the new technology. Only later, once harms became clearer, did we try to build guardrails, and not always successfully, as the recent jury verdicts against Meta underscore. We need to approach AI with more foresight.

    Nina Vasan, clinical assistant professor of psychiatry at Stanford University and founder and director of Brainstorm: The Stanford Lab for Mental Health Innovation, which focuses on the study of how technology shapes mental health and how to design it more responsibly, told me higher education can’t just ignore AI and pretend students aren’t using it. “That ship has sailed,” she said. “The question is whether we help them do it wisely. Silence from institutions does not stop behavior; it just removes guardrails. The faster an institution can figure out how to best use AI, the better for students and faculty.”

Here are some things to consider as colleges and universities work to better support our students and employees while navigating this evolving landscape of mental health and AI.

    • Understand it is not just a student problem; it’s a campuswide one. We like to believe it is only our students who are using AI, but AI use is pervasive among faculty and staff, too. Unlike therapy, it is always available (and often free!), and the increasing use of AI highlights gaps in our on-campus resources and knowledge of how to find and use them. As Vasan said, “Here’s the uncomfortable truth: Students often turn to AI precisely because campus resources feel inaccessible, whether due to wait lists or stigma. If we ignore AI, we’re ignoring why students are seeking alternatives in the first place.”
    • Know what AI can and can’t do for mental health and what its role should be. Just as we have with telehealth or mental health apps, members of the campus community need to understand what AI can and can’t do for mental health and talk openly about it. Vasan said AI is good for lower-severity mental health needs, like processing emotions or practicing hard conversations, and for general psychoeducation, like looking up what a panic attack is, but not for higher-risk symptoms. She said, “I tell students to think of AI like a study buddy, not a therapist. It can help you brainstorm, organize your thoughts, draft an email or rehearse a hard conversation. But when you’re in crisis, you need a human who can actually assess risk, prescribe medication or call your emergency contact.”

    John Torous, director of the digital psychiatry division at Beth Israel Deaconess Medical Center, equated AI to “very powerful self-help books.” Like those books, he said, AI “can deliver important and useful content, but just like with a self-help book, it will be more impactful if you apply and practice those skills/lessons in the real world.” He added that knowing the limits of self-help is important, too, as you wouldn’t rely on a book in an emergency.

    • Ask your students and colleagues about their use. We need to get comfortable asking and talking about AI and mental health. As Vasan said, “You don’t need to become an AI expert, but you do need to be curious enough to ask students what they’re using and why.” These conversations might even create new connections, as mine with my student group did.
    • Understand the potential warning signs of harmful AI use. Headlines warn of people in crisis using AI, and of something that has become known as “AI psychosis,” where users form emotional relationships with AI and can’t distinguish between human interaction and machine responses. Torous suggested that individuals monitor their use of AI and if they “ever note use harming real-world relationships (e.g., preferring AI to people) or getting in the way of health habits (e.g., up all night because of AI use), that is a good sign to reduce or stop.”

    Vasan added that language around replacement and avoidance is another warning sign. She said, “The biggest red flag is substitution—when AI becomes a replacement for human connection rather than a supplement to it. If a student says, ‘My AI is the only one who really gets me,’ that’s not a success story. That’s an isolation story.”

    • Universities should educate, train and prepare their communities on AI and mental health. The only way for universities to know their people understand the risks, benefits and role of AI in mental health is to provide that training themselves. There should be directed outreach, education and even professional development sessions on these topics. Vasan said, “We’ve trained RAs to spot eating disorders and recognize signs of alcohol misuse. We need the same basic fluency around AI and mental health.”

    Of course, this doesn’t mean we all suddenly become fluent in AI and machine learning, but we should know what questions to ask. “An hour [of training] is enough to move someone from ‘I don’t know what to say about this’ to ‘I know the right questions to ask and where to refer,’” Vasan said.

    • Be wary of sales pitches, but weigh opportunities to invest in new mental health tools. As higher education administrators, we are constantly bombarded with sales pitches, in person at conferences and over our LinkedIn direct messages. Torous said to be wary of these pitches and know that right now no AI systems claim to offer mental health care, despite marketing suggesting otherwise, and none are cleared by the Food and Drug Administration to offer it. He added, “There is no clear evidence that mental health–specific AI systems are better, or safer, than larger general AI models (e.g. Gemini, ChatGPT), so work to verify any claims. If it sounds too good to be true, it likely is.”

    Vasan said before any investment a university should ask for evidence like, “Has this tool been tested with vulnerable populations? What happens when a user is in crisis? Is there human backup? Is data truly private?”

    “Mental health AI that does not know when to escalate to humans is not support; it is a liability,” Vasan said. “Investment should focus on tools that connect students to care, not keep them talking to machines indefinitely.”

    • Where possible, universities should get in on the regulation conversations. In the midst of lawsuits, there are ongoing conversations at state and national levels about the regulation of AI, specifically for mental health use. Universities should advocate and participate in these conversations where they can, because they cannot keep pace acting as regulators themselves. As Vasan noted, “Universities are filling a vacuum. Because there’s no federal oversight of AI mental health tools, every campus is essentially running its own safety evaluation. That’s not sustainable.”

    In higher education, we can’t simply ignore the new, evolving and continuously growing use of AI for mental health purposes on our campuses. We should be wary of the risks, and educate about them often, but also be thoughtful about deploying AI in ways that integrate with our current offerings rather than simply trying to prevent its use. As Vasan told me, “AI isn’t inherently good or bad for mental health. It’s a mirror that reflects how we deploy it. If we’re thoughtful, we have an opportunity to extend support to students who would never walk into a counseling center. If we’re careless, we could deepen the very isolation we’re trying to solve.”

    Jessi Gold is the chief wellness officer for the University of Tennessee system and an associate professor of psychiatry at the University of Tennessee Health Science Center.
