    Naija Global News |
    Thursday, March 19
    Humans in the Loop and Education Don’t Really Mix

By onlyplanz_80y6mt · March 19, 2026 · 5 Mins Read

    Whenever I hear the phrase “human in the loop” as a desirable or best practice in reference to AI and education, I think of Homer Simpson.

    As fans of The Simpsons know, Homer Simpson is both an idiot and a technician at Springfield’s nuclear power plant. He is literally the human in the loop for plant safety, meant to monitor processes that are mostly automated.

    In one classic episode, Homer spills jelly from a doughnut on a temperature gauge meant to signal impending meltdown, obscuring the reading and allowing the levels to reach a crisis point before an alarm forces Homer to act. Unfortunately, because he is an idiot who was not paying attention in his training, he has no idea which button to push. Fortunately, the round of eeny, meeny, miny, moe he deploys in order to make a choice lands on the proper button. Homer becomes a hero in town for averting a meltdown.

    The need for humans in the loop when automated systems are doing the bulk of the work is obvious. When the automation breaks, we need human judgment to set things right. The challenge for the humans in the loop is to make sure you understand the loop (Homer’s failure) and to maintain sufficient attention over the automated loop to detect when intervention is necessary (also Homer’s failure).

    Autopilot on planes is an obvious example of a human-in-the-loop system that seems to work. In this particular case, the human pilots are literally trained to maintain vigilance over these systems, and the systems are designed to require active input before changing something like heading or altitude.

    But there are other human-in-the-loop systems where the human is not trained to practice vigilance and where the use of automation over time lulls the human into inattention because the automation appears to work so well—until it suddenly doesn’t.

    A recent article in The Atlantic by Raffi Krikorian, the former head of the self-driving car division at Uber, illustrates this issue. Krikorian says, “My Tesla was driving itself perfectly—until it crashed.”

    While driving his son to a Boy Scouts meeting on a route he’d taken “hundreds of times,” Krikorian suddenly felt himself experiencing the aftermath of a crash—airbag deployed, glasses askew—but thankfully, everyone in the car intact. He’d been using self-driving mode as a matter of “habit” without issue, right up until the car was totaled. He notes that cars in self-driving mode go millions of miles between accidents, but “that’s the problem.”

    “We are asking humans to supervise systems designed to make supervision feel pointless. A machine that constantly fails keeps you sharp. A machine that works perfectly needs no oversight. But a machine that works almost perfectly? That’s where the danger lies.”

    I have been thinking recently that much of what is being talked about as “humans in the loop” in education is maybe, possibly, quite probably not a thing. It is a way to dodge the more immediate and necessary conversations about the nature of automation and human responses inside automated systems while maintaining a fig leaf of concern for humans working in those systems.

    In an example close to my personal expertise, I consider automated grading of student writing, where a human in the loop is maintained as a way to “check” the automated AI outputs. In theory, this maintains human agency and judgment over the process, but does it?
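    The check pattern being described can be made concrete with a minimal sketch. Everything here is hypothetical—the `model_grade` stand-in, the plausibility thresholds, the `attention` parameter—but it captures the structural worry: the human reviewer only changes the outcome when they are both paying attention and the automated score looks implausible, so as the model's outputs come to look routinely plausible, the "check" degrades into a pass-through.

```python
from dataclasses import dataclass

@dataclass
class Grade:
    score: int    # 0-100
    comment: str
    source: str   # "model" or "human"

def model_grade(essay: str) -> Grade:
    # Hypothetical stand-in for an LLM grader: scores by word count alone,
    # a deliberately crude proxy so the sketch stays self-contained.
    score = min(100, len(essay.split()))
    return Grade(score=score, comment="auto-generated feedback", source="model")

def human_check(proposed: Grade, attention: float) -> Grade:
    # The "human in the loop": intervenes only when attentive AND the
    # proposed score looks implausible. Otherwise the model's judgment
    # passes through untouched -- the rubber-stamp failure mode.
    if attention > 0.5 and (proposed.score < 20 or proposed.score > 95):
        return Grade(score=70, comment="re-read by instructor", source="human")
    return proposed
```

    Run against a ten-word essay, an attentive reviewer (attention 0.9) overrides the implausibly low score, while an inattentive one (attention 0.1) lets it stand—same machinery, different outcomes, with nothing in the system itself sustaining the attention it depends on.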

    The way that an LLM responds to a piece of writing and issues a grade or comment is fundamentally different from what a human does when they read a piece of writing, even when those judgments may be similar in terms of their outputs.

    Does this matter? I think so. I think it means that we are not talking about a system with a human in the loop, but a system with two different loops that occasionally intersect. Unlike autopilot or self-driving cars, the automation and the human are not traversing the same paths to get to the destination.

    The way to close the gaps between the human and the automated loop is to constrain the acceptable outcomes as much as possible. We don’t want our self-driving cars to suddenly decide that we should drive across the country when we’re just trying to get to the store.

    But education does not—or at least should not—work this way. There must always be some aspect of self-determination to our work for both student and instructor. For sure, the system prior to the arrival of generative AI has leaned against this notion, particularly in writing instruction, as we’ve been asked to lean into rubrics and other quasi-quantifiable outcomes.

    But the attempts at quantification squeeze out the kinds of experiences and struggle that are most meaningful. The best favor I ever did for my students was to ditch my rather elaborate rubrics. I was trying to put them on a track so they could drive to the proper destination (grade), but by doing so I was denying them the very things they needed to develop as writers and thinkers—the freedom to range.

    I suppose it is possible that AI automation will prove useful in helping college faculty do their work more efficiently, but I think it is most likely that this help will be in areas where we can allow the automation to work … autonomously. Where we believe humans should be in the loop, I think deep consideration of what we’re trying to achieve will reveal that humans are the loop, or that perhaps learning is not a loop at all, but is instead many loops—and swirls and curlicues and other scribbles that may not be wholly quantifiable but still add up to something meaningful.

    Introducing automation to student-produced products before they’ve developed the necessary judgment for evaluation or practice in maintaining vigilance looks to me like a steady slide to disempowerment and disengagement.

    I hear claims that we need to get students working with AI so they’re prepared for the future, but how sure are we that we’re not turning them into a generation of Homer Simpsons?
