    We must not let AI ‘pull the doctor out of the visit’ for low-income patients | Leah Goodridge and Oni Blackstock

January 27, 2026
    ‘Given the barriers that people who are unhoused and have low incomes face, it is crucial they receive patient-centered care ... ’ Photograph: Chris Rout/Alamy

In southern California, where rates of homelessness are among the highest in the nation, a private company, Akido Labs, is running clinics for unhoused patients and others with low incomes. The catch? Patients are seen by medical assistants who use artificial intelligence (AI) that listens to the conversation and spits out potential diagnoses and treatment plans, which a doctor then reviews. The company’s goal, its chief technology officer told the MIT Technology Review, is to “pull the doctor out of the visit”.

This is dangerous. Yet it is part of a larger trend in which generative AI is being pushed on medical professionals across healthcare. In 2025, a survey by the American Medical Association found that two out of three physicians used AI to assist with their daily work, including diagnosing patients. One AI startup raised $200m to provide medical professionals with an app dubbed “ChatGPT for doctors”. US lawmakers are considering a bill that would allow AI to prescribe medication. While this trend affects almost all patients, it falls hardest on people with low incomes, who already face substantial barriers to care and higher rates of mistreatment in healthcare settings. People who are unhoused and have low incomes should not be testing grounds for AI in healthcare. Instead, their voices and priorities should determine if, how, and when AI is implemented in their care.

The rise of AI in healthcare didn’t happen in a vacuum. Overcrowded hospitals, overworked clinicians and relentless pressure on medical offices to run seamlessly, shuttling patients in and out of a large for-profit healthcare system, set the conditions. The demands on healthcare workers are compounded in economically disadvantaged communities, where healthcare settings are often under-resourced and patients are more likely to be uninsured and to carry a greater burden of chronic health conditions due to racism and poverty.

Here is where someone might ask: “Isn’t something better than nothing?” Well, actually, no. Studies show that AI-enabled tools generate inaccurate diagnoses. A 2021 study in Nature Medicine examined AI algorithms trained on large chest X-ray datasets for medical imaging research and found that they systematically under-diagnosed Black and Latinx patients, patients recorded as female, and patients with Medicaid insurance. This systematic bias risks deepening health inequities for patients already facing barriers to care. Another study, published in 2024, found that AI misdiagnosed breast cancer screenings among Black patients: the odds of a false positive were greater for Black patients than for their white counterparts. Due to algorithmic bias, some clinical AI tools have notoriously performed worse for Black patients and other people of color. That’s because AI isn’t independently “thinking”; it relies on probabilities and pattern recognition, which can reinforce bias against already marginalized patients.

Some patients aren’t even informed that their health provider or healthcare system is using AI. A medical assistant told the MIT Technology Review that his patients know an AI system is listening, but he does not tell them that it makes diagnostic recommendations. This harks back to an era of exploitative medical racism in which Black people were experimented on without informed consent, and often against their will. Can AI help health providers by speedily giving them information that allows them to move on to the next patient? Possibly. But that speed may come at the expense of diagnostic accuracy, and it may worsen health inequities.

And the potential impact goes beyond diagnostic accuracy. TechTonic Justice, an advocacy group working to protect economically marginalized communities from the harms of AI, published a groundbreaking report estimating that 92 million Americans with low incomes “have some basic aspect of their lives decided by AI”. Those decisions range from how much Medicaid coverage they receive to whether they are eligible for the Social Security Administration’s disability insurance.

A real-life example is playing out in federal courts right now. In 2023, a group of Medicare Advantage customers sued UnitedHealthcare in Minnesota, alleging they were denied coverage because the company’s AI system, nH Predict, mistakenly deemed them ineligible. Some of the plaintiffs are the estates of Medicare Advantage customers; those patients allegedly died as a result of the denial of medically necessary care. UnitedHealth sought to dismiss the case, but in 2025 a judge ruled that the plaintiffs could move forward with some of their claims. A similar case was filed in federal court in Kentucky against Humana. There, Medicare Advantage customers alleged that Humana’s use of nH Predict “spits out generic recommendations based on incomplete and inadequate medical records”. That case is also ongoing: a judge ruled that the plaintiffs’ legal arguments were sufficient to survive the insurance company’s motion to dismiss. While the final decisions in these two cases remain pending, they point to a growing trend of AI being used to decide the health coverage of people with low incomes – and to its pitfalls. If you have financial resources, you can get quality healthcare. But if you are unhoused or have a low income, AI may bar you from accessing healthcare entirely. That’s medical classism.

We should not experiment on patients who are unhoused or have low incomes in order to roll out AI. The documented harms outweigh the potential, unproven benefits promised by startups and other tech ventures. Given the barriers that people who are unhoused and have low incomes face, it is crucial they receive patient-centered care from a human healthcare provider who listens to their health-related needs and priorities. We cannot normalize a health system in which practitioners take a back seat while AI – run by private companies – takes the lead. An AI system that “listens in” and is developed without rigorous evaluation by the communities themselves disempowers patients, stripping them of the authority to decide if, how, and when technologies such as AI are used in their care.

• Leah Goodridge is a lawyer who worked in homelessness prevention litigation for 12 years

• Oni Blackstock, MD, MHS, is a physician, founder and executive director of Health Justice, and a Public Voices Fellow on technology in the public interest with The OpEd Project
