Saturday, March 7, 2026
Earth-News

Study reveals why AI models that analyze medical images can be biased

July 1, 2024
in Science

Artificial intelligence models often play a role in medical diagnoses, especially when it comes to analyzing images such as X-rays. However, studies have found that these models don’t always perform well across all demographic groups, usually faring worse on women and people of color.

These models have also been shown to develop some surprising abilities. In 2022, MIT researchers reported that AI models can make accurate predictions about a patient’s race from their chest X-rays — something that the most skilled radiologists can’t do.

That research team has now found that the models that are most accurate at making demographic predictions also show the biggest “fairness gaps” — that is, discrepancies in their ability to accurately diagnose images of people of different races or genders. The findings suggest that these models may be using “demographic shortcuts” when making their diagnostic evaluations, which lead to incorrect results for women, Black people, and other groups, the researchers say.

“It’s well-established that high-capacity machine-learning models are good predictors of human demographics such as self-reported race or sex or age. This paper re-demonstrates that capacity, and then links that capacity to the lack of performance across different groups, which has never been done,” says Marzyeh Ghassemi, an MIT associate professor of electrical engineering and computer science, a member of MIT’s Institute for Medical Engineering and Science, and the senior author of the study.

The researchers also found that they could retrain the models in a way that improves their fairness. However, their approaches to “debiasing” worked best when the models were tested on the same types of patients they were trained on, such as patients from the same hospital. When these models were applied to patients from different hospitals, the fairness gaps reappeared.

“I think the main takeaways are, first, you should thoroughly evaluate any external models on your own data because any fairness guarantees that model developers provide on their training data may not transfer to your population. Second, whenever sufficient data is available, you should train models on your own data,” says Haoran Zhang, an MIT graduate student and one of the lead authors of the new paper. MIT graduate student Yuzhe Yang is also a lead author of the paper, which will appear in Nature Medicine. Judy Gichoya, an associate professor of radiology and imaging sciences at Emory University School of Medicine, and Dina Katabi, the Thuan and Nicole Pham Professor of Electrical Engineering and Computer Science at MIT, are also authors of the paper.

Removing bias

As of May 2024, the FDA has approved 882 AI-enabled medical devices, with 671 of them designed to be used in radiology. Since 2022, when Ghassemi and her colleagues showed that these diagnostic models can accurately predict race, they and other researchers have shown that such models are also very good at predicting gender and age, even though the models are not trained on those tasks.

“Many popular machine learning models have superhuman demographic prediction capacity — radiologists cannot detect self-reported race from a chest X-ray,” Ghassemi says. “These are models that are good at predicting disease, but during training are learning to predict other things that may not be desirable.” In this study, the researchers set out to explore why these models don’t work as well for certain groups. In particular, they wanted to see if the models were using demographic shortcuts to make predictions that ended up being less accurate for some groups. These shortcuts can arise in AI models when they use demographic attributes to determine whether a medical condition is present, instead of relying on other features of the images.

Using publicly available chest X-ray datasets from Beth Israel Deaconess Medical Center in Boston, the researchers trained models to predict whether patients had one of three different medical conditions: fluid buildup in the lungs, collapsed lung, or enlargement of the heart. Then, they tested the models on X-rays that were held out from the training data.

Overall, the models performed well, but most of them displayed “fairness gaps” — that is, discrepancies between accuracy rates for men and women, and for white and Black patients.

The models were also able to predict the gender, race, and age of the X-ray subjects. Additionally, there was a significant correlation between each model’s accuracy in making demographic predictions and the size of its fairness gap. This suggests that the models may be using demographic categorizations as a shortcut to make their disease predictions.
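As a rough illustration of how such a gap can be measured (the function and data below are hypothetical examples, not taken from the paper), a “fairness gap” can be computed as the difference in accuracy between two demographic subgroups:

```python
def accuracy(preds, labels):
    """Fraction of predictions that match the labels."""
    correct = sum(p == y for p, y in zip(preds, labels))
    return correct / len(labels)

def fairness_gap(preds, labels, groups, group_a, group_b):
    """Absolute accuracy difference between two demographic subgroups."""
    a = [(p, y) for p, y, g in zip(preds, labels, groups) if g == group_a]
    b = [(p, y) for p, y, g in zip(preds, labels, groups) if g == group_b]
    acc_a = accuracy([p for p, _ in a], [y for _, y in a])
    acc_b = accuracy([p for p, _ in b], [y for _, y in b])
    return abs(acc_a - acc_b)
```

A model with high overall accuracy can still show a large gap if its errors are concentrated in one subgroup, which is exactly the pattern the study reports.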

The researchers then tried to reduce the fairness gaps using two types of strategies. For one set of models, they trained them to optimize “subgroup robustness,” meaning that the models are rewarded for having better performance on the subgroup for which they have the worst performance, and penalized if their error rate for one group is higher than the others.
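One common way to implement this kind of objective is a group-DRO-style loss that optimizes the average loss of the worst-performing subgroup rather than the overall mean. The sketch below is a minimal illustration of that general idea under those assumptions, not the paper's actual training code:

```python
def worst_group_loss(losses, groups):
    """Return the mean loss of the worst-performing subgroup.

    Minimizing this quantity (instead of the overall mean loss)
    pushes the model to improve on whichever group it currently
    handles worst, the core idea behind subgroup-robust training.
    """
    by_group = {}
    for loss, g in zip(losses, groups):
        by_group.setdefault(g, []).append(loss)
    group_means = {g: sum(v) / len(v) for g, v in by_group.items()}
    return max(group_means.values())
```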

In another set of models, the researchers forced them to remove any demographic information from the images, using “group adversarial” approaches. Both of these strategies worked fairly well, the researchers found.
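Group-adversarial training is often formulated with a gradient-reversal-style objective: the main model minimizes its diagnostic loss while maximizing an adversary's loss at predicting the demographic group from the model's features. The one-liner below is a schematic of that combined objective (`lam` is a hypothetical trade-off weight), not the study's implementation:

```python
def adversarial_debiasing_loss(task_loss, adversary_loss, lam=1.0):
    """Combined objective sketch for group-adversarial training.

    Minimizing this pushes the task loss down while pushing the
    adversary's group-prediction loss up, encouraging features
    that carry little demographic information.
    """
    return task_loss - lam * adversary_loss
```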

“For in-distribution data, you can use existing state-of-the-art methods to reduce fairness gaps without making significant trade-offs in overall performance,” Ghassemi says. “Subgroup robustness methods force models to be sensitive to mispredicting a specific group, and group adversarial methods try to remove group information completely.”

Not always fairer

However, those approaches only worked when the models were tested on data from the same types of patients that they were trained on — for example, only patients from the Beth Israel Deaconess Medical Center dataset.

When the researchers tested the models that had been “debiased” using the BIDMC data to analyze patients from five other hospital datasets, they found that the models’ overall accuracy remained high, but some of them exhibited large fairness gaps.

“If you debias the model in one set of patients, that fairness does not necessarily hold as you move to a new set of patients from a different hospital in a different location,” Zhang says.

This is worrisome because in many cases, hospitals use models that have been developed on data from other hospitals, especially in cases where an off-the-shelf model is purchased, the researchers say.

“We found that even state-of-the-art models which are optimally performant in data similar to their training sets are not optimal — that is, they do not make the best trade-off between overall and subgroup performance — in novel settings,” Ghassemi says. “Unfortunately, this is actually how a model is likely to be deployed. Most models are trained and validated with data from one hospital, or one source, and then deployed widely.”

The researchers found that the models debiased using group adversarial approaches showed slightly more fairness when tested on new patient groups than those debiased with subgroup robustness methods. They now plan to develop and test additional methods to see if they can create models that do a better job of making fair predictions on new datasets.

The findings suggest that hospitals that use these types of AI models should evaluate them on their own patient population before beginning to use them, to make sure they aren’t giving inaccurate results for certain groups.

The research was funded by a Google Research Scholar Award, the Robert Wood Johnson Foundation Harold Amos Medical Faculty Development Program, RSNA Health Disparities, the Lacuna Fund, the Gordon and Betty Moore Foundation, the National Institute of Biomedical Imaging and Bioengineering, and the National Heart, Lung, and Blood Institute.

Copyright for syndicated content belongs to the linked source: Science Daily – https://www.sciencedaily.com/releases/2024/06/240628125210.htm

Tags: Reveals, science, study
© 2023 earth-news.info