In many areas of the United States, people with higher incomes live up to a decade longer, on average, than people with the lowest incomes. The COVID-19 pandemic exacerbated these disparities, hitting low-income and underserved populations especially hard. Biased medical decision-making contributes to health disparities. For example, research has found that a widely used algorithm for predicting health risk tends to assess African American patients as less sick than equivalently sick white patients.
With this CAREER award, Emma Pierson, Computer Science/Population Health Sciences, is identifying sources of bias in medical decision-making and proposing solutions that will make health care more equitable. This project will look for bias among human decision-makers in three high-stakes medical settings: allocation of medical testing, health care quality assessment, and the interpretation of diagnostic images, such as X-rays and CT scans. Researchers will seek to reduce human bias by building algorithmic decision-making aids that call attention to medically relevant features of a patient's condition that a clinician might otherwise overlook. To support more equitable medical decision-making, researchers will also assess which features of a patient and a patient's condition are appropriate for an algorithm to consider.
This research will develop methods for improving health equity and allocating medical resources where they will be most effective. By drawing on techniques in Bayesian inference and deep learning, this project will provide interpretable models of how bias arises. The results will be applicable to lending, hiring, and other high-stakes domains, and may inform other fields concerned with equity in decision-making, such as law and economics.
Funding received: $531,000 over five years, sponsored by the National Science Foundation.