| Field | Value |
|---|---|
| title | Auditing Fairness under Unobserved Confounding |
| software | |
| abstract | A fundamental problem in decision-making systems is the presence of inequity along demographic lines. However, inequity can be difficult to quantify, particularly if our notion of equity relies on hard-to-measure notions like risk (e.g., equal access to treatment for those who would die without it). Auditing such inequity requires accurate measurements of individual risk, which is difficult to estimate in the realistic setting of unobserved confounding. If these unobservables “explain” an apparent disparity, we may understate or overstate inequity. In this paper, we show that one can still give informative bounds on allocation rates among high-risk individuals, even when relaxing or (surprisingly) eliminating the assumption that all relevant risk factors are observed. We utilize the fact that in many real-world settings (e.g., the introduction of a novel treatment) we have data from a period prior to any allocation, and use these data to derive unbiased estimates of risk. We apply our framework to a real-world setting of Paxlovid allocation to COVID-19 patients, finding that observed racial inequity cannot be explained by unobserved confounders of the same strength as important observed covariates. |
| layout | inproceedings |
| series | Proceedings of Machine Learning Research |
| publisher | PMLR |
| issn | 2640-3498 |
| id | byun24a |
| month | 0 |
| tex_title | Auditing Fairness under Unobserved Confounding |
| firstpage | 4339 |
| lastpage | 4347 |
| page | 4339-4347 |
| order | 4339 |
| cycles | false |
| bibtex_author | Byun, Yewon and Sam, Dylan and Oberst, Michael and Lipton, Zachary and Wilder, Bryan |
| author | |
| date | 2024-04-18 |
| address | |
| container-title | Proceedings of The 27th International Conference on Artificial Intelligence and Statistics |
| volume | 238 |
| genre | inproceedings |
| issued | |
| extras | |