Conformal Classification with Equalized Coverage for Adaptively Selected Groups

Published as an arXiv preprint, 2024

Abstract

This paper introduces a conformal inference method for evaluating uncertainty in classification by generating prediction sets with valid coverage conditional on adaptively chosen features. These features are carefully selected to reflect potential model limitations or biases. This can be useful for striking a practical compromise between efficiency (providing informative predictions) and algorithmic fairness (ensuring equalized coverage for the most sensitive groups). We demonstrate the validity and effectiveness of this method on simulated and real data sets.
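
For intuition, below is a minimal sketch of group-conditional split conformal classification, where a separate calibration threshold is computed per group so that coverage holds within each group. This is not the paper's method: the adaptive selection of groups/features described in the abstract is omitted, the group labels, model, data, and function names here are illustrative assumptions only.

```python
# Hedged sketch: group-conditional split conformal classification with
# pre-specified (not adaptively selected) groups. All names and data are
# illustrative, not from the paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic data: two features, three classes, and a hypothetical binary group.
n = 3000
X = rng.normal(size=(n, 2))
group = (X[:, 0] > 0).astype(int)                     # assumed sensitive attribute
y = rng.integers(0, 3, size=n)
y[group == 1] = (X[group == 1, 1] > 0).astype(int)    # groups differ in signal

X_train, X_cal, y_train, y_cal, g_train, g_cal = train_test_split(
    X, y, group, test_size=0.5, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

alpha = 0.1  # target miscoverage level

# Conformity score: 1 minus the estimated probability of the true class.
cal_probs = model.predict_proba(X_cal)
cal_scores = 1.0 - cal_probs[np.arange(len(y_cal)), y_cal]

# Calibrate one threshold per group so that coverage holds within each group.
thresholds = {}
for g in np.unique(g_cal):
    s = np.sort(cal_scores[g_cal == g])
    k = int(np.ceil((len(s) + 1) * (1 - alpha)))
    thresholds[g] = s[min(k, len(s)) - 1]

def prediction_set(x, g):
    """Return the label set for one test point x assumed to belong to group g."""
    probs = model.predict_proba(x.reshape(1, -1))[0]
    return np.where(1.0 - probs <= thresholds[g])[0]

# Example: prediction set for a new point assumed to belong to group 0.
print(prediction_set(np.array([0.5, -1.2]), 0))
```

Calibrating per group trades some efficiency (prediction sets can grow for small groups) for the equalized-coverage guarantee; the paper's contribution is to choose the conditioning groups adaptively rather than fixing them in advance as done here.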

Download paper here