April 2025
Proceedings of the AAAI Conference on Artificial Intelligence
Formal XAI is an emerging field that focuses on providing explanations with mathematical guarantees for the decisions made by machine learning models. A significant amount of work in this area is centered on the computation of "sufficient reasons". Given a model M and an input instance x, a sufficient reason for the decision of M on x is a subset S of the features of x such that, for any instance z that has the same values as x for every feature in S, it holds that M(x) = M(z). Intuitively, this means that the features in S are sufficient to fully justify the classification of x by M. For sufficient reasons to be useful in practice, they should be as small as possible, and a natural way to reduce their size is to consider a probabilistic relaxation: the probability that M(x) = M(z) must be at least some value delta in (0,1], where z is a random instance that coincides with x on the features in S. Computing small delta-sufficient reasons (delta-SRs) is known to be a theoretically hard problem; even over decision trees, traditionally deemed simple and interpretable models, strong inapproximability results make the efficient computation of small delta-SRs unlikely. We propose the notion of (delta, epsilon)-SR, a simple relaxation of delta-SRs, and show that this kind of explanation can be computed efficiently over linear models.
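To make the delta-SR definition concrete, the sketch below estimates, by Monte Carlo sampling, the probability that a linear classifier keeps its decision when the features in a candidate subset S are held fixed at x's values. It is an illustration of the definition only, not the paper's method: the binary feature domain, the uniform distribution over the free features, and the specific weights and instance are assumptions made for this example.

```python
import numpy as np

def predict(w, b, z):
    """Linear model M: returns 1 iff w . z + b >= 0."""
    return int(np.dot(w, z) + b >= 0)

def estimate_delta(w, b, x, S, n_samples=10_000, seed=0):
    """Estimate Pr[M(z) = M(x)] where z agrees with x on the features in S
    and the remaining features are drawn uniformly from {0, 1} (assumed)."""
    rng = np.random.default_rng(seed)
    y_x = predict(w, b, x)
    free = [i for i in range(len(x)) if i not in S]
    agree = 0
    for _ in range(n_samples):
        z = x.copy()
        z[free] = rng.integers(0, 2, size=len(free))
        agree += predict(w, b, z) == y_x
    return agree / n_samples

# Hypothetical example: 4 binary features, candidate subset S = {0, 1}.
w = np.array([2.0, 1.5, -0.5, 0.3])
b = -1.0
x = np.array([1, 1, 0, 1])
print(estimate_delta(w, b, x, S={0, 1}))  # ~1.0, so S is a delta-SR for any delta <= 1
```

In this toy instance, fixing features 0 and 1 already guarantees w . z + b >= 2.0 regardless of the free features, so the estimate is 1.0 and S is in fact a (non-probabilistic) sufficient reason; smaller subsets would typically only reach some delta < 1.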