An analysis of bias against black defendants in criminal risk scores has prompted research showing that the disparity can be addressed — if the algorithms focus on the fairness of outcomes.
Houston, we have a (mathematical) problem:
The scholars set out to address this question: Since blacks are rearrested more often than whites, is it possible to create a formula that is equally predictive for all races without disparities in who suffers the harm of incorrect predictions?
…they realized that the problem was not resolvable. A risk score, they found, could be either equally predictive or equally wrong for all races, but not both.
The reason was the difference in the frequency with which blacks and whites were charged with new crimes. “If you have two populations that have unequal base rates,” Kleinberg said, “then you can’t satisfy both definitions of fairness at the same time.”
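The arithmetic behind Kleinberg's point can be sketched with made-up numbers (these are illustrative, not real rearrest statistics). If we hold the score equally predictive for both groups (same positive predictive value) and force the same false-negative rate, the false-positive rate each group must suffer is fully determined by that group's base rate, so unequal base rates force unequal harm:

```python
def implied_fpr(base_rate, ppv, fnr):
    """False-positive rate forced by a group's base rate when PPV and FNR
    are held equal across groups.

    Derivation: with base rate p and true-positive rate 1 - FNR,
      TP = p * (1 - FNR)
      PPV = TP / (TP + FP)  =>  FP = TP * (1 - PPV) / PPV
      FPR = FP / (1 - p)
    """
    tp = base_rate * (1.0 - fnr)
    fp = tp * (1.0 - ppv) / ppv
    return fp / (1.0 - base_rate)

# Hypothetical base rates for two groups (assumed values for illustration):
fpr_high = implied_fpr(base_rate=0.5, ppv=0.7, fnr=0.3)  # 0.30
fpr_low = implied_fpr(base_rate=0.3, ppv=0.7, fnr=0.3)   # ~0.129
```

Even though the score is equally predictive for both groups, the higher-base-rate group ends up wrongly flagged more than twice as often, which is exactly the pattern the ProPublica analysis reported.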
The currently used formula inaccurately identifies black defendants as future criminals more frequently than white defendants – reinforcing existing inequalities.