Racist Algorithms and Fair Sentencing

16 March 2018

Can an algorithm be racist? ProPublica has argued as much against an algorithm used in bail and sentencing decisions. The algorithm assesses the risk that an individual will reoffend. Its creators defend it, saying that black and white individuals with similar risk scores have similar chances of reoffending.

But critics object that, looking just at individuals deemed high risk who didn't end up reoffending, far more black individuals were incorrectly assessed as high risk in this way. (These are complicated points, and the article below should clarify them.) Can both of these claims in fact be true?

The following Washington Post article argues in the affirmative. The article points out that the creators and critics may disagree more fundamentally about what the relevant standard of fairness is.
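To see how both claims can hold at once, here is a small numerical sketch (the numbers are hypothetical, not the actual COMPAS data). A risk score can be calibrated within each group, meaning people with the same score reoffend at the same rate regardless of group, while the false-positive rate among non-reoffenders still differs, simply because the groups' score distributions differ.

```python
# Two hypothetical groups, differing only in how many people
# receive a high vs. low risk score.
groups = {
    "A": {"high": 60, "low": 40},   # more people scored high risk
    "B": {"high": 20, "low": 80},   # fewer people scored high risk
}

# Calibration: P(reoffend | score) is identical across groups,
# which is the creators' defense of the algorithm.
p_reoffend = {"high": 0.8, "low": 0.2}

for name, counts in groups.items():
    # Non-reoffenders wrongly labeled high risk:
    false_pos = counts["high"] * (1 - p_reoffend["high"])
    # All non-reoffenders in the group:
    negatives = sum(n * (1 - p_reoffend[s]) for s, n in counts.items())
    print(name, round(false_pos / negatives, 3))

# Group A's false-positive rate (12/44 ≈ 0.273) is far higher than
# Group B's (4/68 ≈ 0.059) -- the critics' complaint -- even though
# the score is perfectly calibrated in both groups.
```

The point of the sketch: neither side's numbers are wrong; they are simply measuring fairness with different yardsticks, and with unequal base rates those yardsticks cannot all agree.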

Check out the article here: https://www.washingtonpost.com/news/monkey-cage/wp/2016/10/17/can-an-algorithm-be-racist-our-analysis-is-more-cautious-than-propublicas/

Comments (1)


Harold G. Neuman

Saturday, March 17, 2018 -- 11:12 AM

I think the notion of predictive algorithms is, in this instance, flawed, owing to the apparent bias against offenders of color versus white offenders. A flawed notion, on this point, may not amount to racism, but as my attorney friends would point out: it gives the appearance of doing so, which is to say the outcome is effectively the same. This further reminds me of a gestalt puzzle: the so-called duck-rabbit. When you look at the drawing, it presents at one moment as a duck, at another as a rabbit. That is, things are not always as they appear. Now, the illusion can be largely reduced by enlarging the eye in the picture, so that the rabbit aspect dominates. But that, too, is an aspect of illusion. I marvel at the efforts people make to level the playing field, or to make some part of a system fairer, only, in the end, to reinforce the very unfairness they had hoped to eliminate. Interesting, don't you think?