The Newest Jim Crow

Michelle Alexander: Under new policies in California, New Jersey, New York and beyond, “risk assessment” algorithms recommend to judges whether a person who’s been arrested should be released. These advanced mathematical models — or “weapons of math destruction,” as data scientist Cathy O’Neil calls them — appear colorblind on the surface, but they are based on factors that are not only highly correlated with race and class but also significantly influenced by pervasive bias in the criminal justice system.

As O’Neil explains, “It’s tempting to believe that computers will be neutral and objective, but algorithms are nothing more than opinions embedded in mathematics.”

Challenging these biased algorithms may be more difficult than challenging discrimination by police, prosecutors and judges. Many algorithms are fiercely guarded corporate secrets. Even those that are transparent — where you can actually read the code — lack a public audit, so it’s impossible to know how much more often they fail for people of color.
