Crime Prediction Software Promised to Be Free of Biases. New Data Shows It Perpetuates Them

"Millions of crime predictions left on an unsecured server show PredPol mostly avoided Whiter neighborhoods, targeted Black and Latino neighborhoods. [...] 'No one has done the work you guys are doing, which is looking at the data,' said Andrew Ferguson, a law professor at American University who is a national expert on predictive policing. 'This isn't a continuation of research. This is actually the first time anyone has done this, which is striking because people have been paying hundreds of thousands of dollars for this technology for a decade.'"

[Link]

· Links · Share this post

Email me: ben@werd.io

Signal me: benwerd.01

Werd I/O © Ben Werdmuller. The text (without images) of this site is licensed under CC BY-NC-SA 4.0.