
These Wrongly Arrested Black Men Say a California Bill Would Let Police Misuse Face Recognition

[The Markup]

"Now all three men are speaking out against pending California legislation that would make it illegal for police to use face recognition technology as the sole reason for a search or arrest. Instead it would require corroborating indicators."

Even with mitigations, it will lead to wrongful arrests: so-called "corroborating indicators" don't address the fact that the technology is racially biased and unreliable, and may in fact provide cover for continuing to use it.

And the stories of how this technology has been used describe intensely bad miscarriages of justice:

“Other than a photo lineup, the detective did no other investigation. So it’s easy to say that it’s the officer’s fault, that he did a poor job or no investigation. But he relied on (face recognition), believing it must be right. That’s the automation bias that has been referenced in these sessions.”

"Believing it must be right" is one of core social problems widespread AI is introducing. Many people think of computers as being coldly logical deterministic thinkers. Instead, there's always the underlying biases of the people who built the systems and, in the case of AI, in the vast amounts of public data used to train them. False positives are bad in any scenario; in law enforcement, it can destroy or even end lives.

[Link]
