I came across this video from Skylark Labs today: automated identification of suspicious activity in a crowd via drone. Of course, neither the drone nor the system is really thinking: it's simply pattern-matching against an existing corpus of data to reach conclusions. We already know that machine learning algorithms are often biased against Black people; there is nothing to suggest that anything will be different here.
But even if these systems were completely accurate, their presence should be unwelcome to all of us.
As granular, algorithmic surveillance becomes more popular, it's going to become more dangerous to act in ways that sit outside the expectations of dumb machine learning algorithms. You'll attract more scrutiny from operators we can't expect to show nuance or compassion.
Even if you trust the administrators of algorithmic surveillance to be just — which, given the track record of law enforcement as we know it, is a stretch — every person should have the right to an unobserved life. Without freedom from surveillance, we are not free.
People who build surveillance technologies for any purpose — whether law enforcement or advertising — are complicit in building tyranny. In 2020, we've got to get serious about forcing technology to protect our freedoms through technical, social, and legislative means.