I know "don't be evil" is from another era of Google, but still, this rankles:
"Google removed a pledge to not build AI for weapons or surveillance from its website this week. The change was first spotted by Bloomberg. The company appears to have updated its public AI principles page, erasing a section titled “applications we will not pursue,” which was still included as recently as last week."
This dovetails with a piece from earlier this year about how AI is speeding up the military's kill chain:
"The “kill chain” refers to the military’s process of identifying, tracking, and eliminating threats, involving a complex system of sensors, platforms, and weapons. Generative AI is proving helpful during the planning and strategizing phases of the kill chain, according to [the Pentagon's Chief Digital and AI Officer]."
So AI might not be used to pull the trigger, but it is being used to decide who ends up in the crosshairs. All our concerns about AI hallucinations, and particularly about bias baked into training data and therefore into outcomes, apply here.
[Link]