That an AI model trained on Google Street View photos can look at a picture and figure out where it was taken isn't much of a surprise, but it's still jarring to see that the capability has actually arrived.
I think the real lesson is that AI undermines security through obscurity, which any security professional will tell you is not a sound approach. It's not enough to assume that information is obscure enough to be unusable; if you want something to remain private, you need to actually secure it.
This has obvious implications for pictures of vulnerable people (children, for example) on social media. But, of course, you can extrapolate: public social media posts could probably be mined for identifying details too, regardless of the medium. All of it could be used for identity theft or to cause other harm.
A human probably isn't going to painstakingly comb through your posts to piece together information about you. But if it can be done in one click with a software agent, suddenly we're playing a whole different ball game. #AI
[Link]
I’m writing about the intersection of the internet, media, and society. Sign up to my newsletter to receive every post and a weekly digest of the most important stories from around the web.