Generations

I'm starting to see a bunch of startups that offer to speed you up by completing your work using GPT-3. It's a hell of a promise: start writing something and the robot will finish it off for you. All you've got to do is sand off the edges.

Imagine if the majority of content were made like this. You supply a few keywords and the topic, and a machine learning model finishes the work, drawing on what it has learned from its training corpus, which happens to be the sum total of output on the web. When most content is made by machine learning, the robot is learning from other robots; rinse and repeat until, statistically speaking, almost all content derives from robot output: a photocopy of a photocopy of a photocopy of human thought and emotion.
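
The loop is easy to caricature in code. Here's a toy sketch, nothing like how GPT-3 actually works, in which each generation's "model" is just the mean and spread of the previous generation's output, and each generation's "content" is samples drawn from that model:

```python
import random
import statistics

def train(corpus):
    """'Learn' a model of the corpus: here, just its mean and spread."""
    return statistics.mean(corpus), statistics.stdev(corpus)

def generate(model, n):
    """Sample n pieces of new 'content' from the learned model."""
    mu, sigma = model
    return [random.gauss(mu, sigma) for _ in range(n)]

# Generation 0: "human" output, with real variety in it.
corpus = [random.gauss(0.0, 1.0) for _ in range(50)]

for gen in range(1, 31):
    model = train(corpus)          # robot learns from the previous generation
    corpus = generate(model, 50)   # ...and its output becomes the next corpus
    print(f"generation {gen:2d}: spread = {model[1]:.3f}")
```

Because each generation estimates its model from a finite sample, the errors compound: run it and the measured spread tends to drift away from the original, usually decaying toward zero over the generations. That's one statistical version of the fading photocopy.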

Would it be gibberish? I'd like to think so. I'd like to assume that it would lose all sense of meaning and the original topics would fade out, the way photocopies of photocopies degrade as the series goes on. But what if it's not? What if, as the human fades out, the content makes more sense, and new, more logical structures emerge from the biological static? Would we stop creating? Would we destroy the robots? Or would we see these things as separate and different, almost as if software had a culture of its own?

What if a robot learned how to be a human based on data gathered on the behavior of every connected human in the world? That data exists; it's just not centralized yet. What if, then, we started to build artificial humans whose behaviors were based on that machine learning corpus? Eventually, when the artificial humans vastly outnumber natural humans, and new artificial humans are learning to be human from older artificial humans, what behaviors would emerge? How would they change across the generations? Would they devolve into gibberish, or turn into something new?

What if we were all cyborgs, a combination of robot and human? Imagine if we had access to the sum total of all human knowledge virtually any time we wanted, and the form of that access changed the way we behaved. And then new humans would learn to be human from the cyborgs, and become cyborgs themselves, using hardware and software designed by other cyborgs, which in turn would change their behavior even more. What does that look like after generations? Does it devolve into gibberish? Or does it turn into something completely new?
