The central question of this era just might be: how do we co-exist with AI? AI presents manifold challenges to human identity. The prospect of an entity, or series of entities, that exceeds human intelligence, intuition, creativity, and even empathy in every possible way already feels like a singularity in 2024; it's bewildering to imagine what this could look like in 2034, let alone 2044. By comparison, 2004 seems like yesterday.
In many ways, we already live in a singularity of sorts, a point at which we can neither predict nor comprehend what happens next. Somewhere between 2001 and 2024, the world became incoherent, the zeitgeist became impossible to pinpoint, and the maelstrom of information overwhelmed us absolutely. There are moments when things seem coherent: flash points and lightning rods that briefly focus the news, social media, and the global conversation. 9/11! Arab Spring! Ukraine/Russia! January 6th! Sam Altman was fired! Each of these felt like a crystallization of its era's themes and actors, but those eras come and go so fast now; what did they really signify? The moments and themes weave in and out of each other so quickly that there's little to no coherency. I think we underestimate the extent to which our own bubbles amplify the things we think are important and symbolic. And all of these moments get swept up by the next thing. We can't stay on one point; we can't connect the dots into themes. The theme is chaos.
It's within that context that AI arrives on our shores. And I don't think sci-fi really prepared us for this. It's fascinating that sci-fi narratives obsessed so much over space, dystopia, and robots, yet only a few really explored the implications of things like the internet, mobile phones, and now AI that can do nearly everything we thought humans were good at. It's like that adage: when they were building the highways, no one could have predicted Walmart. The knock-on effects of systemic change are unpredictable, and AI is a systemic change of exponential proportions.
We already live in a world that is too complex for our Cro-Magnon brains to comprehend; how do we expect to interact with entities trained on the very world we're overwhelmed by? Our brains are wholly unprepared for a world seasoned with AI over everything. It's going to be the ubiquitous ingredient. Or will the opposite occur: by funneling an overwhelming world through the filter of AI, will the world become more lucid and simple to our feeble minds? Every day, we rely on algorithms in social media and the press to serve us the portions of information we can digest. The front page of my life is mediated absolutely by AI. Without those algos, the world seems incomprehensible and overwhelming. At the same time, generative AI is producing an explosion of content that will inevitably compound this information overload even further. It seems to me that humanity's entropy trends toward ever-greater sociological and informational complexity.
So where does that really leave us? We're going to be in an uneasy relationship with AI, asking it to help us comprehend what's going on in the world (not just by interpreting that world but by funneling us to the people who can), while it simultaneously makes the world more complex, not just through the creation of synthetic data but through its unpredictable, Walmart-like knock-on effects. It's the elephant in the room that takes over the whole room.
That's why I think this project of co-existing with AI is so futile. You're damned if you use AI and you're damned if you don't. The Pandora's Box allegory never felt so pertinent: the moment you open the box, everything changes and you can't look back. We're here now. Strap in.