Our Natural Human Defenses Against A.I. Culture
Culture is made of humans, and A.I. doomsayers haven't explained why we'll all fall in love with A.I.-made creative work when there is so much evidence to the contrary
In the late 1950s, East German teenagers often tuned in to Radio Free Berlin to hear contraband Elvis Presley records, and soon they began to imitate the frenzied dance steps of rock ‘n’ roll at their social gatherings. To defend socialism against these encroaching American influences, Communist authorities invented a new dance for youth: a pseudo-Latin fast waltz called “the Lipsi.” Yet, despite the full East German propaganda machine promoting the Lipsi, authorities couldn’t budge youth from their love of rock ‘n’ roll. In 1959, brave teenagers took to the streets of Leipzig in makeshift pro-Elvis protests, chanting, “We want no Lipsi!”
As a designated set of arbitrary body movements to accompany music, the Lipsi was indeed a dance, and maybe even a pretty good dance. But the Lipsi failed to become culture because teenagers refused to take up those arbitrary movements and imbue them with any kind of authentic meaning or value. No matter how much power the Communist Party wielded, East German teenagers easily rejected the Lipsi. The government proposed, teens disposed.
Today the Lipsi is little more than a footnote of Cold War history, but it should guide our thinking about whether cutting-edge generative A.I. technology is going to displace genuine human creativity. Generative A.I. is suddenly able to make texts, images, and even songs that seem more or less human. This development led Yuval Harari, Tristan Harris and Aza Raskin to write a bombastic New York Times op-ed, in which they warned, “A.I. could rapidly eat the whole of human culture — everything we have produced over thousands of years — digest it and begin to gush out a flood of new cultural artifacts. Not just school essays but also political speeches, ideological manifestos, holy books for new cults.” Since culture undergirds our interpretation of reality, they argue, A.I.-created culture will alter our collective perceptual frameworks and hurl civilization towards self-destruction.
These alarming predictions, however, aren't based on an understanding of how culture actually works. As much as A.I. has gotten better at making human-like art, creation is arguably the easiest step in the process of cultural formation. Every day thousands if not millions of people propose new ideas and practices, but only a tiny sliver ever wins attention, attracts adherents, or takes on collective meanings. Ironically, generative A.I. has emerged just as we're already enmeshed in a human-created crisis of cultural overproduction. Most human-created art today fails to take on social value. Musicians and labels upload over 100,000 songs to Spotify each day, and yet this market chaos has only pushed consumers to concentrate on a small stable of veteran entertainers, such as Beyoncé and Taylor Swift.
Those panicked about A.I. culture offer little explanation for why computer-created pseudo-culture will instantly receive widespread interest and reverence when 99.99% of human symbolic output can’t. Harari and co. seem to believe that computers will achieve great creative prowess because they have already mastered chess and Go, but this equivalence is an overreach. At the moment, generative A.I. produces works by repurposing a large body of pre-existing material, which means it mostly recycles clichés and drips out stale kitsch.
But let’s say some piece of interesting mechanized culture cuts through the noise and receives human attention. Or suppose computers are even programmed to "jump" outside of convention and create unconventional outputs. At that point, we would still have another strong neurological defense: Humans dislike arbitrariness. Even with rule-breaking art, we only accept the rule-breaking when we believe its creator is a genius, not a madman. Appreciating cultural innovation requires a belief that there is intention behind it. There is already good research demonstrating that our brains reject visual art that seems to lack intention. Danish psychologist Ulrich Kirk and his colleagues placed subjects in an MRI scanner and showed them abstract images labeled either as venerated works from a museum or as automated generations. The subjects not only rated the supposedly human-crafted images as “more attractive,” but the premise of human creation alone increased neural activity in the brain’s reward and memory systems. In other words, brains engage more with cultural artifacts when they’re perceived to be made by humans for a reason.
There is no doubt that talented humans will harness A.I. as a tool for creating vital new artifacts, just as they did with photography, motion pictures, desktop publishing software, and computer graphics. And there are certainly many legitimate worries about A.I. spreading misinformation and replacing human labor. But it’s misleading to declare generative A.I. the obvious end of human cultural creation, when culture is inherently made of humans. The entire value of culture is rooted in individuals sharing conventions with others in their communities — and we have natural barriers that prevent us from centering our communal lives around A.I.-created artifacts.
That being said, we need not sleepwalk into this conflict. The best ways to stay vigilant are to (1) push for widespread understanding of the inherent humanity required to make true culture, and (2) celebrate complex, rule-bending art rather than spending our attention on cynical commercial products made by committee, based on pre-existing formulas. There has never been a better time for the world to become expert in distinguishing art from kitsch. But we can take brief solace in the fact that it was organically created rock ‘n’ roll, not the artificially created Lipsi, that went on to shape global culture. A.I. can and will propose an infinite number of new cultural directions, but just like the Leipzig teenagers, we can always stand up and say no.