It is said that a fish, even a really smart one, cannot really grasp the meaning of the concept “wet” because it is the only condition it knows. There is no “dry” to compare it to.
Humans, too, have a tendency to imagine that the way things are today is the way they’ve always been, or the way things will be from now on. It’s hard to imagine that perhaps we are merely living in a transitional period where our worldview is under a temporary spell, soon to revert to the way things have always been.
It has been observed, for example, that representational art — paintings and sculptures intended to mirror what we see with our eyes — has, for most of human history, been the exception, not the rule. Optical representationalism has only been the dominant art form for a few centuries, and only in a few limited places: in Greece and Rome in ancient times, and more recently in Europe from about 1500 to 1900. Outside of those periods and places, most of our art has been highly stylized or completely abstract, from cave paintings to hieroglyphics, from Islamic mosaics to Kandinsky’s paintings.
Viewing modern abstract art as a kind of degeneration from representational art, as many still do, presumes that representation is somehow the “normal” way of doing things. But history shows that this is simply not true. Representational art was and still is a kind of fashion or style, a way of thinking about artmaking that utterly infatuated mankind for a long while, but which eventually receded into the general pool of possible artistic expressions.
The 20th Century Fishbowl
Looking back on the 20th century and the new forms of media and culture that it produced, I’ve noticed an interesting phenomenon: Many of the fascinating social and cultural changes transforming the media right now, in the early years of the 21st century, are little more than reversions back to the way things used to be before the 20th century. When we talk about “revolutions” in technology and media and how they impact our culture, we should remember that a revolution is a 360-degree trajectory, bringing you back to where you started.
The 1900’s saw the emergence of a dozen new forms of media and communication, from mass-market publishing to television to online social networks. Each new medium’s birth was followed by decades of adaptation, both social (how new media change our day-to-day lives) and economic (how these media have been “monetized”). And as each medium reaches maturity and settles down, it’s surprising how many of the social and economic changes turned out to be less earth-shaking than we may have thought. In many cases, we’ve come full circle.
Adopt, then Adapt
The 20th century was a period of continuous infatuation with new technologies, particularly in the media, an infatuation so powerful that we sometimes thought these technologies were fundamentally transforming, or even doing irreparable damage to, our culture and our world.
And the evidence for the latter is certainly compelling: Families don’t talk at dinner tables anymore, and instead gather around the TV to watch hours of game shows. We spend hours each day driving in cars by ourselves, polluting the atmosphere. Kids glued to mobile phones in schoolrooms. Reality TV. Internet porn. Britney Spears. Have technology and media really made our lives better?
I actually think we’re not doing so bad. Many of the 20th century’s most infamous technology-enabled cultural degradations may, in fact, merely be temporary effects which inevitably trend back to “normalcy”. In the early 20th century, for example, we invented the automobile and drove around with reckless abandon. But then, after countless accidents and horrific smog, we eventually licensed drivers and regulated the vehicles and roadways. Still later, we crashed our cars reading SMS messages on the freeway, but then we made driving while text messaging illegal. We adopted, then adapted. I hate to characterize this in dialectical terms, but much of it has a distinctive thesis/antithesis/synthesis feel to it.
Some examples of 20th-century phenomena whose transformation has, I think, been exaggerated:
- Reading: Much has been said about how “nobody reads anymore”. Steve Jobs recently scoffed at the Amazon Kindle, saying “Forty percent of the people in the U.S. read one book or less last year. The whole conception is flawed at the top because people don’t read anymore”. Despite the numbers, which I don’t doubt, I’ve always been suspicious of the claim that we are less literate than we’ve been historically or than we should be. How much were people reading, say, in 1500 or 500 BC? Or even in 1850 or 1900, before mass-market paperback books and magazines were invented? Ursula K. Le Guin wrote a fantastic deconstruction of this accusation in February’s Harper’s magazine, in a piece called “Staying Awake: Notes on the alleged decline of reading” (print only — come on, Harper’s!). Her gist is that most people never really read all that much anyway, and in that light people are actually reading quite a bit right now. I’ll also add that the supposed high point of human literacy, which I gather to be the late 1800s and early 1900s, was also the point at which new information technologies exploded onto the scene: the telephone, the phonograph, radio. If people are reading less but are instead learning things via the spoken word in electronic media, is that so bad? Were the books and periodicals of the fin-de-siècle any better than the electronic forms that replaced them?
- News: People complain about the increasing partisanship and corporate bias of the news media. Most of us take for granted the idea that a news organization must be “impartial” or non-partisan. But when was this idea born? I’m not a news historian, but I’d guess that it emerged sometime around the middle of the 20th century, in particular with the large American corporate news organizations, which wanted to avoid favoritism and partisanship in order to maintain a consistent flow of advertising dollars. Before that, however, newspapers were completely dominated either by overt political interests or by their governments. Outside of the USA, too, this is still largely the case. But with the recent emergence in the US of deeply partisan mainstream news media (e.g., Fox News) and the global phenomenon of blogging and citizen/advocacy journalism, we are perhaps witnessing not the emergence of something new or unique, but rather the end of a strange and rather short (50 years?) period in the history of news and information.
- Music: I wrote about this in my last post, which is what inspired this one. Music was once something you could only enjoy as a live experience, in the presence of performing musicians. The 20th century brought us recorded music, which could be bought and sold. This gave everyone the idea that music itself could be bought and sold. With the emergence of digital file sharing, this model is being broken down again, leaving us in a place very similar to where we started, with music being un-ownable, but the experience of music enjoyment being entirely sellable.
- Food: Okay, this isn’t media, but it is definitely technology: From the 1920’s to the 1990’s, the American diet was infatuated with technologically-processed food. Michael Pollan calls this “nutritionism”, a dietary theory that values the chemical composition of food products over the integral food-ness of them, where a loaf of white bread with all the nutrients bleached out of it and then re-introduced through chemical “enrichment” is somehow better than eating a loaf of whole grain bread. The same adopt-then-adapt pattern is here: Humans became so enamored with food technologies — canning, preservatives, refrigeration, and nutritionism — that our diet turned away, for the first time in a million years, from real food. After a few generations of this, and witnessing the resulting horrific health effects, we eventually began to turn away from these foods. Supermarkets now have enormous fresh fruit and vegetable sections in them, including organic foods. But when I was a kid in the 1970’s, a trip to the supermarket was like going to a bomb shelter — canned, processed, and frozen foods were pretty much all you could get, because that’s what people wanted. The more the food was abstracted from nature into powders, spreads, flakes, and puffs, the more people desired it — because they perceived it as futuristic, healthy, and convenient. Once we started to realize that the old ways actually had value, when the novelty of snow-white bread and powdered milk wore off, we began to ask for regular food again.
Once I started seeing things this way, I’ve noticed the pattern everywhere: A 20th-century phenomenon is presumed to be eternal, and then its decline is lamented as if it were the end of civilization itself. I learned that nobody plays bridge anymore — but I learned, also, that contract bridge wasn’t even invented until 1925, and had a run of massive popularity for only a few decades before falling into decline by the late 1960s.
Same as it Ever Was?
My whole idea here is admittedly an optimistic argument (and a slightly conservative one, I confess) in which humanity learns valuable lessons by looking toward our past, and where the most troubling social and cultural trends of the 20th century turn out to be merely side-effects of our slow adaptation to rapidly-emerging technologies.
But the opposite is certainly possible: Humanity could continue trending towards technology-enabled illiteracy, junk-food-induced decrepitude, social isolation, and a stunted media landscape completely controlled by corporate conglomerates. We could quite easily end up with Idiocracy. I could be completely wrong.
Yes, changes occur. Humanity’s greatest social and technological inventions — the wheel, writing, democracy and human rights, the printing press and the Internet — surely have fundamentally transformed the human experience. Some have even speculated that these technologies have brought about physiological changes to our brains, enabling us to use our minds in ways that our ancient ancestors simply could not (see Julian Jaynes and The Origin of Consciousness in the Breakdown of the Bicameral Mind). This may be true (I am skeptical), but I think in the case of most of the 20th century’s most interesting transformations, despite the constant seemingly earth-shattering changes, we are what we are and we will tend to adapt the technology to us, not the other way around.