
Fish Magic, 1925, Paul Klee

It is said that a fish, even a really smart one, cannot really grasp the meaning of the concept “wet”, because wetness is the only condition it has ever known. There is no “dry” to compare it to.

Humans, too, have a tendency to imagine that the way things are today is the way they’ve always been, or the way things will be from now on. It’s hard to imagine that perhaps we are merely living in a transitional period where our worldview is under a temporary spell, soon to revert to the way things have always been.

It has been observed, for example, that representational art — paintings and sculptures intended to mirror what we see with our eyes — has, for most of human history, been the exception, not the rule. Optical representationalism has been the dominant art form for only a few centuries, and only in a few limited places: in Greece and Rome in ancient times, and more recently in Europe from about 1500 to 1900. Outside of those periods and places, most of our art has been highly stylized or completely abstract, from cave paintings to hieroglyphics, from Islamic mosaics to Kandinsky’s paintings.

Viewing modern abstract art as a kind of degeneration from representational art, as many still do, presumes that representation is somehow the “normal” way of doing things. But history shows that this is simply not true. Representational art was and still is a kind of fashion or style, a way of thinking about artmaking that utterly infatuated mankind for a long while, but which eventually receded into the general pool of possible artistic expressions.

The 20th Century Fishbowl

Looking back on the 20th century and the new forms of media and culture that it produced, I’ve noticed an interesting phenomenon: Many of the fascinating social and cultural changes transforming the media right now, in the early years of the 21st century, are little more than reversions back to the ways things used to be before the 20th century. When we talk about “revolutions” in technology and media and how they impact our culture, we should remember that a revolution is a 360-degree trajectory, bringing you right back to where you started.

The 1900s saw the emergence of a dozen new forms of media and communication, from mass-market publishing to television to online social networks. Each new medium’s birth was followed by decades of adaptation to it, both social (how new media change our day-to-day lives) and economic (how these media have been “monetized”). And as each medium reaches maturity and settles down, it’s surprising how many of the social and economic changes turned out to be less earth-shaking than we may have thought. In many cases, we’ve come full circle.

Adopt, then Adapt

The 20th century was a period of continuous infatuation with new technologies, particularly in the media, that felt so powerful that we sometimes thought that these technologies were fundamentally transforming, or even doing irreparable damage to, our culture and our world.

And the evidence for the latter is certainly compelling: Families don’t talk at dinner tables anymore, and instead gather around the TV to watch hours of game shows. We spend hours each day driving in cars by ourselves, polluting the atmosphere. Kids glued to mobile phones in schoolrooms. Reality TV. Internet porn. Britney Spears. Have technology and media really made our lives better?

I actually think we’re not doing so badly. Many of the 20th century’s most infamous technology-enabled cultural degradations may, in fact, merely be temporary effects which inevitably trend back to “normalcy”. In the early 20th century, for example, we invented the automobile and drove around with reckless abandon. But then, after countless accidents and horrific smog, we eventually licensed drivers and regulated the vehicles and roadways. Still later, we crashed our cars reading SMS messages on the freeway, but then we made driving while text messaging illegal. We adopted, then adapted. I hate to characterize this in dialectical terms, but much of it has a distinctive thesis/antithesis/synthesis feel to it.

Some examples of 20th-century phenomena whose transformation has, I think, been exaggerated:

  • Reading: Much has been said about how “nobody reads anymore”. Steve Jobs recetnly scoffed at the Amazon Kindle, saying “Forty percent of the people in the U.S. read one book or less last year. The whole conception is flawed at the top because people don’t read anymore”. Despite the numbers, which I don’t doubt, I’ve always been suspicious of the claim that we are less literate than we’ve been historically or than we should be. How much were people reading, say, in 1500, or in 500 BC? Or even in 1850 or 1900, before mass-market paperback books and magazines were invented? Ursula LeGuin wrote a fantastic deconstruction of this accusation in February’s Harpers magazine, in a piece called “Staying awake: Notes on the alleged decline of reading” (print only – come on, Harpers!). Her gist is that most people never really read all that much anyway, and in that light people are actually reading quite a bit right now. I’ll also add that the supposed high point of human literacy, which I gather to be the late 1800s and early 1900s, was also the point at which new information technologies exploded onto the scene: the telephone, the phonograph, radio. If people are reading less but are instead learning things via the spoken word in an electronic medium, is that so bad? Were the books and periodicals of the fin-de-siècle any better than the electronic forms that replaced them?
  • News: People complain about the increasing partisanship and corporate bias of the news media. Most of us take for granted the idea that a news organization must be “impartial” or non-partisan. But when was this idea born? I’m not a news historian, but I’d guess that it emerged sometime around the middle of the 20th century, in particular with the large American corporate news organizations that wanted to avoid favoritism and partisanship in order to maintain a consistent flow of advertising dollars. Before that, however, newspapers were completely dominated either by overt political interests or by their governments. Outside of the USA, too, this is still largely the case. But with the recent emergence in the US of deeply partisan mainstream news media (e.g., Fox News) and the global phenomenon of blogging and citizen/advocacy journalism, we are perhaps witnessing not the emergence of something new or unique, but rather the end of a strange and rather short (50 years?) period in the history of news and information.
  • Music: I wrote about this in my last post, which is what inspired this one. Music was once something you could only enjoy as a live experience, in the presence of performing musicians. The 20th century brought us recorded music, which could be bought and sold. This gave everyone the idea that music itself could be bought and sold. With the emergence of digital file sharing, this model is being broken down again, leaving us in a place very similar to where we started, with music being un-ownable, but the experience of music enjoyment being entirely sellable.
  • Food: Okay, this isn’t media, but it is definitely technology: From the 1920s to the 1990s, the American diet was infatuated with technologically processed food. Michael Pollan calls this “nutritionism”, a dietary theory that values the chemical composition of food products over their integral food-ness, in which a loaf of white bread with all the nutrients bleached out of it and then re-introduced through chemical “enrichment” is somehow better than a loaf of whole grain bread. The same adopt-then-adapt pattern is here: Humans became so enamored with food technologies — canning, preservatives, refrigeration, and nutritionism — that our diet turned away, for the first time in a million years, from real food. After a few generations of this, and after witnessing the resulting horrific health effects, we eventually began to turn away from these foods. Supermarkets now have enormous fresh fruit and vegetable sections in them, including organic foods. But when I was a kid in the 1970s, a trip to the supermarket was like going to a bomb shelter — canned, processed, and frozen foods were pretty much all you could get, because that’s what people wanted. The more the food was abstracted from nature into powders, spreads, flakes, and puffs, the more people desired it — because they perceived it as futuristic, healthy, and convenient. Once we started to realize that the old ways actually had value, when the novelty of snow-white bread and powdered milk wore off, we began to ask for regular food again.

Once I started seeing things this way, I began noticing the pattern everywhere: A 20th-century phenomenon is presumed to be eternal, and then its decline is lamented as if it were the end of civilization itself. I learned that nobody plays bridge anymore — but I also learned that contract bridge wasn’t even invented until 1925, and had a run of massive popularity for only a few decades before falling into decline by the late 1960s.

Same as it Ever Was?

My whole idea here is admittedly an optimistic argument (and a slightly conservative one, I confess) in which humanity learns valuable lessons by looking toward our past, and where the most troubling social and cultural trends of the 20th century turn out to be merely side-effects of our slow adaptation to rapidly-emerging technologies.

But the opposite is certainly possible: Humanity could continue trending towards technology-enabled illiteracy, junk food-induced decrepitude, social isolation, and retarded media completely controlled by corporate conglomerates. We could quite easily end up with Idiocracy. I could be completely wrong.

Yes, changes occur. Humanity’s greatest social and technological inventions — the wheel, writing, democracy and human rights, the printing press and the Internet — surely have fundamentally transformed the human experience. Some have even speculated that these technologies have brought about physiological changes to our brains, enabling us to use our minds in ways that our ancient ancestors simply could not (see Julian Jaynes and The Origin of Consciousness in the Breakdown of the Bicameral Mind). This may be true (I am skeptical), but I think in the case of most of the 20th century’s most interesting transformations, despite the constant seemingly earth-shattering changes, we are what we are and we will tend to adapt the technology to us, not the other way around.


Comments

12 responses to “The Peculiar 20th Century”

  1. […] graphpaper.com – The Peculiar 20th Century “Many of the fascinating social and cultural changes transforming the media right now…are little more than reversions back to the ways things used to be before the 20th century.” Fahey on the anomaly that was the 20th century. (tags: 20thcentury revolutions technology innovation changes adoption adaptation) […]

  2. Dalriada

    Unfortunately, we’ll never be able to eradicate this way of thinking. Society in general only remembers the time that they have living witnesses for – 100 years at most, but 70-80 on average. And they’ll only criticize those aspects that either really worsened in some subjective way, or that can only be experienced emotionally and cannot be measured for a scientific comparison. For instance, people will criticize the deterioration of family values, but not changes in medical care (which is certainly improving each day).

    It’s particularly short-sighted to criticize changes in such things as family life, childhood, reading habits, education etc., which are closely bound to the lifestyle of the middle class. As we know, the history of the middle class is fairly short and new. Comparing, for instance, reading habits at the turn of the 18th to 19th century with those of today is simply not fair, because back then the percentage of people who had access to education and books at all was fairly low. The rise of the middle classes, widespread access to higher education, social and medical care, the subsequent low mortality rates, high literacy rates – these are all things that started flourishing in the years after WWII. So, when comparing standards of living, people only have the past 50-60 years in mind at most.

    And this is really short-sighted when juxtaposed with the thousands of years before, in which humanity was developing – none of which I would rather have lived in than this one. I’ve paid attention in history classes and I’m pretty sure that now we (in general, not in some Western ethnocentric way) are better off than ever before.

    Yes, OK, so I figure we agree on that. But I must admit that the first part of the text concerning art was a real eye-opener for me. My outlook on art obviously being conservative, I had believed until now that representational art really was the standard. Thank you for giving me the chance to see it the other way. I feel so ashamed now… 🙂

  3. Dear Graphpaper.com (Hi Chris):

    Perchance, do you teach cultural studies? 🙂

    Reading your comments, I’m reminded of a brilliant quotation I read in Another Magazine a few years ago:

    “We live in decaying times. Young people no longer have respect for their elders. They inhabit taverns and have no self-control”
    -Inscription, 4000 year old Egyptian tomb

    What you are mentioning here is one of the most fundamental flaws in human perception. We base all our observations of the world, and everything in it, on our current state of being. It goes even further: those observations are influenced by our moods. If we are happy, then all the world is well, regardless of who is dying.

    I will definitely side with your optimistic take on this. The fact that we are moving back towards natural foods and away from destructive influences gives me hope. It’s as if humanity itself has a natural gravitation towards balanced human nature.

  4. Cesareripa

    recetnly scoffed at the Amazon Kindle, saying “Forty percent of the people in the U.S. read one book or less last year.

    Try proofreading.

    Try “…one book or fewer”

    time that they have living witnesses for

    Try ” …time for which they have wittneses..”

    I gave up, the writing is so bad. Living proof that reading and writing is in steep decline…thank you. Scoff as you will.

    However, music has been owned for centuries. That is called patronage, commissions, copyright, printing, etc. Beethoven, Bach, Mozart, long before the last century and the advent of the recording industry, sought to find “owners” for their music. Alas, only the “owners” could enjoy it, until it was circulated. Allegri’s great “Miserere” was protected by the Vatican, unpublished, secret. Mozart heard it as a child and went back to his lodgings and wrote it out from memory. Did he steal it?

    It is not over…it is just beginning.

    Bluebirdscastle

  5. @Cesareripa: Your pedantic and erroneous (you cannot read “fewer” than one book, especially if you are an aggregated statistic) proofreading of (a) a spoken quote by Steve Jobs and (b) the words of another commenter, whose writings here are ironically no more speckled with errors than your own, suggests that reading may, indeed, be in steep decline — towards a sad future in which only pedants are allowed to write and read.

    Anyway, patronage and commissions are fundamentally different from the kind of ownership of recorded music we unfortunately still have today. You even put the word “owner” in quotes presumably because you intuit that patronage of an artist and ownership of a CD are qualitatively different things.

    And therein lies my point — that we may be returning to an era where if you want to really exert economic control over your music one of your options (and there are many others) is to earn the respect of a patron — that is, to produce your music for a commission rather than for per-unit sales.

    If you want to protect your music from someone taking it as their own, whether for personal enjoyment (as in the case of P2P MP3 file sharing) or for appropriation into their own art (as in Mozart’s use of “Miserere”), indeed your only option will be to keep it secret and never play it for other people.

  6. It’s kind of ironic that Cesareripa felt the need to point out a few strict grammatical errors in this story. What a freakin’ tool. It only reinforces the whole point of the article. He assumes that language is fixed, and that the only proper way to write and use words is the way they’re used today. But if Cesareripa looked back say 100, 200 or 300 years, he’d find that language and grammar have changed dramatically over the years. It has gone from relatively unstructured back in Chaucer’s time, to very structured in the past 150 years, and is now shifting into a more unstructured state. (Just look at how text messages and chat rooms have changed the way we write. LOL)
    I loved this article, finding it very insightful and thought provoking. So Cesareripa, rip people’s grammar apart all you want, you’re a dinosaur trying to keep English in a jar. Teenagers in California have a bigger influence over the language than you do, jackass. Get off your high horse and let it kick you in the nuts. Maybe it will pop your head out of your butt. Then try proofreading your own letters. They’re not exactly perfect. Bonehead.

  7. @Jamie Vandermoer: A little harsh there, eh? Plus, I don’t quite agree that ceding American English to California teenagers is such a great idea. One can be in favor of writing well without being pedantic. Your own use of capitalization, punctuation, and spelling shows that you, too, don’t exactly advocate tossing such things to the wind.

  8. Brant Louck

    Dear Chris, I have been thinking about this exact same phenomenon. While I’m happy to let some technological advances of the 20th C pass away, such as processed food and “owning” music, the belief in an ideal of non-biased journalism is one that I wish had a longer life span.
    There is another 20th-century thing that I want to bring back: fast electric devices. I’m not sure how it became acceptable for there to be such incredible lag times in our consumer electronics. My phone requires 45 seconds to turn off and back on. A DVD player makes me wait 12 seconds or more to eject a disc AND it makes me watch the Interpol warning that I used to fast-forward through with old VHS. My iPod takes 20 seconds to sort itself out before it lets me unplug it. I have to hold a button down for 10 seconds to make it turn off. Sometimes I move through menus too fast and have to wait for the iPod to catch up. My 10GB iPod in 2001 was lightning fast compared to the one I have now. I have to hold down the “unlock” button on my car key fob for five seconds, twice, to fully unlock my car.
    I realize that I’m talking about seconds, but my CD player in 1992 could eject a disc immediately. My Sony Walkman and my beeper in 1993 could be turned on and off instantly with a sliding switch. My AMC Hornet unlocked with a quick turn of a car key.
    In this day and age, when our lives are increasingly dictated by the speed and ease of our transactions and communications, this ongoing, spreading time lag is seriously bumming me out! And (like the acceptance of watching horizontally distorted CNN on widescreen public TVs) nobody is addressing this problem.

  9. Hey Chris,

    I’d like to commend you on the insightful nature of your post. I’ve definitely found myself musing over such issues (at least those of immediate concern to me) as music, art, and education.

    As somewhat of a pessimist myself, I found it nice to read your optimism. I do agree that, by historical comparison, technology has given the greater audience a much-needed connection to information (and you’re quite right in your “adopt, then adapt” theory). But I think we’re moving at a significantly slow pace towards interest and understanding. Even if we are largely becoming an interconnected, knowledge-sharing, media-saturated culture, the adaptation to these changes is quite slow.

    Steve Jobs’ point that a large percentage of people read one book or less a year was apt, but I believe it is an insight into an even bigger issue. The general public has no drive for education. We’re still largely hooked on stardom, from kings and princesses to rock stars and technological frontiers. I don’t see people considering the historical contexts that envelop such phenomena. We largely take the current world around us for granted, without thought for how it will impact the future, or for why we are where we are today. There is no incentive for discovering the inner functions of a system, theory, object, artwork, or any given artifact. I think such lag has largely stemmed from a quite sluggish pace in educational reform and evolution. We generally still learn from a specific trade perspective, in the sense of how it will benefit us in our lifetime.

    Having said that, I’d like to add that music is finally moving towards something more creative. As long as it is owned, it risks giving up its own artistic license. It really is an art form that, sure enough, people will pay money for. But that is only one half of the equation, and the majority of people involved with “owning” music are specifically concerned with that half (which makes it a full puzzle for the rest of us). The other half is creativity and the ability to express artistic freedom. Music is something that should be explored, not commodified.

    The Internet has begun to break down many traditional regimes, but I fear that it won’t be long before another comes to pick up the reins where they’ve been dropped… think Google.

    You might be interested in Mark Federman. He talks a lot about interaction and the modes of learning. This piece by him is quite revealing: “Why Johnny and Janey Can’t Read, and Why Mr. and Ms. Smith Can’t Teach”

    http://individual.utoronto.ca/markfederman/

    Cheers

  10. @Pavel: I like the fact that you read my post as “optimistic”. Because it certainly is. To illustrate, I want to respond to one of your propositions, namely that “The general public has no drive for education.”

    I’d argue that today’s 9th graders are far better educated than 9th graders were 20, 40, or 100 years ago. Maybe they don’t study Latin and Greek, but they understand computers and media, and they have a far more complex world to grasp in general. I have nothing to base this on, but I’ll bet math and science classes are way harder now than they used to be. And of course, as Ursula LeGuin argues, reading in the general public is alive and well.

    But I’m not trying to say you are wrong. Where you are correctish in your assertion, and where I see optimism, is in your underlying *hope* and *ambition* that we could be doing even better. That is what I really think: that when we lament the state of our culture in whatever respect, it’s not necessarily because we think things used to be better, but because we think things can and should be even better still, now and in the future.

  11. Not enough battery to go deeper, but the midcentury clustering of broadcast opinion around the consensual mean had more to do with the regulatory environment (i.e. the Fairness Doctrine) than with anything else. Hooah.

  12. AG: You are absolutely right. I make no case as to why news media neutrality happened, but it happened (in America). And it was, and is, kind of peculiar compared to other times and places.

    It is interesting to wonder why it happened here, that is, why a Fairness Doctrine emerged in the first place. Was it driven by corporate competition for limited broadcast licenses? Was it the Cold War? Was it really that much different from other countries where the state had explicit control over the media? And how did American print news end up with the same need to position itself as neutral, too? Was it osmosis from broadcast?