Blog Archives

The Sad Failure of ‘Fahrenheit 451’ to Prevent the Future

Teeming Brain readers are familiar with my longtime focus on Fahrenheit 451 and my abiding sense that we’re currently caught up in a real-world version of its dystopian vision. This is not, of course, an opinion peculiar to me. Many others have held it, too, including, to an extent, Bradbury himself. I know that some of you, my readers, share it as well.

As of a couple of weeks ago, a writer for the pop culture analysis website Acculturated has publicly joined the fold:

Ray Bradbury often said that he wrote science fiction not to predict the future but to prevent it. On this score, Fahrenheit 451 seems to have failed. The free speech wars on college campuses, the siloing effect of technology, the intolerance of diverse political opinions, and the virtual cocoon provided by perpetual entertainment all suggest that Bradbury anticipated the future with an accuracy unparalleled elsewhere in science fiction literature.

It’s a strange irony that, in the age of the Internet, which was supposed to encourage more transparency and debate, the open exchange of ideas is under threat. This was pointed out by another famous science fiction writer, Michael Crichton. “In the information society,” says Ian Malcolm in Jurassic Park, “No one thinks. We expected to banish paper, but we actually banished thought.” Bradbury saw this coming many decades earlier, and he understood why. Exposure to new ideas is uncomfortable and potentially dangerous. Staying safe, comfortable, and equal requires that everyone think identically. Liberal learning, the crucible that forms the individual, is anathema to group identity and cannot be tolerated. If you disagree, you’re morally suspect.

Which is why we need Bradbury’s message today more than ever. In a coda to the 1979 printing of Fahrenheit 451, Bradbury wrote: “There is more than one way to burn a book. And the world is full of people running about with lit matches.”

Full Text: “Ray Bradbury Wrote ‘Fahrenheit 451’ to Prevent a Dystopia. Instead, He Predicted One”

(If you click through to read the full text, be aware that the first paragraph of the piece presents a slightly inaccurate potted history of Bradbury’s career trajectory that implies he only rose to literary prominence with the publication of F451 in 1953. In fact, some of his previous books and stories, including, especially, 1950’s The Martian Chronicles, had already brought him considerable attention and acclaim.)

For more on the same theme, see my previous posts “On living well in Ray Bradbury’s dystopia: Notes toward a monastic response” and “Facebook, Fahrenheit 451, and the crossing of a cultural threshold,” as well as the Strange Horizons essay “The Failure of Fahrenheit 451.”

For thoughts from the author himself, see the 2007 LA Weekly piece “Ray Bradbury: Fahrenheit 451 Misinterpreted,” featuring Bradbury’s comments on the reality of F451-like trends in contemporary society. (However, Bradbury’s comments in that article/interview should be read in tandem with this context-creating response from his biographer, Sam Weller.) Also see Bradbury’s interviews for A.V. Club and the Peoria Journal Star for more observations from him about the encroaching threat of his novel’s realization in the world around us. And see especially his 1998 interview for Wired, titled “Bradbury’s Tomorrowland,” in which he said the following:

Almost everything in Fahrenheit 451 has come about, one way or the other — the influence of television, the rise of local TV news, the neglect of education. As a result, one area of our society is brainless. But I utilized those things in the novel because I was trying to prevent a future, not predict one.

American media culture as psychic predator and parasite

In a May 21 rumination for The Morning News, James A. Pearson, who “co-founded the humanitarian business Ember Arts and writes from his parallel lives in Uganda and California,” offers an uncomfortable observation about the increasingly heavy psychic net of always-on digital consumer media here in the United States — something to which he is uncommonly sensitive because of his quasi-outsider perspective:

I always binge on media when I’m in America. But this time it feels different. Media feels encroaching, circling, kind of predatory. It feels like it’s bingeing back.

. . . The basic currency of consumer media companies — Netflix, Hulu, YouTube, NBC, Fox News, Facebook, Pinterest, etc. — is hours of attention, our attention. They want our eyeballs focused on their content as often as possible and for as many hours as possible, mostly to sell bits of those hours to advertisers or to pitch our enjoyment to investors. And they’re getting better at it, this catch-the-eyeball game.

All sorts of media companies are deploying new tricks. Facebook notifications are no longer confined to Facebook; they’re on browser tabs, on phones and tablets, in as many emails as you forget to turn off, and recently started to feature an annoying little sound on my laptop (one that can thankfully be turned off, unlike Netflix’s Post-Play). It seems like every new phone app I download wants to send me push notifications, so its developers can grab my attention whenever they like. Even a competitive-cooking show my mom watches on basic cable doesn’t cut to commercial between back-to-back episodes anymore, and is designed so every mid-episode commercial break is also mid-cliffhanger.

. . . The scariest part of this new binge culture is that hours spent bingeing don’t seem to displace other media consumption hours; we’re just adding them to our weekly totals. Lump in hours on Facebook, Pinterest, YouTube, and maybe even the occasional non-torrented big-screen feature film and you’re looking at a huge number of hours per person.

. . . Then there’s the actual content. It’s probably clear to anyone over the age of 18 or so that content has undergone a sort of Incredible Hulk de-evolution that makes it both dumber and somehow also much more powerful. A good example of this (brought to my attention by a random post on Facebook) is TLC, founded as The Learning Channel by the former Dept. of Health, Education, and Welfare, together with NASA, to enrich American minds, but which now grips American eyeballs with Here Comes Honey Boo Boo. Ratings, no doubt, are up.

The media of my childhood, mostly weekly television shows and overused VHS tapes, was like a good pet. Sure, it was a little costly to keep around, but it was lovable, and I could always shut it out in the yard for a while. Now, though, media is always with me, always trying to snag my attention and siphon away as much as possible to sell to advertisers. It feels like it’s evolved from a cute little pet into a frighteningly efficient parasite.

More: “From Here You Can See Everything”

Silence, solitude, and self-discovery in an age of mass distraction

“[T]he internet seizes our attention only to scatter it. We are immersed because there’s a constant barrage of stimuli coming at us and we seem to be very much seduced by that kind of constantly changing patterns of visual and auditorial stimuli. When we become immersed in our gadgets, we are immersed in a series of distractions rather than a sustained, focused type of thinking … There are messages coming at us through email, instant messenger, SMS, tweets etc. We are distracted by everything on the page, the various windows, the many applications running. You have to see the entire picture of how we are being stimulated. If you compare that to the placidity of a printed page, it doesn’t take long to notice that the experience of taking information from a printed page is not only different but almost the opposite from taking in information from a network-connected screen. With a page, you are shielded from distraction. We underestimate how the page encourages focused thinking — which I don’t think is normal for human beings — whereas the screen indulges our desire to be constantly distracted.”

— “Information and Contemplative Thought: We Turn Ourselves into Media Creations,” Interview with Nicholas Carr, The European, January 31, 2012

“Has it really come to this? In barely one generation we’ve moved from exulting in the time-saving devices that have so expanded our lives to trying to get away from them — often in order to make more time. The more ways we have to connect, the more many of us seem desperate to unplug. Like teenagers, we appear to have gone from knowing nothing about the world to knowing too much all but overnight. Internet rescue camps in South Korea and China try to save kids addicted to the screen. Writer friends of mine pay good money to get the Freedom software that enables them to disable (for up to eight hours) the very Internet connections that seemed so emancipating not long ago. Even Intel (of all companies) experimented in 2007 with conferring four uninterrupted hours of quiet time every Tuesday morning on 300 engineers and managers … [T]he average American spends at least eight and a half hours a day in front of a screen … The average American teenager sends or receives 75 text messages a day … We have more and more ways to communicate, as Thoreau noted, but less and less to say … The central paradox of the machines that have made our lives so much brighter, quicker, longer and healthier is that they cannot teach us how to make the best use of them; the information revolution came without an instruction manual.”

— Pico Iyer, “The Joy of Quiet,” The New York Times, December 29, 2011

“I am encouraged by services such as Instapaper, Readability or Freedom — applications that are designed to make us more attentive when using the internet. It is a good sign because it shows that some people are concerned about this and sense that they are no longer in control of their attention. Of course there’s an irony in looking for solutions in the same technology that keeps us distracted.”

— Carr, “Information and Contemplative Thought”

On living well in Ray Bradbury’s dystopia: Notes toward a monastic response

Morris Berman may not have been the first person to offer simultaneous commentary on American culture and Fahrenheit 451 by observing that the former has basically transformed itself into the dystopian society depicted by the latter. Many people have noted in the decades since Fahrenheit was first published in 1953 that things have been moving eerily and strikingly in the direction Bradbury foresaw (or rather, the direction he tried to forestall; “I wasn’t trying to predict the future,” he famously said in a 2003 interview. “I was trying to prevent it.”). But it was Berman who most forcefully affected me with this line of thought when he laid it out in The Twilight of American Culture:

In 1953, Ray Bradbury published Fahrenheit 451 — later made into a movie by Francois Truffaut — which depicts a future society in which intelligence has largely collapsed and the reading of books is forbidden by law. People sit around interacting with screens (referred to as “the family”) and taking tranquilizers. Today, nearly five decades later, isn’t this largely the point at which we have arrived? Do not the data [on the collapse of American intelligence] suggest that most of our neighbors are, in fact, the mindless automatons depicted in Truffaut’s film? True, the story does contain a class of “book people” who hide in the forest and memorize the classics, to pass on to future generations — and this vignette does, in fact, provide a clue as to what just might enable our civilization to eventually recover — but the majority of citizens on the eve of the twenty-first century watch an average of four hours of TV a day, pop Prozac and its derivatives like candy, and perhaps read a Danielle Steel novel once a year.

. . . [T]he society depicted in Fahrenheit 451 has banned books and immerses itself instead in video entertainment, a kind of “electronic Zen,” in which history has been forgotten and only the present moment counts . . . [The novel] is extraordinarily prescient. Leaving aside the issue of direct censorship of books — rendered unnecessary by McWorld, as it turns out, because most people don’t read anymore — most of the features of this futuristic society are virtually upon us, or perhaps no more than twenty years away. [1]


Facebook, ‘Fahrenheit 451,’ and the crossing of a cultural threshold

One of the most subtle and subversive pieces of social criticism in Fahrenheit 451 comes early in the book when Montag, a fireman (i.e., book burner) who eventually wakes up to a recognition of his society’s essential character as a fascist-totalitarian dark age, chats with a teenaged girl named Clarisse. Or rather, it’s she who chats with him. The dumbed-down denizens of Bradbury’s keenly envisioned future dystopia of ignorance, repression, distraction, and dissipation are more fond of television, music, games, sports, sedatives, and other amusements than they are of real human contact, and when Clarisse suddenly shows up, introduces herself, and begins talking to Montag on a succession of evenings as he walks home from work, he’s considerably discomfited. But he finds her intriguing, and eventually he comes to look forward to their talks, so that when she unexpectedly disappears — presumably having been taken away by the repressive central government (a suspicion that’s confirmed later in the novel) — he’s deeply disturbed by it.

At one point in their conversations, he asks her why she isn’t in school. Her response reflects a profound inversion and perversion of what it means to be “antisocial” as judged by the surrounding society:

“Oh, they don’t miss me,” she said. “I’m antisocial, they say. I don’t mix. It’s so strange. I’m very social indeed. It all depends on what you mean by social, doesn’t it? Social to me means talking to you about things like this.” She rattled some chestnuts that had fallen off the tree in the front yard. “Or talking about how strange the world is. Being with people is nice. But I don’t think it’s social to get a bunch of people together and then not let them talk, do you? An hour of TV class, an hour of basketball or baseball or running, another hour of transcription history or painting pictures, and more sports, but do you know, we never ask questions, or at least most don’t; they just run the answers at you, bing, bing, bing, and us sitting there for four more hours of film-teacher. That’s not social to me at all. It’s a lot of funnels and a lot of water poured down the spout and out the bottom, and them telling us it’s wine when it’s not. They run us so ragged by the end of the day we can’t do anything but go to bed or head for a Fun Park to bully people around … I guess I’m everything they say I am, all right. I haven’t any friends. That’s supposed to prove I’m abnormal.”

Although Bradbury’s critique in this passage is aimed largely at the public school system, his description of Clarisse’s ironic plight, in which her authentic human sociability earns her the label “antisocial” — a label that, as the book later shows, is tantamount to a criminal charge in this particular (semi-)fictional dystopia — has wider resonances in today’s world of cultural dominance by social media. In fact, we may be seeing a similar inversion and perversion of language and values play out right before our eyes at this very cultural moment.

On the demise of the Encyclopedia Britannica’s print edition

Have you heard?

After 244 years, the Encyclopaedia Britannica is going out of print. Those coolly authoritative, gold-lettered reference books that were once sold door-to-door by a fleet of traveling salesmen and displayed as proud fixtures in American homes will be discontinued, company executives said. In an acknowledgment of the realities of the digital age — and of competition from the Web site Wikipedia — Encyclopaedia Britannica will focus primarily on its online encyclopedias and educational curriculum for schools. The last print version is the 32-volume 2010 edition, which weighs 129 pounds and includes new entries on global warming and the Human Genome Project.

— “After 244 Years, Encyclopedia Britannica Stops the Presses,” The New York Times, March 13, 2012

New Video: Ray Bradbury on F451, education, life passions, and humanity’s destiny in space

[Image: F451 graphic novel]

Macmillan, the publishing giant, has just made available an absolutely wonderful new video interview with Ray Bradbury as a marketing adjunct for the release of the new graphic novel adaptation of his Fahrenheit 451, for which he wrote the introduction.

(See “Graphic novel of ‘Fahrenheit 451’ sparks Bradbury’s approval,” USA Today, Aug. 3. You can also read and/or listen to the NPR story about this publishing event, which aired a week ago.)

There’s a long version of the video (13 minutes) and a short one (just under two minutes). I recommend the long one. Ray is a very old man now — he’ll reach 89 later this month, two days before I reach 39 — and he suffered a stroke in the late 90s, and these are both evident as he speaks. But danged if he’s not still the same fiery, passionate, brilliant, charming guy who’s been ranting about his epic loves and hates for over 50 years now, and the long video displays this in vivid detail.

Highlights include:

  • Ray’s retelling of the stories, which he’s told a thousand times (without their ever growing old), about his boyhood love of Buck Rogers and Tarzan comics, and of his fateful meeting with Mr. Electrico at a carnival, which earned him the talismanic admonition to “Live forever!”
  • His morning “theater of the mind” (which he has also talked about for decades), in which characters and metaphors zoom through his head when he first wakes up, after which he channels them into his stories.
  • His thoughts on the creative process of collaborating with filmmakers, graphic novelists, play directors, etc. He says Truffaut ruined F451 in the original film adaptation by, among other things, botching the character of Clarisse McClellan.
  • His account of finally overcoming his fear of flying some years ago when he convinced Disney to fly him home from the opening of Epcot Center — a feat he accomplished by demanding that Disney “pour three double martinis” down him first, after which he says they “poured me into the plane.”
  • His update about the status of the planned new film adaptation of F451. He says the script is ready and director Frank Darabont is “a good friend,” so he’s convinced the film will happen. But he points out that Mel Gibson, who backed out a few years ago to make The Passion of the Christ, still owns the rights and is refusing to fund the project. Bradbury points out that Gibson is “a very rich man,” since he made $500 million from “his Jesus movie.” Ray describes the current situation, in which he and Darabont have to go looking for money, as “stupid,” even as he expresses optimism about the final outcome.
  • His assertion that America’s problems will all work themselves out if we devote ourselves to figuring out how to teach reading effectively to young children.
  • His all-encompassing life prescription to “do what you love and love what you do.”
  • His assertion, still vibrant in full force after many decades, that the human race’s future is among the stars, and that we absolutely must return to the moon, and then colonize Mars, and then move outward and onward.

This guy is a force of nature. His very persona and personage have become for me inextricably intertwined with his work, to the point where each reinforces the other, in the same way that it is with Lovecraft, Ligotti, Nietzsche, and a number of my other literary lights. God bless ya, Ray.

The Internet is melting our brains

The current issue of the Atlantic Monthly (July/August) has an interesting cover story by Nicholas Carr — “Is Google Making Us Stupid? What the Internet Is Doing to Our Brains” — about the effects of the Internet revolution on human cognition. I bought the issue at the airport last weekend while waiting for my flight to Mo*Con III and found it to be quite a worthy read, especially since the author’s description of some of the changes he has noticed in his own mental life under the spell of perpetual Internet usage parallels certain effects that I’ve been noticing in myself for the past several years.

He writes:

Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn’t going — so far as I can tell — but it’s changing. I’m not thinking the way I used to think. I can feel it most strongly when I’m reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.

I think I know what’s going on. For more than a decade now, I’ve been spending a lot of time online, searching and surfing and sometimes adding to the great databases of the Internet. The Web has been a godsend to me as a writer. Research that once required days in the stacks or periodical rooms of libraries can now be done in minutes. A few Google searches, some quick clicks on hyperlinks, and I’ve got the telltale fact or pithy quote I was after. Even when I’m not working, I’m as likely as not to be foraging in the Web’s info-thickets — reading and writing e-mails, scanning headlines and blog posts, watching videos and listening to podcasts, or just tripping from link to link to link. (Unlike footnotes, to which they’re sometimes likened, hyperlinks don’t merely point to related works; they propel you toward them.)

For me, as for others, the Net is becoming a universal medium, the conduit for most of the information that flows through my eyes and ears and into my mind. The advantages of having immediate access to such an incredibly rich store of information are many, and they’ve been widely described and duly applauded. “The perfect recall of silicon memory,” Wired’s Clive Thompson has written, “can be an enormous boon to thinking.” But that boon comes at a price. As the media theorist Marshall McLuhan pointed out in the 1960s, media are not just passive channels of information. They supply the stuff of thought, but they also shape the process of thought. And what the Net seems to be doing is chipping away my capacity for concentration and contemplation. My mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.

Carr goes on to offer a concise and fascinating history of the effects of new communications technologies on human cultures and societies, going all the way back to Socrates’ low view of the development of writing itself (since, as Plato records in the Phaedrus, he feared dependence on the written word would siphon away people’s mental abilities), to the invention of the printing press, to Nietzsche’s admission that acquiring a typewriter had changed the character of his writing. Carr finishes by advising that we should be skeptical of his very skepticism about the Internet, since all revolutions in communication technologies have been met with similar Luddite-esque condemnations. That said, he still holds out the possibility that he’s right, and that something valuable, namely, our ability and even our desire to think and reflect deeply and to have our selves and societies formed and informed by this mental and moral depth, is currently under assault and in danger of being lost:

Perhaps those who dismiss critics of the Internet as Luddites or nostalgists will be proved correct, and from our hyperactive, data-stoked minds will spring a golden age of intellectual discovery and universal wisdom. Then again, the Net isn’t the alphabet, and although it may replace the printing press, it produces something altogether different. The kind of deep reading that a sequence of printed pages promotes is valuable not just for the knowledge we acquire from the author’s words but for the intellectual vibrations those words set off within our own minds. In the quiet spaces opened up by the sustained, undistracted reading of a book, or by any other act of contemplation, for that matter, we make our own associations, draw our own inferences and analogies, foster our own ideas. Deep reading, as Maryanne Wolf argues, is indistinguishable from deep thinking. If we lose those quiet spaces, or fill them up with “content,” we will sacrifice something important not only in our selves but in our culture.

Any longtime reader of The Teeming Brain will know that I exult in finding such thoughts and feelings expressed so well, and also in expressing them myself. From Neil Postman’s Amusing Ourselves to Death and Technopoly: The Surrender of Culture to Technology, to Theodore Roszak’s Where the Wasteland Ends, to Daniel Boorstin’s The Image: A Guide to Pseudo-Events in America, to dystopian fictional visions like Ray Bradbury’s Fahrenheit 451, Aldous Huxley’s Brave New World, and, most recently, Paolo Bacigalupi’s Pump Six and Other Stories (for which I wrote a glowing review in the new issue of Dead Reckonings), I am fascinated by the exploration of the changes that modern mass and digital communications technologies are wreaking upon our civilizations and cultures.

Like Carr’s, my fascination has a personal aspect. Ever since I was an undergraduate student majoring in communication and minoring in philosophy at the University of Missouri, where I was introduced to culture and media criticism and the high intellectual tradition of the West (and also the East), I have been obsessed with understanding the personal effects of the technology and mass media cocoon into which I was born, and which has grown astonishingly more comprehensive and complex in just my lifetime, which hasn’t yet reached its 40th year. Most recently I have noticed that my entry into the Internet world, which occurred definitively in 1996, has produced a progressive change in my attention span and concentrative abilities exactly like the one Carr describes.

Lately I have been taking steps to remedy that. I have reduced my online time (although not my total computer usage time). I have deliberately sought out a few long works of fiction to read. Interestingly, my ability to read long-form works has been affected almost exclusively in the realm of fiction. I am able to read nonfiction just fine. But I have noticed a growing impatience with long fictional works over the years that is attributable, when I reflect on it and trace it, to the very phenomenon Carr describes. At present I’m pleased to report that I am in the process of successfully rehabilitating myself.

Lest anybody think that these fears are new, I’ll give the last word to Bradbury himself. About a year ago (May 30, 2007), L.A. Weekly published a fine article about him titled “Ray Bradbury: Fahrenheit 451 Misinterpreted” that featured a present-day Bradbury arguing that his most famous novel is not really about censorship, as has long been the received opinion among the general public and the literary establishment, but is instead about the insidious and pernicious effects of television on society. Bradbury is convinced — and so am I — that present-day trends in television and American society confirm the book’s warning.

The author of the article wisely delved into Bradbury’s history and discovered a letter Bradbury wrote in 1951 to Richard Matheson that covered the same territory. The words of the then-thirty-something Bradbury about the effects of radio on people’s ability to think, concentrate, and read serve as a fascinating touchstone to Carr’s Atlantic article, written 57 years later, about the effects of the Internet on the same activities:

As early as 1951, Bradbury presaged his fears about TV, in a letter about the dangers of radio, written to fantasy and science-fiction writer Richard Matheson. Bradbury wrote that “Radio has contributed to our ‘growing lack of attention.’… This sort of hopscotching existence makes it almost impossible for people, myself included, to sit down and get into a novel again. We have become a short story reading people, or, worse than that, a QUICK reading people.”

If Bradbury was right then, and if Carr is right now, then we have been living through the intellectual fall of our civilization for more than half a century, and have been dressing it up and passing it off to ourselves en masse as a wonderful, liberating cultural advance. This bears much reflection and meditation — if, that is, we’re still able to do it.

Anna Nicole Smith Is the Fourth Horseman

The only daily newspaper that originates from my part of the world is The News-Leader, published in Springfield, Missouri. It blankets southwest Missouri and part of Arkansas.

Last Tuesday, February 13th, editorial page editor Tony Messenger posted a brief observation at his blog, “Ozarks Messenger,” titled “A sign of the apocalypse…” It read as follows:

“I know that just by posting this I have become part of the problem, but I’m amazed at the coverage of the Anna Nicole Smith death and impending fight over her estate and paternity of her child. According to this study, the story has consumed more than 50 percent of cable news time. Between that and astronaut/diapergate, it’s amazing there’s any time for important coverage, such as, oh, I don’t know, a little war, health care, presidential politics. How low we as an industry, and a community, have sunk.”

I’ve really enjoyed Mr. Messenger’s handling of the paper’s editorial page ever since he took over from longtime editorial page editor Robert Leger last year, and this recent post is an example of why. I couldn’t help leaving a comment about it at his blog. Naturally, given my penchant for going on — or perhaps going off — about various indicators of cultural decline, my comment quickly bloomed to the length of an essay.

Here’s what I said:

As another commenter has already averred: Amen, brother Tony! I especially like the way you’ve framed this media insanity as an apocalyptic phenomenon. I know it’s become common to refer to things jokingly as “signs of the apocalypse,” but at present the type of idiocy you’ve decried here is hardly a joke, since the takeover of American and Western public life by trash and trivia over the past 30 to 40 years is truly a harbinger of cultural decline.

One of my favorite websites that talks about the “dumbing down” phenomenon (http://nomuzak.co.uk/dumbing_down.html) offers a vivid and accurate description of the way our collective consciousness has been hijacked by meaningless junk that obscures and edges out more serious fare: “In fact, the evidence for ‘dumbing down’ is everywhere: newspapers that once ran foreign news now feature celebrity gossip, pictures of scantily dressed young ladies, and football; television has replaced high-quality drama with gardening, cookery, and other ‘lifestyle’ programmes; bonkbusters have taken over the publishing world and pop cd’s and internet connections have taken over the libraries. In the dumbed-down world of reality TV and asinine soaps, the masses live in a perpetual present occupied by celebrity culture, fashion, a TV culture of diminished quality and range, an idealisation of mediocrity, and pop videos and brands. Speed and immediacy are the great imperatives, meaning that complex ideas are reduced to sound bites, high culture is represented by The Three Tenors and J K Rowling, people spend their spare time reading text messages instead of Dostoevsky, and listening to rap bands rather than Bartok and Stravinsky.”

Although the writer is speaking about Britain — note the British spellings — his words describe the contemporary culture of the U.S. as well. And indeed, he talks about America elsewhere in the same essay.

To speak more from my own personal experience, I can tell you that I teach English at a rural southwest Missouri high school, and whenever I speak to my students, if I want to make reference to any sort of common object of knowledge in order to illustrate a point about the dramatic structure of stories, or about irony or other literary techniques, or about anything else having to do with books and literature – and it’s a daily necessity to refer to a common fund of knowledge in order to illuminate something we’re studying – I find lately that the only thing I can mention with any reasonable expectation of group familiarity is the Harry Potter phenomenon. Almost all of the teens have seen the movies. Several have read one or more of the novels. I can also refer to THE LORD OF THE RINGS, but that’s because of the popular movies; only a tiny minority of students so far (as in, two or three of them) has actually read Tolkien’s books. I do have a student who has read a couple of Robert Jordan’s “Wheel of Time” books, so he has a minor grounding in literary fantasy.

But anyway, I simply can’t expect these kids to know much of anything, not even — and here’s the rub — about pop cultural stuff! It’s astonishing to find how many of them are oblivious to mass media culture. Not that they don’t know the names and faces of actors and bands and other celebrities, but if I mention the name of any movie director besides Rob Zombie, there’s a general look of blankness. I tried it with Spielberg once and had a couple of students respond, none too confidently, “Isn’t he the guy who made Saving Private Ryan?” I’ve also been shocked and dismayed at how many of them are functionally ignorant of Stephen King. Sure, they know some of his movies, but when it comes to the man himself the overwhelming consensus is an attitude of dull, suspicious disinterest, expressed in questions such as, “Stephen King – he’s really weird, right? Like, he’s that horror guy.” So even on the level of the pop culture crap that many of us decry, these kids’ frame of reference is shockingly narrow.

That said, I did find out recently, simply by asking, that they’re all aware of the Anna Nicole Smith “story.” So hooray. I guess.

Here’s what social critic and cultural historian Morris Berman had to say about these matters in his 2000 jeremiad, The Twilight of American Culture:

“In his introduction to the book, Dumbing Down: Essays on the Strip-Mining of American Culture, John Simon notes that a whole world of learning is disappearing before our eyes, in merely one generation. We cannot expect, he says, to make a mythological allusion anymore, or use a foreign phrase, or refer to a famous historical event or literary character, and still be understood by more than a tiny handful of people. (Try this in virtually any group setting, and note the reaction. This is an excellent wake-up call as to what this culture is about, and how totally alien to it you are.) Indeed, using Lewis Lapham’s criteria for genuine literacy — having some familiarity with a minimum number of standard texts (Marx, Darwin, Dickens . . .), and being able to spot irony — it may even be the case that the number of genuinely literate adults in the United States amounts to fewer than 5 million people — that is, less than 3 percent of the total population.

“In 1953, Ray Bradbury published Fahrenheit 451 — later made into a movie by Francois Truffaut — which depicts a future society in which intelligence has largely collapsed and the reading of books is forbidden by law. People sit around interacting with screens (referred to as ‘the family’) and taking tranquilizers. Today, nearly five decades later, isn’t this largely the point at which we have arrived? Do not the data cited above suggest that most of our neighbors are, in fact, the mindless automatons depicted in Truffaut’s film? True, the story does contain a class of ‘book people’ who hide in the forest and memorize the classics, to pass on to future generations — and this vignette does, in fact, provide a clue as to what just might enable our civilization to eventually recover — but the majority of citizens on the eve of the twenty-first century watch an average of four hours of TV a day, pop Prozac and its derivatives like candy, and perhaps read a Danielle Steel novel once a year.”

Okay, so there’s a misanthropic tone there. But, you know, Berman’s point is difficult to argue with, and sometimes the bitter pill is the necessary medicine.

To round out this rambling comment on the aforementioned apocalyptic note of cultural decline, I’ve long been disturbed by the terminal diagnosis of American culture that appeared in Neil Postman’s influential Amusing Ourselves to Death back in 1985: “When a population becomes distracted by trivia, when cultural life is redefined as a perpetual round of entertainment, when serious public conversation becomes a form of baby-talk, when, in short, a people become an audience and their public business a vaudeville act, then a nation finds itself at risk: culture-death is a clear possibility.” I truly think that’s where we stand now, even more so than when Postman penned those words two decades ago. And the fact that the national news media can go into a feeding frenzy over something as patently and disgustingly vapid as the Anna Nicole Smith “story” at a time when America’s foreign and domestic circumstances are as they are only drives home the truth of Postman’s (and Bradbury’s, and Berman’s) Dark Age diagnosis.

So just what the hell is post-modernism?

Background: Last week somebody posts a famous quip from Oscar Wilde at a popular message board: “In the old days books were written by men of letters and read by the public. Nowadays books are written by the public and read by nobody.” This leads to a conversation about what the quote means and whether it still applies today. The question of just what is meant by “men of letters” becomes a live issue, and somebody says, “For my tastes, post-modernism (or is it post-post modernism now) really made a lot of the fiction from these types pretty unreadable.”

This of course opens the floodgates for a conversation about the definition and meaning of post-modernism. I myself reply to the above-quoted assertion by saying, “But that indicates a problem with post-modernism, not with the pre-post-modern (um) writers or their work. I personally have profited enormously from my study and absorption of the post-modern outlook and worldview, but I also harbor a healthy measure of loathing for it because of the very effect you’ve described: that it has served to kick off a kind of semi-dark age by rendering the artistic works of a former age inaccessible to a great many people who were raised and weaned under its philosophical influence.”

Several other people offer their own thoughts about and definitions of post-modernism. Then somebody says she’s still confused and not sure what to make of it all.

Naturally, I’m unable to keep my mouth shut. My extended reply, in which I try to chase down the general meaning of post-modernism, is as follows.

Oh, but before I launch into it: Here’s wishing a very happy Monday to you all. I’m located in southwest Missouri, where we were pretty much pulverized by the massive ice storm that swept through the United States’ midsection over the weekend. Amazing to say, my family and I have not yet lost electricity at our house. But Missouri has been designated a disaster area, a state of emergency has been declared, the National Guard has been called in to help with the cleanup, and as I type these words more than 300,000 people in the state are without electricity and are likely to remain so for three or four more days. And here I sit, still managing to find time to update my blog late on a lazy afternoon (lazy because school was cancelled today and has already been cancelled for tomorrow) in the comfort of my cozy, warm house. Yeesh.

But anyway, somebody in that online conversation last week expressed continuing confusion about the meaning of the term “post-modern,” and so I reached for my keyboard and began to type. . .

* * * * *

The question “What is post-modernism?” has been the subject of entire book-length explorations, so don’t feel bad about your confusion. Nobody’s really sure how to define the whole phenomenon/movement/worldview.

That said, the answer given by the French philosopher and literary theorist Jean-Francois Lyotard in his 1979 study The Postmodern Condition has long seemed the most useful and helpful one to me. And not only to me, but to a great many other people as well. If there’s a standard answer to the question at issue here — and there isn’t — then Lyotard’s is it, by wide popular recognition.

Lyotard was commissioned to write The Postmodern Condition by the Conseil des Universités of the Quebec government at a time when they were considering incorporating computers into university-level education. “Post-modernism” was a buzzword at the time and they wanted Lyotard’s investigation of it to frame their discussions of the computer issue. In the end his book achieved a far wider scope, as indicated by the subtitle he gave it: “A Report on Knowledge.”

His argument is intricate and fascinating but his definition of post-modernism is a one-liner that’s arguably become the catch-all definition. He said post-modernism is, or is characterized by, “incredulity toward meta-narratives.”

That phrase simply refers to the collapse of meta-narratives — that is, totalizing storylines that cultures tell themselves to make sense of their experiences — as believable things. Lyotard also called meta-narratives “grand narratives,” which term may give a better sense of what he was getting at. He looked, for example, at the grand storyline of the 18th century Enlightenment, which told England, Europe, and America that science was the answer to everything, and that it was leading societies into a golden era of reason, peace, and justice. This kind of meta-narrative legitimates or justifies, and therefore elevates, certain types of knowledge, and also certain moral and social attitudes, social practices, political systems, educational practices, and so on. It organizes a society or civilization around a set of guiding principles that determine what does and does not count as “knowledge,” and therefore it creates the foundational assumptions that the society comes to consider as metaphysical givens, as “self-evident.”

Lyotard claimed this type of thing doesn’t hold up any more in the post-modern age when we’ve become all-too-aware of that very process of legitimation, and when computers are redefining the meaning of “knowledge.” We recognize that foundational societal assumptions aren’t objective facts but are instead manufactured agreements. Thus, we come to disbelieve in meta-narratives on principle.

The thing is, this incredulity blankets everything and has an especial relation to the arts, which are so very central in forming and playing upon generalized cultural assumptions. And that’s where the literary connection comes in. In the literary world, the idea of the death of grand narratives can be seen in the deconstruction movement, which holds that authorial intent means nothing, that any text can mean anything, and that the locus of meaning is not in the text but in the reader, or rather in the interaction between the two. (This way of putting it is a crude simplification, but it does get the idea across.) And so this naturally does away with assumptions from former eras about the distinction between high art and low art. Suddenly, everything’s up for grabs, and the aesthetic literary principles that former eras took for granted are regarded as mere ideologies, mere legitimations of certain types of writing.

In fact, the very idea that entire peoples shared the same set of aesthetic assumptions is attacked by post-modernism, one of whose most significant effects has been the “recovery” of “marginalized voices,” such as those of women and — in Western Anglo culture — non-whites. The idea is that throughout history those fictional meta-narratives were not only providing a coherent shared worldview but were also excluding and obscuring other viewpoints and types of knowledge that were just as real and legitimate. Hence the rise of multiculturalism, feminism, and other such movements.

There’s a lot more to say, but maybe I’ve said enough to get the idea across. Andy Warhol’s soup-can art is post-modern because it deliberately sidesteps or negates the traditional artistic goal of expressing a specific meaning, and therefore a mini-meta-narrative, by organizing certain elements into a coherent whole. Warhol took material from everyday life and put it in what seemed like an artistic context, and left it up to the viewer to make sense of it. This is entirely post-modern. In literature, metafictions like John Barth’s “Lost in the Funhouse,” which frequently interrupts the fictional narrative with ruminations upon the writing of fiction itself, qualify as post-modern because they keep on reminding the reader of the fact that he or she is reading a story.

To sum up, consider these definitions of post-modernism that I snagged from the web, which amplify what I’ve been saying here:

“Contrasted with Modernism, whose authors attempted to come to new terms with old ideas in attempt to find the ‘deep structure’ of the human experience, Post-Modernism is identifiable by authors who were highly skeptical of any ‘deep structure,’ regarding all structures as subjective and ideologically tainted.”

“Catch-phrase or jargon term used extensively in film and literary studies to identify certain trends in contemporary media and fiction. Post-modernist works tend to be highly self-referential and are typically saturated with irony and allusion. Such works also tend to subvert traditional models of unity and coherence and instead try to capture the sense of discontinuity and apparent chaos characteristic of the electronic age.”

Obviously, one can see the presence of Lyotard’s influence here. His identification of the central premise of post-modernism has proved most helpful to me personally in my considerations of both artistic matters and other matters, since it provides a satisfying explanation or interpretation of the various fragmenting tendencies of modern Western life. Multiculturalism can be viewed as a post-modern phenomenon since it’s predicated on the idea of multiple legitimate cultural viewpoints — i.e., “knowledge” — that should not be flattened by a single totalizing ideology. The 20th century’s collapse of high culture into low culture and vice versa, not just in the arts but in terms of fundamental American social mores and attitudes, can be viewed the same way. The rise (and possibly, depending on your present viewpoint, fall) of MTV is a product of the post-modern thrust. America’s present “culture war,” including, especially, some of its most prominent manifestations such as the ongoing raging controversy over curriculum issues in public schools, hails from the same philosophical country, since the absence of an agreed-upon grand narrative naturally leaves a vacuum when it comes to the question of what government-sponsored schools should be teaching the nation’s youth to know and do.

The mention of the school issue brings me back to what I said in a previous post about the measure of loathing I feel for post-modernism. While I have profited hugely from studying the movement and looking at the world through its eyes, I have also shared the views of many of its critics who point to the cultural nihilism that’s inherent in the whole thing. If all shared knowledge is merely legitimation, then where the hell does that leave us? In the artistic realm, it leaves us in a place where the shared meanings of former eras become inaccessible to entire generations of people, since these people themselves have little or no idea of what a shared meaning even is. Allan Bloom expressed the idea I’m getting at when he wrote in The Closing of the American Mind about the pitiable state of a hypothetical modern American young person who is ignorant of the “grand tradition” of Western cultural achievement and finds himself or herself wandering through the Louvre or the Uffizi. Bloom says the meanings of the great works of art housed in those places are utterly inaccessible to such a person, who is able to see them only as abstract, as mere form devoid of significance. I personally think the collapse of the high/low culture distinction, along with the cultural gridlock over school curriculum issues and the general cultural disagreement over what’s worth knowing, has produced and is continuing to produce exactly this type of person. I’m talking about the type of people that Ray Bradbury posited in Fahrenheit 451, those robotic denizens of a dystopia who told themselves that they were so very enlightened and happy, but whose thoughts and attitudes were so stunted and infantilized by immersion in trivia and lack of exposure to matters of real depth that they were really just walking corpses. Not to wax too dramatic, but I spend a lot of time around high school kids, and I’m telling you from personal experience that the cultural confusion in this era of the post-modern influence has led to a situation where successive generations of teens are being raised in an intellectual and moral vacuum, and are thus coming perilously close to F451 territory.

Then again, the truth of an idea shouldn’t be judged by its practical utility or effects. As Nietzsche observed in Beyond Good and Evil, in one of my favorite philosophical passages of all time (as evidenced by the fact that I quoted it in my short story “Teeth” in the Children of Cthulhu anthology), “Nobody is very likely to consider a doctrine true merely because it makes people happy or virtuous. . . . Happiness and virtue are no arguments. But people like to forget — even sober spirits — that making unhappy and evil are no counterarguments. Something might be true while being harmful and dangerous in the highest degree.”

And so I’m conflicted over the fact that the post-modern epiphany does seem true, and even inescapable, when I really consider it, and that its influence appears to be largely negative. Does the culture-wide collapse of meta-narratives necessarily result in a state of permanent cultural confusion and a de facto descent into an F451-like dystopia? Was Plato right when he wrote in The Republic that a “noble lie” is necessary to serve as the foundation for the best society? If so, is it desirable — or even possible — for us to pursue such a self-delusion? Or is there a way to avoid an awful cultural fate while still staring unblinking into the void of indeterminacy? I just don’t know.