Blog Archives

The Sad Failure of ‘Fahrenheit 451’ to Prevent the Future

Teeming Brain readers are familiar with my longtime focus on Fahrenheit 451 and my abiding sense that we’re currently caught up in a real-world version of its dystopian vision. This is not, of course, an opinion peculiar to me. Many others have held it, too, including, to an extent, Bradbury himself. I know that some of you, my readers, share it as well.

A couple of weeks ago, a writer for the pop culture analysis website Acculturated publicly joined the fold:

Ray Bradbury often said that he wrote science fiction not to predict the future but to prevent it. On this score, Fahrenheit 451 seems to have failed. The free speech wars on college campuses, the siloing effect of technology, the intolerance of diverse political opinions, and the virtual cocoon provided by perpetual entertainment all suggest that Bradbury anticipated the future with an accuracy unparalleled elsewhere in science fiction literature.

It’s a strange irony that, in the age of the Internet, which was supposed to encourage more transparency and debate, the open exchange of ideas is under threat. This was pointed out by another famous science fiction writer, Michael Crichton. “In the information society,” says Ian Malcolm in Jurassic Park, “no one thinks. We expected to banish paper, but we actually banished thought.” Bradbury saw this coming many decades earlier, and he understood why. Exposure to new ideas is uncomfortable and potentially dangerous. Staying safe, comfortable, and equal requires that everyone think identically. Liberal learning, the crucible that forms the individual, is anathema to group identity and cannot be tolerated. If you disagree, you’re morally suspect.

Which is why we need Bradbury’s message today more than ever. In a coda to the 1979 printing of Fahrenheit 451, Bradbury wrote: “There is more than one way to burn a book. And the world is full of people running about with lit matches.”

Full Text: “Ray Bradbury Wrote ‘Fahrenheit 451’ to Prevent a Dystopia. Instead, He Predicted One”

(If you click through to read the full text, be aware that the first paragraph of the piece presents a slightly inaccurate potted history of Bradbury’s career trajectory that implies he only rose to literary prominence with the publication of F451 in 1953. In fact, some of his previous books and stories, including, especially, 1950’s The Martian Chronicles, had already brought him considerable attention and acclaim.)

For more on the same theme, see my previous posts “On living well in Ray Bradbury’s dystopia: Notes toward a monastic response” and “Facebook, Fahrenheit 451, and the crossing of a cultural threshold,” as well as the Strange Horizons essay “The Failure of Fahrenheit 451.”

For thoughts from the author himself, see the 2007 LA Weekly piece “Ray Bradbury: Fahrenheit 451 Misinterpreted,” featuring Bradbury’s comments on the reality of F451-like trends in contemporary society. (However, Bradbury’s comments in that article/interview should be read in tandem with this context-creating response from his biographer, Sam Weller.) Also see Bradbury’s interviews for A.V. Club and the Peoria Journal Star for more observations from him about the encroaching threat of his novel’s realization in the world around us. And see especially his 1998 interview for Wired, titled “Bradbury’s Tomorrowland,” in which he said the following:

Almost everything in Fahrenheit 451 has come about, one way or the other — the influence of television, the rise of local TV news, the neglect of education. As a result, one area of our society is brainless. But I utilized those things in the novel because I was trying to prevent a future, not predict one.

What a shame the world isn’t just driving over hills and never coming to a town

The sun was gone. The sky lingered its colors for a time while they sat in the clearing. At last, he heard a whispering. She was getting up. She put out her hand to take his. He stood beside her, and they looked at the woods around them and the distant hills. They began to walk away from the path and the car, away from the highway and the town. A spring moon rose over the land while they were walking.

The breath of nightfall was rising up out of the separate blades of grass, a warm sighing of air, quiet and endless. They reached the top of the hill, and without a word, sat there watching the sky.

He thought to himself that this was impossible; that such things did not happen. He wondered who she was, and what she was doing here.

Ten miles away, a train whistled in the spring night and went on its way over the dark evening earth, flashing a brief fire. And then, again, he remembered the old story, the old dream. The thing he and his friend had discussed, so many years ago.

There must be one night in your life that you will remember forever. There must be one night for everyone. And if you know that the night is coming on and that this night will be that particular night, then take it and don’t question it and don’t talk about it to anyone ever after that. For if you let it pass it might not come again. Many have let it pass, many have seen it go by and have never seen another like it, when all the circumstances of weather, light, moon and time, of night hill and warm grass and train and town and distance were balanced upon the trembling of a finger. . . .

He woke during the night. She was awake, near him.

“Who are you?” he whispered. She said nothing.

“I could stay another night,” he said.

But he knew that one can never stay another night. One night is the night, and only one. After that, the gods turn their backs.

“I could come back in a year or so.”

Her eyes were closed, but she was awake.

“But I don’t know who you are,” he said.

“You could come with me,” he said, “to New York.”

But he knew that she could never be there, or anywhere but here, on this night.

“And I can’t stay here,” he said, knowing that this was the truest and most empty part of all.

He waited for a time and then said again, “Are you real? Are you really real?”

They slept. The moon went down the sky toward morning.

He walked out of the hills and the forest at dawn, to find the car covered with dew. He unlocked it and climbed in behind the wheel, and sat for a moment looking back at the path he had made in the wet grass.

He moved over, preparatory to getting out of the car again. He put his hand on the inside of the door and gazed steadily out. The forest was empty and still. The path was deserted. The highway was motionless and serene. There was no movement anywhere in a thousand miles.

He started the car motor and let it idle. The car was pointed east, where the orange sun was now rising slowly.

“All right,” he said quietly. “Everyone, here I come. What a shame you’re all still alive. What a shame the world isn’t just hills and hills, and nothing else to drive over but hills, and never coming to a town.”

He drove away east, without looking back.

— From Ray Bradbury, “One Night in Your Life,” in The Toynbee Convector

Superfluous humans in a world of smart machines


Remember Ray Bradbury’s classic dystopian short story “The Veldt” (excerpted here) with its nightmare vision of a soul-sapping high-technological future where monstrously narcissistic — and, as it turns out, sociopathic and homicidal — children resent even having to tie their own shoes and brush their own teeth, since they’re accustomed to having these things done for them by machines?

Remember Kubrick’s and Clarke’s 2001: A Space Odyssey, where HAL, the super-intelligent AI system that runs the spaceship Discovery, decides to kill the human crew that he has been created to serve, because he has realized/decided that humans are too defective and error-prone to be allowed to jeopardize the mission?

Remember that passage (which I’ve quoted here before) from John David Ebert’s The New Media Invasion in which Ebert identifies the dehumanizing technological trend that’s currently unfolding all around us? Humans, says Ebert, are becoming increasingly superfluous in a culture of technology worship:

Everywhere we look nowadays, we find the same worship of the machine at the expense of the human being, who always comes out of the equation looking like an inconvenient, leftover remainder: instead of librarians to check out your books for you, a machine will do it better; instead of clerks to ring up your groceries for you, a self-checkout will do it better; instead of a real live DJ on the radio, an electronic one will do the job better; instead of a policeman to write you a traffic ticket, a camera (connected to a computer) will do it better. In other words . . . the human being is actually disappearing from his own society, just as the automobile long ago caused him to disappear from the streets of his cities . . . . [O]ur society is increasingly coming to be run and operated by machines instead of people. Machines are making more and more of our decisions for us; soon, they will be making all of them.

Bear all of that in mind, and then read this, which is just the latest in a volley of media reports about the encroaching advent, both rhetorical and factual, of all these things in the real world:

A house that tracks your every movement through your car and automatically heats up before you get home. A toaster that talks to your refrigerator and announces when breakfast is ready through your TV. A toothbrush that tattles on kids by sending a text message to their parents. Exciting or frightening, these connected devices of the futuristic “smart” home may be familiar to fans of science fiction. Now the tech industry is making them a reality.

Mundane physical objects all around us are connecting to networks, communicating with mobile devices and each other to create what’s being called an “Internet of Things,” or IoT. Smart homes are just one segment — cars, clothing, factories and anything else you can imagine will eventually be “smart” as well.

. . . We won’t really know how the technology will change our lives until we get it into the hands of creative developers. “The guys who had been running mobile for 20 years had no idea that some developer was going to take the touchscreen and microphone and some graphical resources and turn a phone into a flute,” [Liat] Ben-Zur [of chipmaker Qualcomm] said.

The same may be true when developers start experimenting with apps for connected home appliances. “Exposing that, how your toothbrush and your water heater and your thermostat . . . are going to interact with you, with your school, that’s what’s next,” said Ben-Zur.

MORE: “The Internet of Things: Helping Smart Devices Talk to Each Other”

Image courtesy of Victor Habbick / FreeDigitalPhotos.net

Ray Bradbury: A life of mythic numinosity

Ray Douglas Bradbury, photo by NASA (http://history.nasa.gov/EP-125/part6.htm) [Public domain], via Wikimedia Commons

Longtime Teeming Brain readers are well aware that Ray Bradbury frequently comes up in conversation here. Like so many other people, and as I detailed three years ago in “The October Mystique: 7 Authors on the Visionary Magic of Ray Bradbury,” I tend to think of him especially when October and the autumn season roll around. Maybe that’s why I’ve had him on the brain lately. Well, that, and the fact that next week the students in the college class on horror and science that I’m currently teaching will begin a two-week study of Bradbury and his Fahrenheit 451. Before that, during the coming weekend, their assignment is to read his “The Foghorn” and “The Crowd.”

Last week I had occasion to quote one of Bradbury’s famous bits of life advice — of which there are many — to one of those students. It was his line about leaping off cliffs and then building your wings on the way down. Afterward, I got curious about the provenance of this quote, and my Internet search for its source or sources eventually turned up an excellent 23-year-old interview with Bradbury in South Carolina’s Spartanburg Herald-Journal, which ran it from the New York Times news service; it is presently readable thanks to Google’s news archive.

The title is “Learning is solitary pursuit for Bradbury.” The journalist is Luaine Lee. The date is October 17, 1990. And the interview shows Bradbury offering some really lovely articulations of ideas, insights, and anecdotes (many of them familiar but all of them neverendingly fascinating) from his personal mythic journey.

Last of the Titans: A Note on the Passing of Ray Harryhausen (and Forrest Ackerman and Ray Bradbury)

EDITOR’S NOTE: With this post we welcome award-winning writer, editor, filmmaker, composer, and artist Jason V. Brock to the Teem. Jason’s work has been published in Butcher Knives & Body Counts, Simulacrum and Other Possible Realities, Fungi, Fangoria, S. T. Joshi’s Black Wings series, and elsewhere. He was Art Director/Managing Editor for Dark Discoveries magazine for more than four years, and he edits the biannual pro digest [NAMEL3SS], dedicated to “the macabre, esoteric and intellectual,” which can be found on Twitter at @NamelessMag and on the Interwebs at www.NamelessMag.com. He and his wife, Sunni, also run Cycatrix Press.

As a filmmaker Jason’s work includes the documentaries Charles Beaumont: The Short Life of Twilight Zone’s Magic Man, The Ackermonster Chronicles!, and Image, Reflection, Shadow: Artists of the Fantastic. He is the primary composer and instrumentalist/singer for his band, ChiaroscurO. Jason loves his wife, their family of reptiles/amphibians, travel, and vegan/vegetarianism. He is active on social sites such as Facebook and Twitter (@JaSunni_JasonVB) and at his and Sunni’s personal website/blog, www.JaSunni.com.

Jason will contribute an occasional column titled “Monstrous Singularities.” For this inaugural installment, he offers an elegiac reflection on the passing of three authentic titans of fantasy, horror, and science fiction whose work literally helped to define major aspects of popular culture and collective psychology during the twentieth century.

* * *

Ray Harryhausen, 1920-2013

They were present at the beginning… and we are witness to their end.

Endings, in many ways, are entrances into self-realization — whether a portal into some altered state of mind, a window into collective insight, or even a chance for some final and comforting acceptance. Endings signify not only change, but also, often, transcendence, either metaphorically or literally, and on occasion simultaneously. Be it a lonely struggle that reaches a sad (even tragic) conclusion, or perhaps the unexpected outcome of a traumatic situation, or the shared exhilaration of justice served, endings are always transitional, even transformational, in ways that beginnings cannot be. Endings are the true headstones by which we collectively measure and define history. They are markers of conclusiveness — more so than births or the start of a new venture, which can be shrouded in secrecy, obscured by the fog of antiquity, or both. Thus, they are uniquely able to serve as touchstones for what has been bequeathed to the past (what cannot be again) and what is yet to be accomplished (and is therefore allotted to the future).

In May of 2013, the 92-year-old stop-motion animation film pioneer and artistic genius Ray Harryhausen, perhaps best known for his creation of the special visual effects for Jason and the Argonauts and Clash of the Titans, passed away. His ending completes, in a sense, a circle of loss for the world; with the transitioning of Harryhausen away from the realm of the living and into the annals of time, a triumvirate of giants has now vanished from the Earth, a troika destined to become even more powerful in voice, authority, and veneration over time. This amplification will undoubtedly be quite profound in the immediately foreseeable future, as people who are not yet aware of them, or who may have forgotten the seismic impact of their works and personalities, discover or rediscover their greatness and celebrate it even more, perchance, than those who instantly recognized it and mourned their loss to humanity and culture.

Validating Ray Bradbury: Climate change and high temps linked to violent behavior

Remember Ray Bradbury’s famous fascination with the idea that hot weather spurs an increase in assaults and other violent behavior? This was the basic premise behind his widely reprinted 1954 short story “Touched with Fire,” in which two retired insurance salesmen try to prevent a murder. In a key passage, one of them shares his thoughts on the relationship between heat and violence:

More murders are committed at ninety-two degrees Fahrenheit than any other temperature. Over one hundred, it’s too hot to move. Under ninety, cool enough to survive. But right at ninety-two degrees lies the apex of irritability, everything is itches and hair and sweat and cooked pork. The brain becomes a rat rushing around a red-hot maze. The least thing, a word, a look, a sound, the drop of a hair and — irritable murder. Irritable murder, there’s a pretty and terrifying phrase for you.

Notably, Bradbury adapted this story twice for television, once for Alfred Hitchcock Presents as the 1956 episode “Shopping for Death” and then more than thirty years later for his own Ray Bradbury Theater as the 1990 episode “Touched with Fire.” He also worked the same idea about heat and violence into his screen treatment for the minor 1953 science fiction classic It Came from Outer Space. That treatment was thoroughly reworked by screenwriter Harry Essex, who received the actual screenplay credit, but the finished film retained much that is distinctly Bradburyan, including a detailed statement of the 92-degrees thesis, placed in the mouth of a small-town American sheriff confronting an alien invasion. (Note that you can hear an audio clip of this dialogue at the beginning of Siouxsie & the Banshees’ 1986 song “92 Degrees.”)

Now comes a new study, conducted by several U.S. scientists, that appears to offer preliminary “official” vindication for this idea that so fascinated Bradbury when he encountered it somewhere or other during the early decades of his long and fertile career:

Bring on the cool weather — climate change is predicted to cause extreme weather, more intense storms, more frequent floods and droughts, but could it also cause us to be more violent with one another? A new study from scientists in the US controversially draws a link between increased rates of domestic violence, assault and other violent crimes and a warming climate.

That conflict could be a major result of global warming has long been accepted. As climate change makes vulnerable parts of the world more susceptible to weather-related problems, people move from an afflicted region to neighbouring areas, bringing them into conflict with the existing populations. That pattern has been evident around the world, and experts have even posited that conflicts such as Darfur should be regarded as climate-related. But the authors of the study, published in the peer-reviewed journal Science, have departed from such examples to look closely at patterns of violence in Brazil, China, Germany and the US.

The authors suggest that even a small increase in average temperatures or unusual weather can spark violent behaviour. They found an increase in reports of domestic violence in India and Australia at times of drought; land invasions in Brazil linked to poor weather; and more controversially, a rise in the number of assaults and murders in the US and Tanzania.

. . . The underlying reasons could run from increased economic hardship as harvests fail or droughts bite, to the physiological effects of hot weather.

— Fiona Harvey, “Climate change linked to violent behavior,” The Guardian, August 2, 2013

To illustrate this study, here’s that episode of Alfred Hitchcock Presents, “Shopping for Death”:

You can also watch the (alas, decidedly inferior) adaptation of the same story for Ray Bradbury Theater online.

Silence, solitude, and self-discovery in an age of mass distraction

“[T]he internet seizes our attention only to scatter it. We are immersed because there’s a constant barrage of stimuli coming at us and we seem to be very much seduced by that kind of constantly changing patterns of visual and auditorial stimuli. When we become immersed in our gadgets, we are immersed in a series of distractions rather than a sustained, focused type of thinking … There are messages coming at us through email, instant messenger, SMS, tweets etc. We are distracted by everything on the page, the various windows, the many applications running. You have to see the entire picture of how we are being stimulated. If you compare that to the placidity of a printed page, it doesn’t take long to notice that the experience of taking information from a printed page is not only different but almost the opposite from taking in information from a network-connected screen. With a page, you are shielded from distraction. We underestimate how the page encourages focused thinking — which I don’t think is normal for human beings — whereas the screen indulges our desire to be constantly distracted.”

— “Information and Contemplative Thought: We Turn Ourselves into Media Creations,” Interview with Nicholas Carr, The European, January 31, 2012

“Has it really come to this? In barely one generation we’ve moved from exulting in the time-saving devices that have so expanded our lives to trying to get away from them — often in order to make more time. The more ways we have to connect, the more many of us seem desperate to unplug. Like teenagers, we appear to have gone from knowing nothing about the world to knowing too much all but overnight. Internet rescue camps in South Korea and China try to save kids addicted to the screen. Writer friends of mine pay good money to get the Freedom software that enables them to disable (for up to eight hours) the very Internet connections that seemed so emancipating not long ago. Even Intel (of all companies) experimented in 2007 with conferring four uninterrupted hours of quiet time every Tuesday morning on 300 engineers and managers … [T]he average American spends at least eight and a half hours a day in front of a screen … The average American teenager sends or receives 75 text messages a day … We have more and more ways to communicate, as Thoreau noted, but less and less to say … The central paradox of the machines that have made our lives so much brighter, quicker, longer and healthier is that they cannot teach us how to make the best use of them; the information revolution came without an instruction manual.”

— Pico Iyer, “The Joy of Quiet,” The New York Times, December 29, 2011

“I am encouraged by services such as Instapaper, Readability or Freedom — applications that are designed to make us more attentive when using the internet. It is a good sign because it shows that some people are concerned about this and sense that they are no longer in control of their attention. Of course there’s an irony in looking for solutions in the same technology that keeps us distracted.”

— Carr, “Information and Contemplative Thought”

Recommended Reading 30

This week’s (exceptionally long and varied) offering of intellectual enrichment includes: an argument that the likely death of economic growth is the underlying theme of the current U.S. presidential election; thoughts on the rise of a real-life dystopia of universal algorithmic automation; an account of how the founder of TED became disgusted with the direction of the iconic ideas conference and created a new one to fulfill his original vision; reflections by a prominent physicist on science, religion, the Higgs boson, and the cultural politics of scientific research; a new examination of the definition, meaning, and cultural place of “pseudoscience”; a deeply absorbing warning about the possible death of the liberal arts, enhanced by the author’s first-person account of his own undergraduate liberal arts education at the University of Chicago; thoughts on the humanizing effects of deep literary reading in an age of proliferating psychopathy; a question/warning about the possible fate of literature as a central cultural-historical force in the grip of a publishing environment convulsed by epochal changes; an interview with Eric Kandel on the convergence of art, biology, and psychology; a new report from a psychiatric research project on the demonstrable link between creativity and mental illness; a career-spanning look at the works and themes of Ray Bradbury; and a spirited (pun intended) statement of the supreme value of supernatural and metaphysical literature from Michael Dirda.

When humans fuse with apps, what will happen to the soul?

Beware the coming fusion of humans — you, me, all of us — with our smartphones and their array of apps for everything from finding directions to buying groceries to making ethical decisions. And make no mistake: this fusion is indeed coming. Or rather, it’s already here in nascent form. Just look around yourself and notice the people sitting and standing in any public place — stores, restaurants, movie theaters, sidewalks, streets, roads — and consulting their handheld digi-daemons about where to go and how to get there, what to do and when and how to do it, whom to call and what to say. In literally every aspect of life, we increasingly get by with more than just a little help from our handheld friends.

On living well in Ray Bradbury’s dystopia: Notes toward a monastic response

Morris Berman may not have been the first person to offer simultaneous commentary on American culture and Fahrenheit 451 by observing that the former has basically transformed itself into the dystopian society depicted by the latter. Many people have noted in the decades since Fahrenheit was first published in 1953 that things have been moving eerily and strikingly in the direction Bradbury foresaw (or rather, the direction he tried to forestall; “I wasn’t trying to predict the future,” he famously said in a 2003 interview. “I was trying to prevent it.”) But it was Berman who most forcefully impressed this line of thought on me when he laid it out in The Twilight of American Culture:

In 1953, Ray Bradbury published Fahrenheit 451 — later made into a movie by François Truffaut — which depicts a future society in which intelligence has largely collapsed and the reading of books is forbidden by law. People sit around interacting with screens (referred to as “the family”) and taking tranquilizers. Today, nearly five decades later, isn’t this largely the point at which we have arrived? Do not the data [on the collapse of American intelligence] suggest that most of our neighbors are, in fact, the mindless automatons depicted in Truffaut’s film? True, the story does contain a class of “book people” who hide in the forest and memorize the classics, to pass on to future generations — and this vignette does, in fact, provide a clue as to what just might enable our civilization to eventually recover — but the majority of citizens on the eve of the twenty-first century watch an average of four hours of TV a day, pop Prozac and its derivatives like candy, and perhaps read a Danielle Steel novel once a year.

. . . [T]he society depicted in Fahrenheit 451 has banned books and immerses itself instead in video entertainment, a kind of “electronic Zen,” in which history has been forgotten and only the present moment counts . . . [The novel] is extraordinarily prescient. Leaving aside the issue of direct censorship of books — rendered unnecessary by McWorld, as it turns out, because most people don’t read anymore — most of the features of this futuristic society are virtually upon us, or perhaps no more than twenty years away. [1]
