Blog Archives

The Sad Failure of ‘Fahrenheit 451’ to Prevent the Future

Teeming Brain readers are familiar with my longtime focus on Fahrenheit 451 and my abiding sense that we’re currently caught up in a real-world version of its dystopian vision. This is not, of course, an opinion peculiar to me. Many others have held it, too, including, to an extent, Bradbury himself. I know that some of you, my readers, share it as well.

A couple of weeks ago, a writer for the pop culture analysis website Acculturated publicly joined the fold:

Ray Bradbury often said that he wrote science fiction not to predict the future but to prevent it. On this score, Fahrenheit 451 seems to have failed. The free speech wars on college campuses, the siloing effect of technology, the intolerance of diverse political opinions, and the virtual cocoon provided by perpetual entertainment all suggest that Bradbury anticipated the future with an accuracy unparalleled elsewhere in science fiction literature.

It’s a strange irony that, in the age of the Internet, which was supposed to encourage more transparency and debate, the open exchange of ideas is under threat. This was pointed out by another famous science fiction writer, Michael Crichton. “In the information society,” says Ian Malcolm in Jurassic Park, “no one thinks. We expected to banish paper, but we actually banished thought.” Bradbury saw this coming many decades earlier, and he understood why. Exposure to new ideas is uncomfortable and potentially dangerous. Staying safe, comfortable, and equal requires that everyone think identically. Liberal learning, the crucible that forms the individual, is anathema to group identity and cannot be tolerated. If you disagree, you’re morally suspect.

Which is why we need Bradbury’s message today more than ever. In a coda to the 1979 printing of Fahrenheit 451, Bradbury wrote: “There is more than one way to burn a book. And the world is full of people running about with lit matches.”

Full Text: “Ray Bradbury Wrote ‘Fahrenheit 451’ to Prevent a Dystopia. Instead, He Predicted One”

(If you click through to read the full text, be aware that the first paragraph of the piece presents a slightly inaccurate potted history of Bradbury’s career trajectory that implies he only rose to literary prominence with the publication of F451 in 1953. In fact, some of his previous books and stories, including, especially, 1950’s The Martian Chronicles, had already brought him considerable attention and acclaim.)

For more on the same theme, see my previous posts “On living well in Ray Bradbury’s dystopia: Notes toward a monastic response” and “Facebook, Fahrenheit 451, and the crossing of a cultural threshold,” as well as the Strange Horizons essay “The Failure of Fahrenheit 451.”

For thoughts from the author himself, see the 2007 LA Weekly piece “Ray Bradbury: Fahrenheit 451 Misinterpreted,” featuring Bradbury’s comments on the reality of F451-like trends in contemporary society. (However, Bradbury’s comments in that article/interview should be read in tandem with this context-creating response from his biographer, Sam Weller.) Also see Bradbury’s interviews for A.V. Club and the Peoria Journal Star for more observations from him about the encroaching threat of his novel’s realization in the world around us. And see especially his 1998 interview for Wired, titled “Bradbury’s Tomorrowland,” in which he said the following:

Almost everything in Fahrenheit 451 has come about, one way or the other — the influence of television, the rise of local TV news, the neglect of education. As a result, one area of our society is brainless. But I utilized those things in the novel because I was trying to prevent a future, not predict one.

Our smartphone apocalypse, animated by Steve Cutts

This remarkable animation comes from the hand (or computer) of illustrator and animator Steve Cutts, famed for such things as 2012’s Man, which packs an unbelievable punch. So does the one I’ve chosen to post here. Cutts created it for last year’s hit song “Are You Lost in the World Like Me?” by Moby and The Void Pacific Choir. But I personally like this slight repurposing much better, where the musical accompaniment is changed to French composer Yann Tiersen’s “Comptine d’un autre été, l’après-midi” (best known for being featured in the soundtrack for the 2001 French film Amélie).

The story told by the visuals, and also by the piercingly beautiful and sad musical accompaniment, can stand without comment here, as Teeming Brain readers are well aware of my deep disturbance and unhappiness at the digital dystopia that has emerged in the age of the smartphone. I consider Cutts something of a genius, both for his choice of animation style and for his devastating accuracy in calling out the dark and despairing heart of this cultural dead end in fairly visionary fashion. And no, the fact that his creation of this animation, and my sharing of it here, and your viewing of it, is all facilitated by the existence of networked computers doesn’t invalidate the message with a fatal irony. We could probably do better, culturally and humanly speaking, in our uses of these technologies. But instead we’re apparently inclined to give way, en masse, to our lowest impulses, resulting in a kind of digital Dante’s Inferno whose factual reality isn’t really all that far from the only slightly exaggerated version presented by Cutts.

A grateful acknowledgment goes out to Jesús Olmo, who introduced me to Cutts by sending me a link to Man last month.

Our Craving for Apocalypse: ‘Dispatches from the Ruins’ (short video)

This brief video essay on the source of our collective craving for “the awful futures of apocalyptic fiction” is skillfully executed and thought-provoking, and a worthwhile investment of five reflective minutes. Here’s the description:

In the first two decades of the new millennium, stories of the post-apocalypse have permeated pop culture, from books such as Cormac McCarthy’s The Road (2006), Paolo Bacigalupi’s The Windup Girl (2009) and Emily St John Mandel’s Station Eleven (2014) to films and TV programmes such as The Walking Dead (2010-), the Hunger Games series (2012-15) and Mad Max: Fury Road (2015). While post-apocalyptic fictions of previous eras largely served as cautionary tales — against nuclear brinksmanship in On the Beach (1959) or weaponised biology in The Stand (1978) — today’s versions of these tales depict less alterable, more oblique and diffuse visions of our doom. So why can’t we seem to get enough of humanity’s unavoidable collapse and its bleak aftermath?

Dispatches from the Ruins reflects on what these stories — set among crumbling buildings, overgrown lots and barren wastelands — might be telling us about modern fears and fantasies. This Aeon original video is adapted from an Aeon essay by the US writer Frank Bures. Bures is also the author of The Geography of Madness (2016), a book about cultural syndromes across the world. His work has been included in the Best American Travel Writing and appeared in Harper’s, Lapham’s Quarterly and the Washington Post Magazine, among others.


Orwell Meets Frankenstein: The Internet as a Monster of Mass Surveillance and Social Control

The following paragraphs are from a talk delivered by Pinboard founder Maciej Cegłowski at the recent Emerging Technologies for the Enterprise conference in Philadelphia. Citing as Exhibit A the colossal train wreck that was the 2016 American presidential election, Cegłowski explains how, in the current version of the Internet that has emerged over the past decade-plus, we have collectively created a technology that is perfectly calibrated for undermining Western democratic societies and ideals.

But as incisive as his analysis is, I seriously doubt that his (equally incisive) proposed solutions, described later in the piece, will ever be implemented to any meaningful extent. I mean, if we’re going to employ the explicitly Frankensteinian metaphor of “building a monster,” then it’s important to bear in mind that Victor Frankenstein and his wretched creation did not find their way to anything resembling a happy ending. (And note that Cegłowski himself acknowledges as much at the end of his piece when he closes his discussion of proposed solutions by asserting that “even though we’re likely to fail, all we can do is try.”)

This year especially there’s an uncomfortable feeling in the tech industry that we did something wrong, that in following our credo of “move fast and break things,” some of what we knocked down were the load-bearing walls of our democracy. . . .

A question few are asking is whether the tools of mass surveillance and social control we spent the last decade building could have had anything to do with the debacle of the 2017 [sic] election, or whether destroying local journalism and making national journalism so dependent on our platforms was, in retrospect, a good idea. . . .

We built the commercial internet by mastering techniques of persuasion and surveillance that we’ve extended to billions of people, including essentially the entire population of the Western democracies. But admitting that this tool of social control might be conducive to authoritarianism is not something we’re ready to face. After all, we’re good people. We like freedom. How could we have built tools that subvert it? . . .

The economic basis of the Internet is surveillance. Every interaction with a computing device leaves a data trail, and whole industries exist to consume this data. Unlike dystopian visions from the past, this surveillance is not just being conducted by governments or faceless corporations. Instead, it’s the work of a small number of sympathetic tech companies with likable founders, whose real dream is to build robots and Mars rockets and do cool things that make the world better. Surveillance just pays the bills. . . .

Orwell imagined a world in which the state could shamelessly rewrite the past. The Internet has taught us that people are happy to do this work themselves, provided they have their peer group with them, and a common enemy to unite against. They will happily construct alternative realities for themselves, and adjust them as necessary to fit the changing facts . . . .

A lot of what we call “disruption” in the tech industry has just been killing flawed but established institutions, and mining them for parts. When we do this, we make a dangerous assumption about our ability to undo our own bad decisions, or the time span required to build institutions that match the needs of new realities.

Right now, a small caste of programmers is in charge of the surveillance economy, and has broad latitude to change it. But this situation will not last for long. The kinds of black-box machine learning that have been so successful in the age of mass surveillance are going to become commoditized and will no longer require skilled artisans to deploy. . . .

Unless something happens to mobilize the tech workforce, or unless the advertising bubble finally bursts, we can expect the weird, topsy-turvy status quo of 2017 to solidify into the new reality.

FULL TEXT: “Build a Better Monster: Morality, Machine Learning, and Mass Surveillance”

Big Data, Artificial Intelligence, and Dehumanization: Surrendering to the Death of Democracy


Greetings, Teeming Brainers. I’m just peeking in from the digital wings, amid much ongoing blog silence, to observe that many of the issues and developments — sociocultural, technological, and more — that I began furiously tracking here way back in 2006 are continuing to head in pretty much the same direction. A case in point is provided by the alarming information, presented in a frankly alarmed tone, that appears in this new piece from Scientific American (originally published in SA’s German-language sister publication, Spektrum der Wissenschaft):

Everything started quite harmlessly. Search engines and recommendation platforms began to offer us personalised suggestions for products and services. This information is based on personal and meta-data that has been gathered from previous searches, purchases and mobility behaviour, as well as social interactions. While officially, the identity of the user is protected, it can, in practice, be inferred quite easily. Today, algorithms know pretty well what we do, what we think and how we feel — possibly even better than our friends and family or even ourselves. Often the recommendations we are offered fit so well that the resulting decisions feel as if they were our own, even though they are actually not our decisions. In fact, we are being remotely controlled ever more successfully in this manner. The more is known about us, the less likely our choices are to be free and not predetermined by others.

But it won’t stop there. Some software platforms are moving towards “persuasive computing.” In the future, using sophisticated manipulation technologies, these platforms will be able to steer us through entire courses of action, be it for the execution of complex work processes or to generate free content for Internet platforms, from which corporations earn billions. The trend goes from programming computers to programming people. . . .

[I]t can be said that we are now at a crossroads. Big data, artificial intelligence, cybernetics and behavioral economics are shaping our society — for better or worse. If such widespread technologies are not compatible with our society’s core values, sooner or later they will cause extensive damage. They could lead to an automated society with totalitarian features. In the worst case, a centralized artificial intelligence would control what we know, what we think and how we act. We are at the historic moment, where we have to decide on the right path — a path that allows us all to benefit from the digital revolution.

Oh, and for a concrete illustration of all the above, check this out:

How would behavioural and social control impact our lives? The concept of a Citizen Score, which is now being implemented in China, gives an idea. There, all citizens are rated on a one-dimensional ranking scale. Everything they do gives plus or minus points. This is not only aimed at mass surveillance. The score depends on an individual’s clicks on the Internet and their politically-correct conduct or not, and it determines their credit terms, their access to certain jobs, and travel visas. Therefore, the Citizen Score is about behavioural and social control. Even the behaviour of friends and acquaintances affects this score, i.e. the principle of clan liability is also applied: everyone becomes both a guardian of virtue and a kind of snooping informant, at the same time; unorthodox thinkers are isolated. Were similar principles to spread in democratic countries, it would be ultimately irrelevant whether it was the state or influential companies that set the rules. In both cases, the pillars of democracy would be directly threatened.

FULL ARTICLE: “Will Democracy Survive Big Data and Artificial Intelligence?”

Of course, none of this is real news to anybody who has been paying attention. It’s just something that people like me, and maybe like you, find troubling enough to highlight and comment on. And maybe, in the end, Cipher from The Matrix will turn out to have been right: Maybe ignorance really is bliss. Because from where I’m sitting, there doesn’t appear to be anything one can do to stop this steamrollering, metastasizing, runaway train-like dystopian trend. Talking about it is just that: talk. Which is one reason why I’ve lost a portion of the will that originally kept me blogging here for so many years. You can only play the role of Cassandra for so long before the intrinsic attraction begins to dissipate.

Short Film: ‘2084’

Remember: You must conform.

Better yet: Doughnut thing. (Watch for explanation.)

Utopia, dystopia, and the eternal present of Amish time


Traditional Amish Buggy. By Ad Meskens (Own work) [CC BY-SA 3.0 (http://creativecommons.org/licenses/by-sa/3.0) or GFDL (http://www.gnu.org/copyleft/fdl.html)], via Wikimedia Commons

When I was a kid, the movie Witness gave me my first exposure to the Amish. Much later, in the first decade of the aughts, I lived for seven years right in the heart of Missouri Amish country, where horse-drawn buggies on the shoulder of the road were a frequent sight, and where I regularly rubbed shoulders with Amish people in stores, at garage sales, and elsewhere. Between those two extremes, I took an undergraduate sociology course at Mizzou titled simply “The Old Order Amish,” taught by a highly respected professor who had himself grown up in an Old Order Mennonite community (and who, as I just now discovered, died only two months ago). All of that amplifies my personal interest in this brief and thoughtful reflection by University of Wyoming English professor Arielle Zibrak on the possible meaning and lessons of “Amish time” for a 21st-century technological society that has become obsessed with future visions and intimations of collapse and dystopia:

Wendell Berry wrote that American society’s inability to see the Amish for what they are is indicative of the most basic flaws of the American progress narrative. I think we’re beginning to see the frayed edges of that narrative’s unraveling. While the future used to appear to us as Marty McFly’s hoverboard, robo cops, and casual space travel, it now seems more frequently to come in the form of close-knit roving communities that communicate via flare and cook game over open fires, e.g. McCarthy’s The Road or Frank Darabont’s “The Walking Dead.”

We usually cast these fictional futures as dystopias. But if Margaret Atwood is right — and she should know — “within every dystopia there’s a little utopia.” And I can’t help but wonder if, as our vision of the future continues to shift, our view of the Amish will shift with it. For now, the best I can do is to try to learn from the Amish.

These days, after another move, I’m living out west, where my little sedan is itself a buggy parked alongside the giant pickups at the superstore. I can’t be around the Amish anymore in the sense of space. But I can try to be closer to where they are in the sense of time, which is neither the past nor precisely the future (even if there’s a zombie apocalypse or the hipsters keep defecting to dairy farms and haberdasheries) but is squarely inside of the mystery of the present.

MORE: “On Amish Time”

Dehumanized in a dark age



“Fire of Troy” by Kerstiaen de Keuninck (Coninck), 17th cent. [Public domain], via Wikimedia Commons

NOTE: This post was originally published in January 2007 in a different form. Owing to various circumstances — including the publication just yesterday of a post titled “Collective Brainwashing & Modern Concentration Camps” over at Daily Grail, which calls out the below-transcribed portion of My Dinner with Andre — now seems like a good time to re-present it in a slightly revised and enhanced form.

* * *

One of the most nightmarish things about a dark age is the degradation it entails for life’s overall tone, not least in the dehumanization that occurs when a people’s intellectual, emotional, moral, spiritual, political, social, and cultural life in general is reduced to a ghastly level of brutishness and ignorance. As is now plainly evident all around us in the industrialized world of present-day info-technocracy, this coarsening of life can occur even in circumstances of relative material prosperity. It doesn’t always have to be a dark age like the one that gripped Europe in the aftermath of Rome’s fall, when starvation and plague were rampant and most people barely scraped by at a miserable subsistence level. A dark age can unfold and exist right in the middle of outward conditions that may appear enlightened to those who don’t look too closely or deeply.

Sometimes it’s oddly comforting to dwell on the words of people who have seen today’s dark age of dehumanization unfolding. When it feels like the world is full of robots instead of people, or when it begins to feel like we really are living on the planet of the apes (as Robert Anton Wilson liked to put it), it can be a powerfully affirming experience to be reminded that other people have observed the same thing.

With this in mind, here are three of my own favorite articulations of these things, which, based on my own experience, I recommend you ingest, digest, memorize, and keep mentally handy for reciting to yourself on a rainy day. There are no solutions offered here. There’s just the satisfaction of being confronted by grim realities and looking them full in the face.

“A web of flesh spun over a void”


“The Madhouse” by Francisco Goya [Public domain], via Wikimedia Commons

So brilliant: an implicitly ironic but outwardly straight-faced reading of the DSM-5 as a dystopian horror novel, complete with a quasi-Ligottian assessment of the book’s narrative voice and view of humanity.

Great dystopia isn’t so much fantasy as a kind of estrangement or dislocation from the present; the ability to stand outside time and see the situation in its full hideousness. The dystopian novel doesn’t necessarily have to be a novel. . . . Something has gone terribly wrong in the world; we are living the wrong life, a life without any real fulfillment. The newly published DSM-5 is a classic dystopian novel in this mold.

Here, we have an entire book, something that purports to be a kind of encyclopedia of madness, a Library of Babel for the mind, containing everything that can possibly be wrong with a human being. . . . DSM-5 arranges its various strains of madness solely in terms of the behaviors exhibited. This is a recurring theme in the novel, while any consideration of the mind itself is entirely absent. . . . The idea emerges that every person’s illness is somehow their own fault, that it comes from nowhere but themselves: their genes, their addictions, and their inherent human insufficiency. We enter a strange shadow-world where for someone to engage in prostitution isn’t the result of intersecting environmental factors (gender relations, economic class, family and social relationships) but a symptom of “conduct disorder,” along with “lying, truancy, [and] running away.” A mad person is like a faulty machine. The pseudo-objective gaze only sees what they do, rather than what they think or how they feel. A person who shits on the kitchen floor because it gives them erotic pleasure and a person who shits on the kitchen floor to ward off the demons living in the cupboard are both shunted into the diagnostic category of encopresis. It’s not just that their thought-processes don’t matter, it’s as if they don’t exist. The human being is a web of flesh spun over a void.

. . . The word “disorder” occurs so many times that it almost detaches itself from any real signification, so that the implied existence of an ordered state against which a disorder can be measured is almost forgotten. Throughout the novel, this ordered normality never appears except as an inference; it is the object of a subdued, hopeless yearning. With normality as a negatively defined and nebulously perfect ideal, anything and everything can then be condemned as a deviation from it. . . . If there is a normality here, it’s a state of near-catatonia. DSM-5 seems to have no definition of happiness other than the absence of suffering. The normal individual in this book is tranquilized and bovine-eyed, mutely accepting everything in a sometimes painful world without ever feeling much in the way of anything about it. The vast absurd excesses of passion that form the raw matter of art, literature, love, and humanity are too distressing; it’s easier to stop being human altogether, to simply plod on as a heaped collection of diagnoses with a body vaguely attached.

. . . For all the subtlety of its characterization, the book doesn’t just provide a chilling psychological portrait, it conjures up an entire world. The clue is in the name: On some level we’re to imagine that the American Psychiatric Association is a body with real powers, that the “Diagnostic and Statistical Manual” is something that might actually be used, and that its caricature of our inner lives could have serious consequences. Sections like those on the personality disorders offer a terrifying glimpse of a futuristic system of repression, one in which deviance isn’t furiously stamped out like it is in Orwell’s unsubtle Oceania, but pathologized instead. Here there’s no need for any rats, and the diagnostician can honestly believe she’s doing the right thing; it’s all in the name of restoring the sick to health. DSM-5 describes a nightmare society in which human beings are individuated, sick, and alone. For much of the novel, what the narrator of this story is describing is its own solitude, its own inability to appreciate other people, and its own overpowering desire for death — but the real horror lies in the world that could produce such a voice.

MORE: “Book of Lamentations”

For more on the DSM-5 and the controversy it has elicited, see this.

Teeming Links – July 25, 2014


What happens in a world where war has become perpetual, live-reported popcorn entertainment? Answer: we’re as far as we ever were from understanding anything about it. “Far from offering insights into the mysteries of history and politics, these spectacles give us a sense that we are further away than ever from understanding their causes, their implications, and their consequences. Combat makes for a disappointing program — we approach it with great expectations, prepared to encounter essential truths of human existence, but we leave empty-handed.”

Novelist William Boyd reflects on how mortality shapes human existence: “I am convinced that what makes our species unique among the fauna of this small planet circling its insignificant star is that we know we are trapped in time, caught briefly between these two eternities of darkness, the prenatal darkness and the posthumous one.”

Philosopher and journalist Stephen Cave meditates on the reality, mystery, and meaning of death, from humans to flies: “Perhaps, as Tennyson believed, death’s relentless reaping should lead us to question the existence of some higher meaning — one above, beyond or external to us. But whoever thought there was such a thing anyway? Not the frogs and tadpoles. . . . Because life is so teeming with intentions and meanings, the death of each creature really is a catastrophe. But we must live with it anyway.”

Paul Kingsnorth, co-founder of the Dark Mountain Project and co-author of Uncivilisation: The Dark Mountain Manifesto, discusses his defeatist position on climate change and the liberation to be found in giving up hope.

Journalist Matt Stroud delves into the unbelievable life and death of Michael C. Ruppert: “After decades of struggle, the notorious doomsayer finally found fame and recognition. Then he shot himself.” (Also see my reflections, in a post published five years ago, on Ruppert’s startling ascent to mainstream fame via the movie Collapse.)

Historian and writer Rebecca Onion looks at how 1980s childhoods changed the way America thought about nuclear Armageddon, with an extended analysis of the role of the 1983 television movie The Day After, which utterly freaked out my 13-year-old self.

Jacob Silverman reflects on the dystopian plight of office drones in the digital tech age: “[They are] more gadgeted-out than ever, but still facing the same struggle for essential benefits, wages, and dignity that workers have for generations. . . . Such are the perverse rewards we reap when we permit tech culture to become our culture. The profits and power flow to the platform owners and their political sponsors. We get the surveillance, the data mining, the soaring inequality, and the canned pep talks from bosses who have been upsold on analytics software. Without Gchat, Twitter, and Facebook — the great release valves of workaday ennui — the roofs of metropolitan skyscrapers would surely be filled with pallid young faces, wondering about the quickest way down.”

Seriously? We’re now entertaining the possibility of robot caregivers? Sociologist and tech expert Zeynep Tufekci is right: this is how to fail the third machine age.

You’ve seen me mention my love of My Dinner with Andre many times here. That’s why I’m so pleased to call attention to this brand new interview from On Point with “The Inscrutable, Ubiquitous Wallace Shawn.” It’s well worth a listen, both for the way it offends common radio sensibilities (the whole thing gets off to a rocky start as the interviewer adopts a somewhat glib approach that apparently annoys Mr. Shawn) and for the depth of Shawn’s carefully expressed thoughts on everything from the heady joys of being a writer and articulating things you never knew were in your soul, to the changing nature of conversation in an age when everybody is perpetually interrupted by phone calls and text messages. There is also, of course, some discussion of his portrayal of Vizzini in The Princess Bride. (Oh, and also see the recent pieces on Shawn, and also Andre Gregory, and their new collaboration, in The Wall Street Journal, Vulture, and Salon.)

When I was a kid, my mother actually walked out of the theater during the heart-ripping scene in Indiana Jones and the Temple of Doom. Well, guess what? George Lucas and Steven Spielberg hate that movie’s notorious grimness and violence, too. Grantland unearths the history of why Temple of Doom turned out that way.


“Fire Head” image courtesy of Salvatore Vuono / FreeDigitalPhotos.net