Blog Archives

Teeming Links – March 15, 2019

In light of yesterday’s awful mosque attacks in New Zealand, I feel led to start with this excerpt from a 2003 PBS interview with Thich Nhat Hanh. After an extensive conversation about Buddhism, Christianity, mindfulness, and other such matters, and their relationship to gritty large-scale realities of war and violence, the exchange ends with this:

Q: What is so tantalizing about talking to you is the wonderful promise of your teachings at the personal level, and the frustration of not seeing how it can change the policies of big institutions, such as government.

A: It is the individual who can effectuate change. When I change, I can help produce change in you. As a journalist, you can help change many people. That’s the way things go. There’s no other way. Because you have the seed of understanding, compassion, and insight in you. What I say can water that seed, and the understanding and compassion are yours and not mine. You see? My compassion, my understanding can help your compassion and understanding to manifest. It’s not something that you can transfer.

Are you burned out on collapse? According to a recent article on “the hidden psychological toll of living through a time of fracture,” you’re not alone. As the writer astutely observes, “When reality itself has turned into something like a grotesque, bizarre dystopia, then just making contact with it is deeply psychologically stressful.”

Douglas Rushkoff has offered a brief and typically insightful reflection on the deep cause and possible cure for our culture of doom and collapse: the Internet is acid, and America is having a bad trip. (Seriously, his thesis is profound.)

Meanwhile, journalist and author Nick Bilton writes in Vanity Fair that “No One Is at the Controls” as “Facebook, Amazon, and Others Are Turning Life into a Horrific Bradbury Novel.” It occurs to me that his thesis — that the Internet now runs itself, that “nobody is behind the curtain” of our digital dystopia — resonates with the horrific discovery of the empty movie theater projection booth in Lamberto Bava’s Demons. The characters storm the booth after the horror movie they’ve been watching comes to life and fills the theater with raging, murderous demons. But their horror is compounded when they discover there’s no projectionist. In other words, nobody is responsible. Nobody is making the nightmare happen. The equipment all just runs on its own. As one of them fearfully observes in a line of dialogue freighted with cosmic nihilism, “Oh, God, then that means no one’s ever been here!” (Watch the scene.)

By contrast, this is quite lovely: Composer Jarbas Agnelli created music by using the positions of birds perched on electrical wires to represent notes. He then produced this short film about it. Also see the brief explanation and further background at The Daily Grail.
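For the technically curious, here is a minimal, purely illustrative sketch of the general idea in Python. Agnelli’s actual process isn’t detailed in the links above, so the bird positions and the wire-to-pitch mapping below are invented for illustration: each wire serves as a staff line assigned a pitch, and a bird’s horizontal position determines when its note sounds.

```python
# Purely illustrative sketch of "birds on wires as musical notes."
# The positions and the pitch mapping are invented for this example.

# Hypothetical data: (horizontal_position_seconds, wire_index) for each
# bird, as might be read off a photograph from left to right.
birds = [(0.0, 2), (0.5, 4), (0.75, 4), (1.5, 1), (2.0, 3), (2.25, 0)]

# Assign each wire a pitch (MIDI note numbers in a C-major pentatonic
# scale, chosen arbitrarily here).
wire_to_pitch = {0: 60, 1: 62, 2: 64, 3: 67, 4: 69}  # C4 D4 E4 G4 A4

# Sort the birds left to right and turn each one into a timed note event.
melody = [(t, wire_to_pitch[wire]) for t, wire in sorted(birds)]

for onset, pitch in melody:
    print(f"t = {onset:.2f}s -> MIDI note {pitch}")
```

From there, the resulting note events could be handed off to any MIDI library or software synthesizer.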

In his recent commencement address to graduates of the Bennington Writing Seminars at Vermont’s Bennington College, poet and author Garth Greenwell offered some riveting advice and wisdom on living the writer’s life: “To write a story or a poem or an essay is to make a claim about what we find beautiful, about what moves us, to reveal a vision of the world, which is always terrifying; to write seriously is to find ourselves pressed against not just our technical but our moral limits. . . . That intimate communication between writer and reader, that miracle of affective translation across distance and time, is the real life of literature; that’s what matters.” His words on the place of literary awards and sales figures are particularly astute: “The soul one pours into a novel or a collection of poems, the years of effort a book represents — what possible response from the world could be adequate recompense for that?”

This explains a lot: A secret brain trust of scientists and billionaires, unofficially headquartered in Silicon Valley, has embraced belief in UFOs as a new religious mode.

And then there’s this: “The British military is recruiting philosophers, psychologists and theologians to research new methods of psychological warfare and behavioural manipulation, leaked documents show.” Apparently the project comes with a communications campaign to help manage “reputational risks” for participating academic institutions. Quoth one Cambridge scholar interviewed for the linked Guardian piece, “Now I don’t want to be too academic about this, but it’s very striking that a programme designed to change people’s views and opinions for military purposes would spend some of its money changing people’s views and opinions, so that they wouldn’t object to changing people’s views and opinions. See what they did there?”

An essay at The American Scholar titled “The Sound of Evil” provides an interesting cinematic-cultural-sociological analysis of the avenues by which classical music in movies and television has become synonymous with villainy.

A free symposium titled “Detecting Pessimism: Thomas Ligotti and the Weird in an Age of Post-Truth” will be held this June at Manchester Metropolitan University’s 70 Oxford St. The announcement explains that “Ligotti is increasingly seen as one of the key literary horror and weird fiction writers of recent decades whose works present a unique, bleak and controversial portrayal of both human existence and society.” The symposium “will comprise of [sic] two panels with papers delivered by staff and students on Ligotti and the weird mode, and will include a keynote delivered by weird expert Professor Roger Luckhurst. They will explore the works, philosophy and influence of Ligotti within a diverse range of contexts, from philosophical nihilism and pessimism, weird fiction and horror to his impact on film and television.” (Tangential side note: About half the presenting scholars were involved in my Horror Literature through History encyclopedia.) Even if you, like me, will sadly be unable to attend, you can still read this piece containing brief interviews with some of the participants about their thoughts on Ligotti and his work.

While the rest of the US raves breathlessly on about AOC and Wells Fargo or whatever, I much prefer to slow down and savor a delicious interview with Whitley Strieber about his outlandish experiences and the way his career as a major and still-rising horror novelist was derailed when he became America’s most prominent paranormal lightning rod.

December saw the publication of Peter Bebergal’s Strange Frequencies: The Extraordinary Story of the Technological Quest for the Supernatural. Teeming Brain readers will recall that Peter was one of the panelists on the Teeming Brain podcast “Cosmic Horror vs. Sacred Terror.” His new book offers “a journey through the attempts artists, scientists, and tinkerers have made to imagine and communicate with the otherworldly using various technologies, from cameras to radiowaves.”

T. E. (Ted) Grau, who produced a handful of fine articles for The Teeming Brain a few years back, is presently on the final ballot for the Bram Stoker Award for his novel I Am the River. Publishers Weekly gave it a starred review, saying that “Grau’s poetic prose and stunning evocation of time and place, from the killing fields of Vietnam to the haunted alleyways of Bangkok, form a fever dream of copious bloodshed and many shades of gray.”

Speaking of horror, the crowd-funded documentary In Search of Darkness is in its final stages of production. I only learned about the project recently via a tweet from long-time Teeming Brain friend and fellow religion/horror adept John Morehead. Here’s the official description, followed by the official trailer. The description reads like a feast, while the trailer feels like a time warp to my misspent, VHS-saturated adolescence.

Featuring compelling critical takes and insider tales of the Hollywood filmmaking experience throughout the 1980s, In Search of Darkness will provide fans with a unique perspective on the decade that gave rise to some of the horror genre’s greatest icons, performers, directors and franchises that forever changed the landscape of modern cinema. Tracking major theatrical releases, obscure titles and straight-to-video gems, the incredible array of interviewees that have been assembled for ISOD will weigh in on a multitude of topics: from creative and budgetary challenges creatives faced throughout the decade to the creature suits and practical effects that reinvigorated the makeup effects industry during the era to the eye-popping stunts that made a generation of fans believe in the impossible. In Search of Darkness will also celebrate many of the atmospheric soundtracks released during that time, the resurgence of 3-D filmmaking, the cable TV revolution and the powerful marketing in video store aisles, the socio-political allegories infused throughout many notable films, and so much more.

Finally, a recent piece by Glenn Greenwald deserves to be read by everybody of all political persuasions: “NYT’s Exposé on the Lies About Burning Aid Trucks in Venezuela Shows How U.S. Government and Media Spread Pro-War Propaganda.” It presents an utterly damning account of collusion between the U.S. government and U.S. corporate media to foment Venezuelan regime change through brazen lies, thus perpetuating a long and sordid tradition in America’s international relations.

The man who invented the Internet didn’t foresee our Neuromancer/Black Mirror future

The following insights are excerpted from a brief but engaging NPR piece that traces the cultural arc from Vint Cerf (the “inventor of the Internet”) and his early naive optimism about this new technology, to William Gibson’s uncanny prescience in forecasting exactly where the Internet would really take us (to a corporate-controlled cyberdystopia with sharply curtailed human relationships), to Black Mirror creator Charlie Brooker’s ongoing exploration of the darkest corners of the whole thing:

Initially, Cerf was trying to create an Internet through which scientists and academics from all over the world could share data and research. Then, one day in 1988, Cerf says he went to a conference for commercial vendors where they were selling products for the Internet. “I just stood there thinking, ‘My God! Somebody thinks they’re going to make money out of the Internet.’ ” Cerf was surprised and happy. “I was a big proponent of that. My friends in the community thought I was nuts. ‘Why would you let the unwashed masses get access to the Internet?’ And I said, ‘Because I want everybody to take advantage of its capability.’ ”

Clearly, Cerf is an optimist. That is what allowed him to dream big. But, in retrospect, some of the decisions his team made seem hopelessly naive, especially for a bunch of geniuses. They made it possible to surf the Internet anonymously — unlike a telephone, you don’t have a unique number that announces who you are. We know how that turned out. People with less lofty ambitions than Cerf used that loophole for cybercrime, international espionage and online harassment.

Cerf admits all that dark stuff never crossed his mind. “And we have to cope with that — I mean, welcome to the real world,” he says. . . .

Somehow [William] Gibson was able to imagine the potential scale of it — all those computers connected together. . . . But, it isn’t just the Internet that Gibson saw coming. In Neuromancer, the Internet has become dominated by huge multinational corporations fighting off hackers. The main character is a washed-up criminal hacker who goes to work for an ex-military officer to regain his glory. And get this: The ex-military guy is deeply involved in cyber-espionage between the U.S. and Russia.

Gibson says he didn’t need to try a computer or see the Internet to imagine this future. “The first people to embrace a technology are the first to lose the ability to see it objectively,” he says. He says he’s more interested in how people behave around new technologies. He likes to tell a story about how TV changed New York City neighborhoods in the 1940s. “Fewer people sat out on the stoops at night and talked to their neighbors, and it was because everyone was inside watching television,” he says. “No one really noticed it at the time as a kind of epochal event, which I think it was” . . . .

Brooker has a certain amount of frustration with the leaders in tech. “It’s felt like tech companies have for years just put this stuff out there,” he says. “And they distance themselves from the effects of their product effectively by saying, ‘Oh, we’re just offering a service.’ ” Brooker sees each new technology more like an untested drug waiting to launch us on a very bad trip. Each episode of Black Mirror is like its own laboratory testing a technology that is already out, but pushing it by mixing in common human behaviors and desires.

You will be assimilated: Our future of tech-enhanced brains to keep up with AI

Here’s renowned neuroscientist Christof Koch explaining in a Wall Street Journal piece that our future will be a dystopian nightmare in which humans will necessarily become ever more completely fused, on a neurological level, with supersophisticated computer technologies. This will, he says, be a non-negotiable requirement if we want to keep up with the artificial intelligences that will be billions of times smarter than us, and that will otherwise utterly rule humanity and pose an existential threat to us in all kinds of ways that we, with our currently unenhanced meat brains, can hardly imagine.

Or actually, Koch speaks not grimly but enthusiastically of this future (and semi-present) scenario. He views the technological enhancement of the human brain for purposes of keeping pace with AI as an exciting thing. The negative gloss on it is mine. What a wonderful world, he avers. “Resistance is futile. You will be assimilated,” my own meat brain keeps hearing.

Whether you are among those who believe that the arrival of human-level AI signals the dawn of paradise, such as the technologist Ray Kurzweil, or the sunset of the age of humans, such as the prominent voices of the philosopher Nick Bostrom, the physicist Stephen Hawking and the entrepreneur Elon Musk, there is no question that AI will profoundly influence the fate of humanity.

There is one way to deal with this growing threat to our way of life. Instead of limiting further research into AI, we should turn it in an exciting new direction. To keep up with the machines we’re creating, we must move quickly to upgrade our own organic computing machines: We must create technologies to enhance the processing and learning capabilities of the human brain. . . .

Unlike, say, the speed of light, there are no known theoretical limits to intelligence. While our brain’s computational power is more or less fixed by evolution, computers are constantly growing in power and flexibility. This is made possible by a vast ecosystem of several hundred thousand hardware and software engineers building on each other’s freely shared advances and discoveries. How can the human species keep up? . . .

In the face of this relentless onslaught, we must actively shape our future to avoid dystopia. We need to enhance our cognitive capabilities by directly intervening in our nervous systems.

We are already taking steps in this direction. . . .

My hope is that someday, a person could visualize a concept — say, the U.S. Constitution. An implant in his visual cortex would read this image, wirelessly access the relevant online Wikipedia page and then write its content back into the visual cortex, so that he can read the webpage with his mind’s eye. All of this would happen at the speed of thought. Another implant could translate a vague thought into a precise and error-free piece of digital code, turning anyone into a programmer.

People could set their brains to keep their focus on a task for hours on end, or control the length and depth of their sleep at will.

Another exciting prospect is melding two or more brains into a single conscious mind by direct neuron-to-neuron links — similar to the corpus callosum, the bundle of two hundred million fibers that link the two cortical hemispheres of a person’s brain. This entity could call upon the memories and skills of its member brains, but would act as one “group” consciousness, with a single, integrated purpose to coordinate highly complex activities across many bodies.

These ideas are compatible with everything we know about the brain and the mind. Turning them from science fiction into science fact requires a crash program to design safe, inexpensive, reliable and long-lasting devices and procedures for manipulating brain processes inside their protective shell. It must be focused on the end-to-end enhancement of human capabilities. . . .

While the 20th century was the century of physics — think the atomic bomb, the laser and the transistor — this will be the century of the brain. In particular, it will be the century of the human brain — the most complex piece of highly excitable matter in the known universe. It is within our reach to enhance it, to reach for something immensely powerful we can barely discern.

Full article: “To Keep Up with AI, We’ll Need High-Tech Brains” (You may or may not encounter a paywall)

How Google replaced God

NYU marketing professor Scott Galloway, writing for Esquire:

Our brains are sophisticated enough to ask very complex questions but not sophisticated enough to answer them. Since Homo sapiens emerged from caves, we’ve relied on prayer to address that gap: We lift our gaze to the heavens, send up a question, and wait for a response from a more intelligent being. “Will my kid be all right?” “Who might attack us?”

As Western nations become wealthier, organized religion plays a smaller role in our lives. But the void between questions and answers remains, creating an opportunity. As more and more people become alienated from traditional religion, we look to Google as our immediate, all-knowing oracle of answers from trivial to profound. Google is our modern-day god. Google appeals to the brain, offering knowledge to everyone, regardless of background or education level. If you have a smartphone or an Internet connection, your prayers will always be answered: “Will my kid be all right?” “Symptoms and treatment of croup. . .” “Who might attack us?” “Nations with active nuclear-weapons programs . . .”

Think back on every fear, every hope, every desire you’ve confessed to Google’s search box and then ask yourself: Is there any entity you’ve trusted more with your secrets? Does anybody know you better than Google?

Full article: “Silicon Valley’s Tax-Avoiding, Job-Killing, Soul-Sucking Machine”

Image Credit: Kavinmecx (Own work) [CC BY-SA 4.0 (https://creativecommons.org/licenses/by-sa/4.0)], via Wikimedia Commons

Instagram and the memeification of human experience

Courtesy of The Guardian, here’s another way the Internet is making life better and more fulfilling for all of us (by which I mean worse and more soullessly unsatisfying for a great many of us):

Tourists have always taken photographs. Like graffiti, it’s a very human way of saying “I was here.” But in the pre-digital age, because of the expense of film as well as high shooting ratios, you were lucky if you ended up with one usable picture. Now “influencers” can take as many photographs as they need, photoshopping and filtering until they are able to post the perfect advertisement (for that indeed is what these images are). The centering of the self to such an extent is new too, and at the expense of knowledge, exploration and adventure.

When most travel photographs on Instagram begin to look like fashion editorials you have to wonder whether anyone is learning anything. And when people are taking idiotic risks such as hanging out of fast moving trains or proffering food to tempt wild animals into shot, all for the sake of a photo that isn’t even an original composition, you might start to think that we’re approaching the end times.

It all goes to show how ineffective the internet can be as a lens for human experience, especially within a capitalist system. You might think social media would diversify the range of images we see, yet the most popular users operate according to a strict schema that takes full advantage of the relevant algorithms (creative, fascinating accounts are still there, but said algorithms make them harder to find). And it’s not just travel – it’s interiors, fashion, weddings, food, children. Social media encourages the memeification of human experience. Instead of diversity we see homogeneity. It’s extremely boring.

Full article: “Instagrammers are sucking the life and soul out of travel”

Your smartphone is built to hijack and harvest your mind

At the beginning of each semester, when I deliver a mini-sermon about my complete ban on phones — and also, for almost all purposes, laptops — in my classroom, I tell my students the very thing that journalist Zat Rana gets at in a recent article for Quartz. A smartphone, or almost any cell phone, in your hand, on your desk, or even in your pocket as you’re trying to concentrate on other important things is a vampire demon powered by dystopian corporate overlords whose sole purpose is to suck your soul by siphoning away your attention and immersing you in a portable customized Matrix.

Or as Rana says, in less metaphorical language:

One of the biggest problems of our generation is that while the ability to manage our attention is becoming increasingly valuable, the world around us is being designed to steal away as much of it as possible. . . . Companies like Google and Facebook aren’t just creating products anymore. They’re building ecosystems. And the most effective way to monetize an ecosystem is to begin with engagement. It’s by designing their features to ensure that we give up as much of our attention as possible.

Full Text: “Technology is destroying the most important asset in your life”

Rana offers three pieces of sound advice for helping to reclaim your attention (which is the asset referred to in the title): mindfulness meditation, “ruthless single-tasking,” and regular periods of deliberate detachment from the digital world.

Interestingly, it looks like there’s a mini-wave of this type of awareness building in the mediasphere. Rana’s article for Quartz was published on October 2. Four days later The Guardian published a provocative and alarming piece with this teaser: “Google, Twitter and Facebook workers who helped make technology so addictive are disconnecting themselves from the internet. Paul Lewis reports on the Silicon Valley refuseniks alarmed by a race for human attention.” It’s a very long and in-depth article. Here’s a taste:

There is growing concern that as well as addicting users, technology is contributing toward so-called “continuous partial attention”, severely limiting people’s ability to focus, and possibly lowering IQ. One recent study showed that the mere presence of smartphones damages cognitive capacity — even when the device is turned off. “Everyone is distracted,” Rosenstein says. “All of the time.”

But those concerns are trivial compared with the devastating impact upon the political system that some of Rosenstein’s peers believe can be attributed to the rise of social media and the attention-based market that drives it. . . .

Tech companies can exploit such vulnerabilities to keep people hooked; manipulating, for example, when people receive “likes” for their posts, ensuring they arrive when an individual is likely to feel vulnerable, or in need of approval, or maybe just bored. And the very same techniques can be sold to the highest bidder. . . .

“The dynamics of the attention economy are structurally set up to undermine the human will,” [ex-Google strategist James Williams] says. “If politics is an expression of our human will, on individual and collective levels, then the attention economy is directly undermining the assumptions that democracy rests on.” If Apple, Facebook, Google, Twitter, Instagram and Snapchat are gradually chipping away at our ability to control our own minds, could there come a point, I ask, at which democracy no longer functions?

“Will we be able to recognise it, if and when it happens?” Williams replies. “And if we can’t, then how do we know it hasn’t happened already?”

Full Text: “‘Our minds can be hijacked’: The tech insiders who fear a smartphone dystopia”

In the same vein, Nicholas Carr (no stranger to The Teeming Brain’s pages) published a similarly aimed — and even a similarly titled — essay in the Weekend Review section of The Wall Street Journal on the very day the Guardian article appeared (October 6). “Research suggests that as the brain grows dependent on phone technology, the intellect weakens,” says the teaser. Here’s a representative passage from the essay itself:

Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object in the environment that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.” Media and communication devices, from telephones to TV sets, have always tapped into this instinct. Whether turned on or switched off, they promise an unending supply of information and experiences. By design, they grab and hold our attention in ways natural objects never could.

But even in the history of captivating media, the smartphone stands out. It’s an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what [Adrian] Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it’s part of the surroundings — which it always is. Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library, and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That’s what a smartphone represents to us. No wonder we can’t take our minds off it.

Full Text: “How Smartphones Hijack Our Minds”

At his blog, Carr noted that his essay and the Guardian article had appeared on the same day, under strikingly similar titles. Calling this a “telling coincidence,” he commented:

It’s been clear for some time that smartphones and social-media apps are powerful distraction machines. They routinely divide our attention. But the “hijack” metaphor — I took it from Adrian Ward’s article “Supernormal” — implies a phenomenon greater and more insidious than distraction. To hijack something is to seize control of it from its rightful owner. What’s up for grabs is your mind.

Perhaps the most astonishing thing about all of this is that John Carpenter warned us about it three decades ago, and not vaguely, but quite specifically and pointedly. The only difference is that the technology in his (quasi-)fictional presentation was television. Well, that, plus the fact that his evil overlords really were ill-intentioned, whereas ours may be in some cases as much victims of their own devices as we are. In any event:

There is a signal broadcast every second of every day through our television sets. Even when the set is turned off, the brain receives the input. . . . Our impulses are being redirected. We are living in an artificially induced state of consciousness that resembles sleep. . . . We have been lulled into a trance.

Art, creativity, and what Google doesn’t know

From an essay by Ed Finn, founding director of the Center for Science and the Imagination at Arizona State University:

We are all centaurs now, our aesthetics continuously enhanced by computation. Every photograph I take on my smartphone is silently improved by algorithms the second after I take it. Every document autocorrected, every digital file optimised. Musicians complain about the death of competence in the wake of Auto-Tune, just as they did in the wake of the synthesiser in the 1970s. It is difficult to think of a medium where creative practice has not been thoroughly transformed by computation and an attendant series of optimisations. . . .

Today, we experience art in collaboration with these algorithms. How can we disentangle the book critic, say, from the highly personalised algorithms managing her notes, communications, browsing history and filtered feeds on Facebook and Instagram? . . . .

The immediate creative consequence of this sea change is that we are building more technical competence into our tools. It is getting harder to take a really terrible digital photograph, and in correlation the average quality of photographs is rising. From automated essay critiques to algorithms that advise people on fashion errors and coordinating outfits, computation is changing aesthetics. When every art has its Auto-Tune, how will we distinguish great beauty from an increasingly perfect average? . . .

We are starting to perceive the world through computational filters, perhaps even organising our lives around the perfect selfie or defining our aesthetic worth around the endorsements of computationally mediated ‘friends’. . . .

Human creativity has always been a response to the immense strangeness of reality, and now its subject has evolved, as reality becomes increasingly codeterminate, and intermingled, with computation. If that statement seems extreme, consider the extent to which our fundamental perceptions of reality — from research in the physical sciences to finance to the little screens we constantly interject between ourselves and the world — have changed what it means to live, to feel, to know. As creators and appreciators of the arts, we would do well to remember all the things that Google does not know.

Full Text: “Art by Algorithm”


The Sad Failure of ‘Fahrenheit 451’ to Prevent the Future

Teeming Brain readers are familiar with my longtime focus on Fahrenheit 451 and my abiding sense that we’re currently caught up in a real-world version of its dystopian vision. This is not, of course, an opinion peculiar to me. Many others have held it, too, including, to an extent, Bradbury himself. I know that some of you, my readers, share it as well.

As of a couple of weeks ago, a writer for the pop culture analysis website Acculturated has publicly joined the fold:

Ray Bradbury often said that he wrote science fiction not to predict the future but to prevent it. On this score, Fahrenheit 451 seems to have failed. The free speech wars on college campuses, the siloing effect of technology, the intolerance of diverse political opinions, and the virtual cocoon provided by perpetual entertainment all suggest that Bradbury anticipated the future with an accuracy unparalleled elsewhere in science fiction literature.

It’s a strange irony that, in the age of the Internet, which was supposed to encourage more transparency and debate, the open exchange of ideas is under threat. This was pointed out by another famous science fiction writer, Michael Crichton. “In the information society,” says Ian Malcolm in Jurassic Park, “No one thinks. We expected to banish paper, but we actually banished thought.” Bradbury saw this coming many decades earlier, and he understood why. Exposure to new ideas is uncomfortable and potentially dangerous. Staying safe, comfortable, and equal requires that everyone think identically. Liberal learning, the crucible that forms the individual, is anathema to group identity and cannot be tolerated. If you disagree, you’re morally suspect.

Which is why we need Bradbury’s message today more than ever. In a coda to the 1979 printing of Fahrenheit 451, Bradbury wrote: “There is more than one way to burn a book. And the world is full of people running about with lit matches.”

Full Text: “Ray Bradbury Wrote ‘Fahrenheit 451’ to Prevent a Dystopia. Instead, He Predicted One”

(If you click through to read the full text, be aware that the first paragraph of the piece presents a slightly inaccurate potted history of Bradbury’s career trajectory that implies he only rose to literary prominence with the publication of F451 in 1953. In fact, some of his previous books and stories, including, especially, 1950’s The Martian Chronicles, had already brought him considerable attention and acclaim.)

For more on the same theme, see my previous posts “On living well in Ray Bradbury’s dystopia: Notes toward a monastic response” and “Facebook, Fahrenheit 451, and the crossing of a cultural threshold,” as well as the Strange Horizons essay “The Failure of Fahrenheit 451.”

For thoughts from the author himself, see the 2007 LA Weekly piece “Ray Bradbury: Fahrenheit 451 Misinterpreted,” featuring Bradbury’s comments on the reality of F451-like trends in contemporary society. (However, Bradbury’s comments in that article/interview should be read in tandem with this context-creating response from his biographer, Sam Weller.) Also see Bradbury’s interviews for A.V. Club and the Peoria Journal Star for more observations from him about the encroaching threat of his novel’s realization in the world around us. And see especially his 1998 interview for Wired, titled “Bradbury’s Tomorrowland,” in which he said the following:

Almost everything in Fahrenheit 451 has come about, one way or the other — the influence of television, the rise of local TV news, the neglect of education. As a result, one area of our society is brainless. But I utilized those things in the novel because I was trying to prevent a future, not predict one.

Our smartphone apocalypse, animated by Steve Cutts

This remarkable animation comes from the hand (or computer) of illustrator and animator Steve Cutts, famed for such things as 2012’s Man, which packs an unbelievable punch. So does the one I’ve chosen to post here. Cutts created it for last year’s hit song “Are You Lost in the World Like Me?” by Moby and The Void Pacific Choir. But I personally like this slight repurposing much better, where the musical accompaniment is changed to French composer Yann Tiersen’s “Comptine d’un autre été, l’après-midi” (best known for being featured in the soundtrack for the 2001 French film Amélie).

The story told by the visuals, and also by the piercingly beautiful and sad musical accompaniment, can stand without comment here, as Teeming Brain readers are well aware of my deep disturbance and unhappiness at the digital dystopia that has emerged in the age of the smartphone. I consider Cutts something of a genius, both for his choice of animation style and for his devastating accuracy in calling out the dark and despairing heart of this cultural dead end in fairly visionary fashion. And no, the fact that his creation of this animation, and my sharing of it here, and your viewing of it, is all facilitated by the existence of networked computers doesn’t invalidate the message with a fatal irony. We could probably do better, culturally and humanly speaking, in our uses of these technologies. But instead we’re apparently inclined to give way, en masse, to our lowest impulses, resulting in a kind of digital Dante’s Inferno whose factual reality isn’t really all that far from the only slightly exaggerated version presented by Cutts.

A grateful acknowledgment goes out to Jesús Olmo, who introduced me to Cutts by sending me a link to Man last month.

Our Craving for Apocalypse: ‘Dispatches from the Ruins’ (short video)

This brief video essay on the source of our collective craving for “the awful futures of apocalyptic fiction” is skillfully executed and thought-provoking, a worthwhile investment of five reflective minutes. Here’s the description:

In the first two decades of the new millennium, stories of the post-apocalypse have permeated pop culture, from books such as Cormac McCarthy’s The Road (2006), Paolo Bacigalupi’s The Windup Girl (2009) and Emily St John Mandel’s Station Eleven (2014) to films and TV programmes such as The Walking Dead (2010-), the Hunger Games series (2012-15) and Mad Max: Fury Road (2015). While post-apocalyptic fictions of previous eras largely served as cautionary tales — against nuclear brinksmanship in On the Beach (1959) or weaponised biology in The Stand (1978) — today’s versions of these tales depict less alterable, more oblique and diffuse visions of our doom. So why can’t we seem to get enough of humanity’s unavoidable collapse and its bleak aftermath?

Dispatches from the Ruins reflects on what these stories — set among crumbling buildings, overgrown lots and barren wastelands — might be telling us about modern fears and fantasies. This Aeon original video is adapted from an Aeon essay by the US writer Frank Bures. Bures is also the author of The Geography of Madness (2016), a book about cultural syndromes across the world. His work has been included in the Best American Travel Writing and appeared in Harper’s, Lapham’s Quarterly and the Washington Post Magazine, among others.