Much to my surprise, a two-volume encyclopedia priced for institutional purchase by academic and public libraries has become a bestseller at Amazon. I don’t know the actual sales figures, and I’m sure they’re pretty small in terms of absolute numbers, since the book’s category (the history and criticism of horror and supernatural literature) is a rather narrow one. In other words, a book of this type probably doesn’t have to move many units in order to qualify as a bestseller. But for what it’s worth, for much of the past two weeks Horror Literature through History has hovered in the top ten books in that category, peaking at number six before dropping much lower and then spiking up again a few times. Amazon sold out of its original stock of the title and had to order more. A couple of days ago I saw that it was briefly flagged as the bestselling new encyclopedia of any kind. Currently those numbers have trailed off again.
In any event, I hadn’t expected so much interest from individual readers, given the book’s steep pricing. I’ve seen a couple of early readers among that crowd speaking glowingly of it in an online forum that I frequent, so that felt good. There’s a forthcoming interview with me about the project at a major horror website. I’m also slated to be interviewed on a major horror podcast a few days from now. I’ll post the links when they become available. In the meantime, if any of my Teeming Brain readers are among those who have purchased the encyclopedia, please know that I sincerely appreciate your interest and support, and I hope the book rewards your investment of time and money.
Update, October 17: The encyclopedia has also sold out at the website for Barnes & Noble.
At the beginning of each semester I tell my students the very thing that journalist Zat Rana gets at in a recent article for Quartz when I deliver a mini-sermon about my complete ban on phones — and also, for almost all purposes, laptops — in my classroom. A smartphone or almost any cell phone in your hand, on your desk, or even in your pocket as you’re trying to concentrate on important other things is a vampire demon powered by dystopian corporate overlords whose sole purpose is to suck your soul by siphoning away your attention and immersing you in a portable customized Matrix.
Or as Rana says, in less metaphorical language:
One of the biggest problems of our generation is that while the ability to manage our attention is becoming increasingly valuable, the world around us is being designed to steal away as much of it as possible. . . . Companies like Google and Facebook aren’t just creating products anymore. They’re building ecosystems. And the most effective way to monetize an ecosystem is to begin with engagement. It’s by designing their features to ensure that we give up as much of our attention as possible.
Rana offers three pieces of sound advice for helping to reclaim your attention (which is the asset referred to in the title): mindfulness meditation, “ruthless single-tasking,” and regular periods of deliberate detachment from the digital world.
Interestingly, it looks like there’s a mini-wave of this type of awareness building in the mediasphere. Rana’s article for Quartz was published on October 2. Four days later The Guardian published a provocative and alarming piece with this teaser: “Google, Twitter and Facebook workers who helped make technology so addictive are disconnecting themselves from the internet. Paul Lewis reports on the Silicon Valley refuseniks alarmed by a race for human attention.” It’s a very long and in-depth article. Here’s a taste:
There is growing concern that as well as addicting users, technology is contributing toward so-called “continuous partial attention”, severely limiting people’s ability to focus, and possibly lowering IQ. One recent study showed that the mere presence of smartphones damages cognitive capacity — even when the device is turned off. “Everyone is distracted,” Rosenstein says. “All of the time.”
But those concerns are trivial compared with the devastating impact upon the political system that some of Rosenstein’s peers believe can be attributed to the rise of social media and the attention-based market that drives it. . . .
Tech companies can exploit such vulnerabilities to keep people hooked; manipulating, for example, when people receive “likes” for their posts, ensuring they arrive when an individual is likely to feel vulnerable, or in need of approval, or maybe just bored. And the very same techniques can be sold to the highest bidder. . . .
“The dynamics of the attention economy are structurally set up to undermine the human will,” [ex-Google strategist James Williams] says. “If politics is an expression of our human will, on individual and collective levels, then the attention economy is directly undermining the assumptions that democracy rests on.” If Apple, Facebook, Google, Twitter, Instagram and Snapchat are gradually chipping away at our ability to control our own minds, could there come a point, I ask, at which democracy no longer functions?
“Will we be able to recognise it, if and when it happens?” Williams replies. “And if we can’t, then how do we know it hasn’t happened already?”
In the same vein, Nicholas Carr (no stranger to The Teeming Brain’s pages) published a similarly aimed — and even a similarly titled — essay in the Weekend Review section of The Wall Street Journal on the very day the Guardian article appeared (October 6). “Research suggests that as the brain grows dependent on phone technology, the intellect weakens,” says the teaser. Here’s a representative passage from the essay itself:
Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object in the environment that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.” Media and communication devices, from telephones to TV sets, have always tapped into this instinct. Whether turned on or switched off, they promise an unending supply of information and experiences. By design, they grab and hold our attention in ways natural objects never could.
But even in the history of captivating media, the smartphone stands out. It’s an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what [Adrian] Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it’s part of the surroundings — which it always is. Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library, and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That’s what a smartphone represents to us. No wonder we can’t take our minds off it.
Full Text: “How Smartphones Hijack Our Minds”
At his blog Carr noted the simultaneous appearance of his essay and the Guardian article on the same day. He also noted the striking coincidence of the similarity between the titles, calling it a “telling coincidence” and commenting:
It’s been clear for some time that smartphones and social-media apps are powerful distraction machines. They routinely divide our attention. But the “hijack” metaphor — I took it from Adrian Ward’s article “Supernormal” — implies a phenomenon greater and more insidious than distraction. To hijack something is to seize control of it from its rightful owner. What’s up for grabs is your mind.
Perhaps the most astonishing thing about all of this is that John Carpenter warned us about it three decades ago, and not vaguely, but quite specifically and pointedly. The only difference is that the technology in his (quasi-)fictional presentation was television. Well, that, plus the fact that his evil overlords really were ill-intentioned, whereas ours may be in some cases as much victims of their own devices as we are. In any event:
There is a signal broadcast every second of every day through our television sets. Even when the set is turned off, the brain receives the input. . . . Our impulses are being redirected. We are living in an artificially induced state of consciousness that resembles sleep. . . . We have been lulled into a trance.
I’m confident that what follows is the best paragraph I’ll read this week. I daresay it may be the best one you’ll read, too. Unsurprisingly, it’s from James Howard Kunstler’s blog. For me, it provides both a substantively and a tonally accurate description of what I’ve been seeing, hearing, and experiencing around me in recent weeks and months (and years).
Poor old Karl Marx, tortured by boils and phantoms, was right about one thing: History repeats itself, first as tragedy, second as farce. Thus, I give you the Roman Empire and now the United States of America. Rome surrendered to time and entropy. Our method is to drive a gigantic clown car into a ditch.
BONUS ITEM: Here’s the best headline I’ve read in recent memory. The story itself resides behind a paywall at The Washington Post, so I don’t know what it actually says, but the headline alone probably says it all:
Rocket man and dotard go bonkers in toontown
I can’t help wondering if this headline might serve for future generations as some sort of quasi/crypto-Zen koan of esoteric fascination, in the same way that “No Wife, No Horse, No Mustache” worked for Robert Anton Wilson.
From an essay by Ed Finn, founding director of the Center for Science and the Imagination at Arizona State University:
We are all centaurs now, our aesthetics continuously enhanced by computation. Every photograph I take on my smartphone is silently improved by algorithms the second after I take it. Every document autocorrected, every digital file optimised. Musicians complain about the death of competence in the wake of Auto-Tune, just as they did in the wake of the synthesiser in the 1970s. It is difficult to think of a medium where creative practice has not been thoroughly transformed by computation and an attendant series of optimisations. . . .
Today, we experience art in collaboration with these algorithms. How can we disentangle the book critic, say, from the highly personalised algorithms managing her notes, communications, browsing history and filtered feeds on Facebook and Instagram? . . . .
The immediate creative consequence of this sea change is that we are building more technical competence into our tools. It is getting harder to take a really terrible digital photograph, and in correlation the average quality of photographs is rising. From automated essay critiques to algorithms that advise people on fashion errors and coordinating outfits, computation is changing aesthetics. When every art has its Auto-Tune, how will we distinguish great beauty from an increasingly perfect average? . . .
We are starting to perceive the world through computational filters, perhaps even organising our lives around the perfect selfie or defining our aesthetic worth around the endorsements of computationally mediated ‘friends’. . . .
Human creativity has always been a response to the immense strangeness of reality, and now its subject has evolved, as reality becomes increasingly codeterminate, and intermingled, with computation. If that statement seems extreme, consider the extent to which our fundamental perceptions of reality — from research in the physical sciences to finance to the little screens we constantly interject between ourselves in the world — have changed what it means to live, to feel, to know. As creators and appreciators of the arts, we would do well to remember all the things that Google does not know.
FULL TEXT: “Art by Algorithm”
Here’s the ending to my interview with Thomas Ligotti in Horror Literature through History (which, as I just learned, was published a few days ago, slightly ahead of the advertised schedule). I think these lines represent my favorite thing Tom has ever said in an interview. (And as you know, his interviews are plentiful.)
MATT CARDIN: What is the point, purpose, or value of horror literature?
THOMAS LIGOTTI: To entertain and disillusion at the same time.
MC: What do readers of horror literature need to know?
TL: If you read a lot of horror literature because you like to be scared, then you’re probably a normal, healthy person. If you read horror literature to fulfill some deeply personal predisposition, be assured there is probably something odd and unwholesome about you. Don’t ever let anyone tell you it’s not all right to be that way.
It’s less than two weeks until the official publication date of Horror Literature through History: An Encyclopedia of the Stories That Speak to Our Deepest Fears (available from the publisher, Amazon, Barnes & Noble, and elsewhere). It’s presently the subject of a feature article in the 2017 Halloween issue of Rue Morgue magazine. With these things in mind, I have obtained permission from the publisher to present my full introduction to the encyclopedia here at The Teeming Brain, along with the full table of contents. (You can also see the full list of 70 contributors, along with further information, here.)
A Preliminary Word about the Contents
As you’ll observe when you read the TOC (see the link below), the encyclopedia is structured in a unique way that makes it a special kind of reference work on the topic of horror literature and its long and rich literary history. Specifically, it’s divided into three broad sections. The first, titled “Horror through History,” consists of a series of sequential essays laying out the history of horror literature across time, from the ancient world to the present. The second section, “Themes, Topics, and Genres,” presents essays on major themes and issues in the field, such as apocalyptic horror, young adult horror, ghost stories, horror comics, horror video games, weird and cosmic horror fiction, and the relationship between horror literature and topics like religion, gender, and ecology. The third and longest section consists of alphabetically organized reference entries on authors, literary works, and specialized topics: horror awards, different types of monsters, important literary techniques, and recurring elements of the field such as haunted houses, ancestral curses, and the idea of forbidden knowledge.
Basically, the three sections mutually illuminate each other. As explained in the official publisher description, the first section with its deep tracing of horror literature’s historical evolution provides an overarching context for understanding the reference entries by placing them within the sociocultural, intellectual, and artistic currents of their respective eras. The second section expands on important topics to provide a greater depth of understanding about specific genres and forms, and about the multiple cultural and philosophical issues with which horror has always been intertwined. The final reference section provides informational “close-ups,” as it were — some short, others quite long and in-depth — on matters broached more fleetingly in the large-scale examinations of the first two sections.
I’m also pleased to point out that there are in fact many more authors, works, and topics covered in the encyclopedia than what’s listed on the TOC. For example, there are 150 sidebars accompanying the main entries, and quite a few of these are mini-essays on various horror stories. The entry on E. F. Benson, for instance, is accompanied by a sidebar essay on his classic story “Caterpillars.” The entry on Nathaniel Hawthorne is accompanied by a sidebar essay on “Rappaccini’s Daughter.” The same treatment is given to stories by the likes of Robert Hichens, Edgar Allan Poe, Daphne du Maurier, Robert E. Howard, Shirley Jackson, Fritz Leiber, Thomas Ligotti, Richard Matheson, and many more. The sidebars also provide timelines, story excerpts, commentary, and further types of contextualizing information to help illuminate the main entries.
Teeming Brain readers are familiar with my longtime focus on Fahrenheit 451 and my abiding sense that we’re currently caught up in a real-world version of its dystopian vision. This is not, of course, an opinion peculiar to me. Many others have held it, too, including, to an extent, Bradbury himself. I know that some of you, my readers, share it as well.
As of a couple of weeks ago, a writer for the pop culture analysis website Acculturated has publicly joined the fold:
Ray Bradbury often said that he wrote science fiction not to predict the future but to prevent it. On this score, Fahrenheit 451 seems to have failed. The free speech wars on college campuses, the siloing effect of technology, the intolerance of diverse political opinions, and the virtual cocoon provided by perpetual entertainment all suggest that Bradbury anticipated the future with an accuracy unparalleled elsewhere in science fiction literature.
It’s a strange irony that, in the age of the Internet, which was supposed to encourage more transparency and debate, the open exchange of ideas is under threat. This was pointed out by another famous science fiction writer, Michael Crichton. “In the information society,” says Ian Malcolm in Jurassic Park, “No one thinks. We expected to banish paper, but we actually banished thought.” Bradbury saw this coming many decades earlier, and he understood why. Exposure to new ideas is uncomfortable and potentially dangerous. Staying safe, comfortable, and equal requires that everyone think identically. Liberal learning, the crucible that forms the individual, is anathema to group identity and cannot be tolerated. If you disagree, you’re morally suspect.
Which is why we need Bradbury’s message today more than ever. In a coda to the 1979 printing of Fahrenheit 451, Bradbury wrote: “There is more than one way to burn a book. And the world is full of people running about with lit matches.”
(If you click through to read the full text, be aware that the first paragraph of the piece presents a slightly inaccurate potted history of Bradbury’s career trajectory that implies he only rose to literary prominence with the publication of F451 in 1953. In fact, some of his previous books and stories, including, especially, 1950’s The Martian Chronicles, had already brought him considerable attention and acclaim.)
For more on the same theme, see my previous posts “On living well in Ray Bradbury’s dystopia: Notes toward a monastic response” and “Facebook, Fahrenheit 451, and the crossing of a cultural threshold,” as well as the Strange Horizons essay “The Failure of Fahrenheit 451.”
For thoughts from the author himself, see the 2007 LA Weekly piece “Ray Bradbury: Fahrenheit 451 Misinterpreted,” featuring Bradbury’s comments on the reality of F451-like trends in contemporary society. (However, Bradbury’s comments in that article/interview should be read in tandem with this context-creating response from his biographer, Sam Weller.) Also see Bradbury’s interviews for A.V. Club and the Peoria Journal Star for more observations from him about the encroaching threat of his novel’s realization in the world around us. And see especially his 1998 interview for Wired, titled “Bradbury’s Tomorrowland,” in which he said the following:
Almost everything in Fahrenheit 451 has come about, one way or the other — the influence of television, the rise of local TV news, the neglect of education. As a result, one area of our society is brainless. But I utilized those things in the novel because I was trying to prevent a future, not predict one.
George Romero, 1940-2017
Rest in peace, Mr. Romero. I’ll never get to tell you this in person, but you played a major part in my mental-emotional life, with your Living Dead world helping to explain the non-cinematic “real” world to me in more ways than one. The paper in my Dark Awakenings collection about the possible use of your first three Living Dead films as tools for spiritual contemplation was the culmination of many years of dwelling on and in your imaginary (or perhaps imaginal) zombie otherworld.
Plus, you created Bub, the greatest movie zombie in history. (I’m among the minority of oddballs who favor DAY OF THE DEAD above all others in the series.) But I did always wish that Bub would have successfully shot Rhodes during their showdown…
And I always felt so deeply sorry for Bub when he experienced an agony of grief upon finding his master and quasi-friend Dr. Logan dead, murdered by Rhodes and his goons.
But then again, you and Bub did give Rhodes exactly what was coming to him in the end, didn’t you? For many years, until I stumbled across Peter Jackson’s Braindead / Dead Alive, this was the single goriest scene in the goriest movie I had ever watched. (Yes, I also watched some European zombie horror and such, but you always seemed to top them somehow.)
Best of all, and apart from all the gore and grimness, you allowed us to witness the weirdly beautiful spectacle of a zombie experiencing a paroxysm of spiritual ecstasy at the sound of the “Ode to Joy” from the fourth movement of Beethoven’s Ninth Symphony.
It’s an amazing scene, in an amazing movie, with an amazing actor, from an amazing director. For this, and for the rest of your gift to the world, I do hope you’ll rest more peacefully than the zombies you created for us.
Image credit: By George_Romero,_66ème_Festival_de_Venise_(Mostra).jpg: nicolas genin derivative work: Andibrunt [CC BY-SA 2.0 (http://creativecommons.org/licenses/by-sa/2.0)], via Wikimedia Commons
This remarkable animation comes from the hand (or computer) of illustrator and animator Steve Cutts, famed for such things as 2012’s Man, which packs an unbelievable punch. So does the one I’ve chosen to post here. Cutts created it for last year’s hit song “Are You Lost in the World Like Me?” by Moby and The Void Pacific Choir. But I personally like this slight repurposing much better, where the musical accompaniment is changed to French composer Yann Tiersen’s “Comptine d’un autre été, l’après-midi” (best known for being featured in the soundtrack for the 2001 French film Amélie).
The story told by the visuals, and also by the piercingly beautiful and sad musical accompaniment, can stand without comment here, as Teeming Brain readers are well aware of my deep disturbance and unhappiness at the digital dystopia that has emerged in the age of the smartphone. I consider Cutts something of a genius, both for his choice of animation style and for his devastating accuracy in calling out the dark and despairing heart of this cultural dead end in fairly visionary fashion. And no, the fact that his creation of this animation, and my sharing of it here, and your viewing of it, is all facilitated by the existence of networked computers doesn’t invalidate the message with a fatal irony. We could probably do better, culturally and humanly speaking, in our uses of these technologies. But instead we’re apparently inclined to give way, en masse, to our lowest impulses, resulting in a kind of digital Dante’s Inferno whose factual reality isn’t really all that far from the only slightly exaggerated version presented by Cutts.
A grateful acknowledgment goes out to Jesús Olmo, who introduced me to Cutts by sending me a link to Man last month.
The gorgeous-looking new edition of Lovecraft’s stories from The Folio Society, The Call of Cthulhu and Other Weird Stories, has this really effective (and kind of gorgeous in its own right) promotional video to go with it. Sadly, I don’t have $120 to spare. But with illustrations by Dan Hillier — who comes off quite well in the video, and whose work for this project looks amazing — and an introduction by Alan Moore, the book sure is tempting. Here’s the publisher’s description:
This edition, based on its sister limited edition [at $575!] marries Lovecraft’s best-known fiction with two modern masters of the macabre, the acclaimed artist Dan Hillier and author Alan Moore. In his beautifully crafted new preface, Moore finds Lovecraft at once at odds with and integral to the time in which he lived: ‘the improbable embodiment of an estranged world in transition’. Yet, despite his prejudices and parochialisms, he ‘possessed a voice and a perspective both unique in modern literature’.
Hillier’s six mesmerising, portal-like illustrations embrace the alien realities that lurk among the gambrel roofs of Lovecraft’s landscapes. By splicing Victorian portraits and lithographs with cosmic and Lovecraftian symbolism, each piece – like the stories themselves – pulls apart the familiar to reveal what lies beneath.
The edition itself shimmers with Lovecraft’s ‘unknown colours’, bound in purple and greens akin to both the ocean depths and mysteries from outer space. The cover is embossed with a mystical design by Hillier, while a monstrous eye stares blankly from the slipcase.
I find this all quite winning, personally, for the way it underscores Lovecraft’s growing prevalence and relevance in contemporary culture. For more about the new edition, see the write-ups at Tor (where several of the Hillier illustrations are shown), Wired (where the writer amusingly frames his encounter with the book as a harrowing Lovecraftian brush with forbidden knowledge and eldritch monstrosities), and The Verge. The latter presents an interview with Hillier. It also bears the best title of any of these articles, notwithstanding the slight misspelling of Great Cthulhu’s name: “A new collection of Lovecraft stories looks like an artifact from the Cthulu universe.”