The following insights are excerpted from a brief but engaging NPR piece that traces the cultural arc from Vint Cerf (the “inventor of the Internet”) and his early naive optimism about this new technology, to William Gibson’s uncanny prescience in forecasting exactly where the Internet would really take us (to a corporate-controlled cyberdystopia with sharply curtailed human relationships), to Black Mirror creator Charlie Brooker’s ongoing exploration of the darkest corners of the whole thing:
Initially, Cerf was trying to create an Internet through which scientists and academics from all over the world could share data and research. Then, one day in 1988, Cerf says he went to a conference for commercial vendors where they were selling products for the Internet. “I just stood there thinking, ‘My God! Somebody thinks they’re going to make money out of the Internet.’ ” Cerf was surprised and happy. “I was a big proponent of that. My friends in the community thought I was nuts. ‘Why would you let the unwashed masses get access to the Internet?’ And I said, ‘Because I want everybody to take advantage of its capability.’ ”
Clearly, Cerf is an optimist. That is what allowed him to dream big. But, in retrospect, some of the decisions his team made seem hopelessly naive, especially for a bunch of geniuses. They made it possible to surf the Internet anonymously — unlike a telephone, you don’t have a unique number that announces who you are. We know how that turned out. People with less lofty ambitions than Cerf used that loophole for cybercrime, international espionage and online harassment.
Cerf admits all that dark stuff never crossed his mind. “And we have to cope with that — I mean, welcome to the real world,” he says. . . .
Somehow [William] Gibson was able to imagine the potential scale of it — all those computers connected together. . . . But, it isn’t just the Internet that Gibson saw coming. In Neuromancer, the Internet has become dominated by huge multinational corporations fighting off hackers. The main character is a washed-up criminal hacker who goes to work for an ex-military officer to regain his glory. And get this: The ex-military guy is deeply involved in cyber-espionage between the U.S. and Russia.
Gibson says he didn’t need to try a computer or see the Internet to imagine this future. “The first people to embrace a technology are the first to lose the ability to see it objectively,” he says. He says he’s more interested in how people behave around new technologies. He likes to tell a story about how TV changed New York City neighborhoods in the 1940s. “Fewer people sat out on the stoops at night and talked to their neighbors, and it was because everyone was inside watching television,” he says. “No one really noticed it at the time as a kind of epochal event, which I think it was” . . . .
Brooker has a certain amount of frustration with the leaders in tech. “It’s felt like tech companies have for years just put this stuff out there,” he says. “And they distance themselves from the effects of their product effectively by saying, ‘Oh, we’re just offering a service.’ ” Brooker sees each new technology more like an untested drug waiting to launch us on a very bad trip. Each episode of Black Mirror is like its own laboratory testing a technology that is already out, but pushing it by mixing in common human behaviors and desires.
A few years ago I had two articles published in editor Joe Laycock’s Spirit Possession around the World: Possession, Communion, and Demon Expulsion across Cultures (ABC-CLIO, 2015). One of these was a survey of possession and exorcism in the history of literature. The other was an article about the daimon.
When I submitted the latter of these, Joe got back to me with a request for significant revisions, the better to make the article fit harmoniously with the rest of the encyclopedia’s contents and align with his editorial vision of its place in the book.
As a result, the version that is now published in the encyclopedia is thoroughly different from what I originally wrote. The original version has never been published. And since I own the copyright on that version, I’m free to share it here with Teeming Brain readers. As those of you who have been here for a while will immediately recognize, this is entirely appropriate, since the article lands right in the middle of several of this blog’s foundational interests, themes, and concerns.
Possession, Exorcism, and the Daimon
The word “daimon” has several possible meanings, but in relation to possession and exorcism it refers to a particular type of autonomous or autonomous-feeling force in the psyche that influences or, in some cases, dominates a person’s thoughts, actions, and feelings. It comes from ancient Greece and the ancient Hellenistic world, where it generally referred to a particular class of deity or spirit being, and where its basic meaning evolved over time to refer as much or more to an inner psychic or subjective force as to an objectively conceived entity. The concept of the daimon is one of the key components in the origin and evolution of the related concepts of the demon and demonic possession. Its adjectival form, daimonic, has been widely used in modern-day depth psychology to refer to a particular aspect of the psyche that lies outside a person’s conscious, voluntary control, and that is especially associated with creativity, anger, and other surging states of mind and emotion that can effectively swamp the conscious ego and result in violent outbursts of creation and destruction.
Among the ancient Greeks, the concept of the daimon led a dual existence as it progressed along two distinct but related strands. On the one hand, daimons were conceived in typically animistic terms as spirits that inhabited or haunted certain places, affected the weather and other natural occurrences, and so on. Some were associated with the spirits of the dead. On the other hand, a spiritualized or psychologized view placed the daimons in a position of deep intertwinement with human subjectivity. Essentially, the Greeks regarded daimons as objectively real presences that made themselves known through their influence upon and within the human psyche. The objective, animistic beliefs about them were thus matched and accompanied by a more subtle and psychologically oriented view that framed them as inner influences upon human thoughts and emotions, and even as the keepers and emblems of individual character and destiny. This second view gradually became dominant over time.
Philip Roth, 1973
Here’s Nathaniel Rich, writing for The New York Review of Books about Philip Roth’s Why Write?: Collected Nonfiction 1960–2013:
Between the interviews given in self-defense, the conversations with peers, and the exchanges with angry Jews, there emerges from Roth’s nonfiction a unified theory of the novel as a bulwark against the excesses of modern society. The assaults on the novelist come from two fronts. The first is the social chaos of a nation in political crisis and cultural decline. Roth began to speak about this danger in 1960:
The American writer in the middle of the twentieth century has his hands full in trying to understand, describe, and then make credible much of American reality. It stupefies, it sickens, it infuriates, and finally it is even a kind of embarrassment to one’s own meager imagination. The actuality is continually outdoing our talents, and the culture tosses up figures almost daily that are the envy of any novelist.
This problem obsessed Saul Bellow too; it was the dominant subject of his nonfiction. “The noise of life is the great threat,” he wrote in 1970, “the sounds of the public sphere, the din of politics, the turbulence and agitation that set in about 1914 and have now reached an intolerable volume.” Bellow worried that the fervor of public life would destroy the private conditions necessary for the creation and appreciation of art. Roth, despite writing before the tumult of the Sixties, went farther, suggesting that a radically destabilized society had made it difficult to discriminate between reality and fiction. What was the point of writing or reading novels when reality was as fantastic as any fiction?
Such apprehensions may seem quaint when viewed from the comic-book hellscape of 2018, though it is perversely reassuring that life in 1960 felt as berserk as it does now. American reality continued to overwhelm the imagination during the Vietnam War, which Roth likened to “living on a steady diet of Dostoevsky,” and under the administration of the “grotesque” Richard Nixon, the subject of Our Gang. And in Reagan’s Eighties, dominated as they were by “a proliferation of . . . media stupidity and cynical commercialism — American-style philistinism run amok,” a time when, Roth complained, it became “easier for even the best-educated people” to discuss movies and television shows than literature.
The threat continued in the 1990s, when Roth bemoaned to Ivan Klíma the obliterating influence of “that trivializer of everything, commercial television”; during the administration of George W. Bush (“we are ambushed . . . by the unpredictability that is history”); and in the final years of the Obama administration: “Very little truthfulness anywhere, antagonism everywhere, so much calculated to disgust, the gigantic hypocrisies, no holding fierce passions at bay, the ordinary viciousness you can see just by pressing the remote, explosive weapons in the hands of creeps. . . .” This year, in an e-mail published in The New Yorker, Roth worried about the newest manifestation of this threat: “It isn’t Trump as a character, a human type — the real-estate type, the callow and callous killer capitalist — that outstrips the imagination. It is Trump as President of the United States.”
Toward the end of his career, in his novels and public statements, Roth began to prophesy the extinction of a literary culture — an age-old pastime for aging writers. But in his earlier critical essays, he described literature as not only immune to the incursions of the “mass electronically amplified philistine culture,” but its most powerful antidote. What better refuge from the simplifying influence of mass culture than the richness of great fiction, with its openhearted embrace of moral contradiction and emotional complexity? As the shrill hue increases to an insane volume, fiction’s value grows ever more precious. “Where the mass media inundate us with inane falsifications of human affairs,” Roth wrote in 1990, “serious literature is no less of a life preserver, even if the society is all but oblivious of it.” In the current deluge, we have more reason to cling to that preserver than ever before.
Full article: “Roth Agonistes”
Photo by Nancy Crampton (ebay) [Public domain], via Wikimedia Commons
Here’s renowned neuroscientist Christof Koch explaining in a Wall Street Journal piece that our future will be a dystopian nightmare in which humans will necessarily become ever more completely fused on a neurological level with super-sophisticated computer technologies. This will, he says, be a non-negotiable requirement if we want to keep up with the artificial intelligences that will be billions of times smarter than us, and that will otherwise utterly rule humanity and pose an existential threat to us in all kinds of ways that we, with our currently unenhanced meat brains, can hardly imagine.
Or actually, Koch speaks not grimly but enthusiastically of this future (and semi-present) scenario. He views the technological enhancement of the human brain for purposes of keeping pace with AI as an exciting thing. The negative gloss on it is mine. What a wonderful world, he avers. “Resistance is futile. You will be assimilated,” my own meat brain keeps hearing.
Whether you are among those who believe that the arrival of human-level AI signals the dawn of paradise, such as the technologist Ray Kurzweil, or the sunset of the age of humans, such as the prominent voices of the philosopher Nick Bostrom, the physicist Stephen Hawking and the entrepreneur Elon Musk, there is no question that AI will profoundly influence the fate of humanity.
There is one way to deal with this growing threat to our way of life. Instead of limiting further research into AI, we should turn it in an exciting new direction. To keep up with the machines we’re creating, we must move quickly to upgrade our own organic computing machines: We must create technologies to enhance the processing and learning capabilities of the human brain. . . .
Unlike, say, the speed of light, there are no known theoretical limits to intelligence. While our brain’s computational power is more or less fixed by evolution, computers are constantly growing in power and flexibility. This is made possible by a vast ecosystem of several hundred thousand hardware and software engineers building on each other’s freely shared advances and discoveries. How can the human species keep up? . . .
In the face of this relentless onslaught, we must actively shape our future to avoid dystopia. We need to enhance our cognitive capabilities by directly intervening in our nervous systems.
We are already taking steps in this direction. . . .
My hope is that someday, a person could visualize a concept — say, the U.S. Constitution. An implant in his visual cortex would read this image, wirelessly access the relevant online Wikipedia page and then write its content back into the visual cortex, so that he can read the webpage with his mind’s eye. All of this would happen at the speed of thought. Another implant could translate a vague thought into a precise and error-free piece of digital code, turning anyone into a programmer.
People could set their brains to keep their focus on a task for hours on end, or control the length and depth of their sleep at will.
Another exciting prospect is melding two or more brains into a single conscious mind by direct neuron-to-neuron links — similar to the corpus callosum, the bundle of two hundred million fibers that link the two cortical hemispheres of a person’s brain. This entity could call upon the memories and skills of its member brains, but would act as one “group” consciousness, with a single, integrated purpose to coordinate highly complex activities across many bodies.
These ideas are compatible with everything we know about the brain and the mind. Turning them from science fiction into science fact requires a crash program to design safe, inexpensive, reliable and long-lasting devices and procedures for manipulating brain processes inside their protective shell. It must be focused on the end-to-end enhancement of human capabilities. . . .
While the 20th century was the century of physics — think the atomic bomb, the laser and the transistor — this will be the century of the brain. In particular, it will be the century of the human brain — the most complex piece of highly excitable matter in the known universe. It is within our reach to enhance it, to reach for something immensely powerful we can barely discern.
Full article: “To Keep Up with AI, We’ll Need High-Tech Brains” (You may or may not encounter a paywall)
NYU marketing professor Scott Galloway, writing for Esquire:
Our brains are sophisticated enough to ask very complex questions but not sophisticated enough to answer them. Since Homo sapiens emerged from caves, we’ve relied on prayer to address that gap: We lift our gaze to the heavens, send up a question, and wait for a response from a more intelligent being. “Will my kid be all right?” “Who might attack us?”
As Western nations become wealthier, organized religion plays a smaller role in our lives. But the void between questions and answers remains, creating an opportunity. As more and more people become alienated from traditional religion, we look to Google as our immediate, all-knowing oracle of answers from trivial to profound. Google is our modern-day god. Google appeals to the brain, offering knowledge to everyone, regardless of background or education level. If you have a smartphone or an Internet connection, your prayers will always be answered: “Will my kid be all right?” “Symptoms and treatment of croup. . .” “Who might attack us?” “Nations with active nuclear-weapons programs . . .”
Think back on every fear, every hope, every desire you’ve confessed to Google’s search box and then ask yourself: Is there any entity you’ve trusted more with your secrets? Does anybody know you better than Google?
Full article: “Silicon Valley’s Tax-Avoiding, Job-Killing, Soul-Sucking Machine”
Image Credit: Kavinmecx (Own work) [CC BY-SA 4.0 (https://creativecommons.org/licenses/by-sa/4.0)], via Wikimedia Commons
Some time ago here at The Teeming Brain, I announced the birth of a new literary journal titled Vastarien, to be edited by Jon Padgett and me, and to be framed as “a source of critical study and creative response to the corpus of Thomas Ligotti as well as associated authors and ideas.” We launched a website, www.vastarien-journal.com, where we published submission guidelines and started receiving stories, poems, articles, essays, and artwork. Jon and I then spent many months and countless hours responding to these submissions and crafting the first issue. Jon also retained the services of artist Dave Felton and designer Anna Trueman to create a stunning cover.
Yesterday we launched a Kickstarter campaign to cover the costs of the first three issues. It reached its funding goal today, in a total of 27 hours. In fact, we have now surpassed that funding goal, and we will soon be announcing some stretch goals. This is a wonderfully affirming response that shows what a high level of interest and excitement there really is for such a publication.
The Kickstarter campaign has nearly a month left. This means you can still become one of our backers. We have created an attractive set of rewards for different pledge levels. At the campaign page you can also read the full table of contents for Volume 1, Issue 1. Consider yourself invited.
(BONUS NOTE: We’re also now accepting submissions for issues 2 and 3. The submission period will close on March 1.)
If reading is not always an act of liberation, it is at least an act of self-definition. It is an experience of solitude in which we become unavailable to those immediately around us. Even when we read to someone else, usually a lover or a child, or have them read to us, the effect is to be pulled together into an orbit defined by the book. In reading we make a public space into something private, and find a way to be private in public. . . .
What’s more, we are never just reading: we are always reading in a specific place and time, in a certain chair, at the window or in the basement, hot or cold, sleepy or wide awake, alone or in a crowded room. In an essay on Ruskin, Proust writes that when we look back on our favorite childhood days of reading, what we remember is all the interruptions that kept us from the book — the family that was calling us to dinner, for example, the very dinner that was ruined because we spent the whole meal wishing we were still reading. But now the memory of the reading is riddled with all its interruptions, and we look back on them fondly as part of the same event.
Maybe that also describes what it’s like to watch movies or television shows. I don’t think it describes what it’s like to use a phone. It could be that in ten or twenty years I will look back fondly on these nights on the couch, where I panic over the headlines, compulsively like photos on Instagram, check my email, and return to the headlines on the great hamster wheel of contemporary enervation. Is this reading? Will I recall the interruptions that wrench me away from the latest political disaster with fond nostalgia, the cries of the baby intermingled with tweets about sexual harassment and rising sea levels? What I know is that on the nights when I force myself to open a book, I feel like a person, an individual engaged in an activity at once secret and communal, rather than a receptacle of mass information.
Full text: “Reading in the Dark”
Dejan Ognjanovic, who runs the prominent Serbian horror blog The Cult of Ghoul, has given Horror Literature through History a 2018 Golden Ghoul Award for best non-fiction horror book of 2017. You can read the complete awards list (in Serbian) at the blog.
A newly published op-ed by Los Angeles Times theater critic Charles McNulty is well worth reading for its nuanced response to the current crisis of falling idols in the world of arts and entertainment. Given my personal literary leanings, I find McNulty’s points to be nicely applicable to the case of someone he doesn’t name: H. P. Lovecraft, the moral excoriation of whom has by this point become de rigueur in some wings of the speculative fiction community. Here are some high points of McNulty’s argument, decontextualized from the rich field of specific examples, both classic and contemporary, that he uses to illustrate his point:
I know that an artist is not identical with his or her masterpieces and that few human beings can live up to their greatest achievements. . . .
If a book or play speaks, it does so in a way that transcends the limitations and imperfections of the author, a more elusive figure than the publishing industry (and identity politics hard-liners) would have us believe. I’m not so much of the school of literary critic Roland Barthes, who famously declared the death of the author, as of the school of Proust, who saw that a writer crystallizes the notion of a multiplicity of identities, the way each of us contains numerous selves, not all of them readily categorizable.
Anyone whose occupation is imagining the lives of others necessarily has a thronging inner world. The artist who creates beauty can contain a fair amount of ugliness. . . .
History is the ultimate arbiter of what endures. Moral verdicts on the author, the raison d’être of many biographies, are a secondary layer that can color the reception of an artist’s oeuvre but cannot nullify work that retains its expressive power. . . .
Some of the shock we’re experiencing right now about all these fallen idols stems from our mythologizing natures. We expect our heroes to be exemplary, yet (as Proust points out) human fallibility may be a necessary ingredient in creativity. Heinous crimes are another matter entirely, but as any reader of biography can attest, genius and pathology aren’t exactly strangers.