Blog Archives

“A web of flesh spun over a void”

Francisco_Goya_-_Casa_de_locos

“The Madhouse” by Francisco Goya [Public domain], via Wikimedia Commons

So brilliant: an implicitly ironic but outwardly straight-faced reading of the DSM-5 as a dystopian horror novel, complete with a quasi-Ligottian assessment of the book’s narrative voice and view of humanity.

Great dystopia isn’t so much fantasy as a kind of estrangement or dislocation from the present; the ability to stand outside time and see the situation in its full hideousness. The dystopian novel doesn’t necessarily have to be a novel. . . . Something has gone terribly wrong in the world; we are living the wrong life, a life without any real fulfillment. The newly published DSM-5 is a classic dystopian novel in this mold.

Here, we have an entire book, something that purports to be a kind of encyclopedia of madness, a Library of Babel for the mind, containing everything that can possibly be wrong with a human being. . . . DSM-5 arranges its various strains of madness solely in terms of the behaviors exhibited. This is a recurring theme in the novel, while any consideration of the mind itself is entirely absent. . . . The idea emerges that every person’s illness is somehow their own fault, that it comes from nowhere but themselves: their genes, their addictions, and their inherent human insufficiency. We enter a strange shadow-world where for someone to engage in prostitution isn’t the result of intersecting environmental factors (gender relations, economic class, family and social relationships) but a symptom of “conduct disorder,” along with “lying, truancy, [and] running away.” A mad person is like a faulty machine. The pseudo-objective gaze only sees what they do, rather than what they think or how they feel. A person who shits on the kitchen floor because it gives them erotic pleasure and a person who shits on the kitchen floor to ward off the demons living in the cupboard are both shunted into the diagnostic category of encopresis. It’s not just that their thought-processes don’t matter, it’s as if they don’t exist. The human being is a web of flesh spun over a void.

. . . The word “disorder” occurs so many times that it almost detaches itself from any real signification, so that the implied existence of an ordered state against which a disorder can be measured is almost forgotten. Throughout the novel, this ordered normality never appears except as an inference; it is the object of a subdued, hopeless yearning. With normality as a negatively defined and nebulously perfect ideal, anything and everything can then be condemned as a deviation from it. . . . If there is a normality here, it’s a state of near-catatonia. DSM-5 seems to have no definition of happiness other than the absence of suffering. The normal individual in this book is tranquilized and bovine-eyed, mutely accepting everything in a sometimes painful world without ever feeling much in the way of anything about it. The vast absurd excesses of passion that form the raw matter of art, literature, love, and humanity are too distressing; it’s easier to stop being human altogether, to simply plod on as a heaped collection of diagnoses with a body vaguely attached.

. . . For all the subtlety of its characterization, the book doesn’t just provide a chilling psychological portrait, it conjures up an entire world. The clue is in the name: On some level we’re to imagine that the American Psychiatric Association is a body with real powers, that the “Diagnostic and Statistical Manual” is something that might actually be used, and that its caricature of our inner lives could have serious consequences. Sections like those on the personality disorders offer a terrifying glimpse of a futuristic system of repression, one in which deviance isn’t furiously stamped out like it is in Orwell’s unsubtle Oceania, but pathologized instead. Here there’s no need for any rats, and the diagnostician can honestly believe she’s doing the right thing; it’s all in the name of restoring the sick to health. DSM-5 describes a nightmare society in which human beings are individuated, sick, and alone. For much of the novel, what the narrator of this story is describing is its own solitude, its own inability to appreciate other people, and its own overpowering desire for death — but the real horror lies in the world that could produce such a voice.

MORE: “Book of Lamentations”

For more on the DSM-5 and the controversy it has elicited, see this.

Teeming Links – July 25, 2014

FireHead

What happens in a world where war has become perpetual, live-reported popcorn entertainment? Answer: we’re as far as we ever were from understanding anything about it. “Far from offering insights into the mysteries of history and politics, these spectacles give us a sense that we are further away than ever from understanding their causes, their implications, and their consequences. Combat makes for a disappointing program — we approach it with great expectations, prepared to encounter essential truths of human existence, but we leave empty-handed.”

Novelist William Boyd reflects on how mortality shapes human existence: “I am convinced that what makes our species unique among the fauna of this small planet circling its insignificant star is that we know we are trapped in time, caught briefly between these two eternities of darkness, the prenatal darkness and the posthumous one.”

Philosopher and journalist Stephen Cave meditates on the reality, mystery, and meaning of death, from humans to flies: “Perhaps, as Tennyson believed, death’s relentless reaping should lead us to question the existence of some higher meaning — one above, beyond or external to us. But whoever thought there was such a thing anyway? Not the frogs and tadpoles. . . . Because life is so teeming with intentions and meanings, the death of each creature really is a catastrophe. But we must live with it anyway.”

Paul Kingsnorth, co-founder of the Dark Mountain Project and co-author of Uncivilisation: The Dark Mountain Manifesto, discusses his defeatist position on climate change and the liberation to be found in giving up hope.

Journalist Matt Stroud delves into the unbelievable life and death of Michael C. Ruppert: “After decades of struggle, the notorious doomsayer finally found fame and recognition. Then he shot himself.” (Also see my reflections, in a post published five years ago, on Ruppert’s startling ascent to mainstream fame via the movie Collapse.)

Historian and writer Rebecca Onion looks at how 1980s childhoods changed the way America thought about nuclear Armageddon, with an extended analysis of the role of the 1983 television movie The Day After, which utterly freaked out my 13-year-old self.

Jacob Silverman reflects on the dystopian plight of office drones in the digital tech age: “[They are] more gadgeted-out than ever, but still facing the same struggle for essential benefits, wages, and dignity that workers have for generations. . . . Such are the perverse rewards we reap when we permit tech culture to become our culture. The profits and power flow to the platform owners and their political sponsors. We get the surveillance, the data mining, the soaring inequality, and the canned pep talks from bosses who have been upsold on analytics software. Without Gchat, Twitter, and Facebook — the great release valves of workaday ennui — the roofs of metropolitan skyscrapers would surely be filled with pallid young faces, wondering about the quickest way down.”

Seriously? We’re now entertaining the possibility of robot caregivers? Sociologist and tech expert Zeynep Tufekci is right: this is how to fail the third machine age.

You’ve seen me mention my love of My Dinner with Andre many times here. That’s why I’m so pleased to call attention to this brand new interview from On Point with “The Inscrutable, Ubiquitous Wallace Shawn.” It’s highly recommendable both for the way it offends common radio sensibilities (the whole thing gets off to a rocky start as the interviewer adopts a somewhat glib approach that apparently annoys Mr. Shawn) and for the depth of Shawn’s carefully expressed thoughts on everything from the heady joys of being a writer and articulating things you never knew were in your soul, to the changing nature of conversation in an age when everybody is perpetually interrupted by phone calls and text messages. There is also, of course, some discussion of his portrayal of Vizzini in The Princess Bride. (Oh, and also see the recent pieces on Shawn, and also Andre Gregory, and their new collaboration, in The Wall Street Journal, Vulture, and Salon.)

When I was a kid, my mother actually walked out of the theater during the heart-ripping scene in Indiana Jones and the Temple of Doom. Well, guess what? George Lucas and Steven Spielberg hate that movie’s notorious grimness and violence, too. Grantland unearths the history of why Temple of Doom turned out that way.

 

“Fire Head” image courtesy of Salvatore Vuono / FreeDigitalPhotos.net

Teeming Links – July 11, 2014

FireHead

Apologies for the dearth of posts during the week leading up to now. I have reached crunch time on both the mummy encyclopedia and the paranormal encyclopedia, and, in combination with the fact that just this week I started a new day job at a new (to me) college, my time will be limited in the near future. That said, weekly Teeming Links will continue appearing every Friday. I also have a number of great features lined up for publication, including a very long interview with psychedelic research pioneer James Fadiman (finished and currently in the editing and formatting stage) and the third installment of Dominik Irtenkauf’s “Sounds of Apocalypse” series.

 

TTB_divider

 

Niall Ferguson wonders whether the powers that be will transform the supposed “libertarian utopia” of the Internet into a totalitarian dystopia worthy of Fritz Lang’s Metropolis: “[T]he suspicion cannot be dismissed that, despite all the hype of the Information Age and all the brouhaha about Messrs. Snowden and Assange, the old hierarchies and new networks are in the process of reaching a quiet accommodation with one another, much as thrones and telephones did a century ago.”

Writer and former Omni editor-in-chief Keith Ferrell describes what he has learned from an experiment in living like an 11th-century farmer, or rather, like a post-apocalyptic survivor: “Our modern era’s dependence upon technology and, especially, chemical and motorised technology, has divorced most of us from soil and seeds and fundamental skills. . . . Planning and long-practised rhythms were at the core of the 11th-century farmer’s life; improvisation, much of it desperate, would be the heart of the post-apocalyptic farmer’s existence.”

In a world where the dominating goals of tech development are mobility and sociality, Nicholas Carr wonders what kinds of alternative technologies and devices we might have if the guiding values were to be stationary and solitary. (Personally, I can think of one such technology, though not an electronic one: the paper book.)

Speaking of which, Andrew Erdmann uses the vehicle of Hal Ashby’s classic 1979 film Being There to reflect on our collective descent into aliteracy and electronically induced infantile idiocy: “I consider myself fortunate that I experienced reading and thinking before the Internet, and the written word before PowerPoint. I like to think that these experiences afford me some self-defense despite my own use of the Blackberry and other technologies.”

Roberto Bolaño says books are the only homeland for the true writer.

Javier Marías says the only real reason to write a novel is because this “allows the novelist to spend much of his time in a fictional world, which is really the only or at least the most bearable place to be.”

The Vatican has formally recognized the International Association of Exorcists and approved their statutes.

In response to the above, Chris French, the prominent skeptic and specialist in the psychology of paranormal beliefs and psychological states, argues in The Guardian that possession is better understood in psychological rather than supernatural terms. (Chris, btw, is writing the entry on anomalistic psychology for my paranormal encyclopedia.)

BBC journalist David Robson offers a firsthand, participatory account of how scientists are using hypnosis to simulate possession and understand why some people believe they’re inhabited by paranormal beings.

Over at Boing Boing, Don Jolly profiles Shannon Taggart, photographer of séances, spirits, and ectoplasm: “Taggart is not a ‘believer,’ in the traditional sense, nor does she seem to debunk her subject. Rather, she presents a world where belief and unbelief are radically mediated by technology — and raises the possibility that in the age of omnipresent electronic image what is ‘true’ may be a much harder debate than the skeptics suppose.” (Shannon, btw, is writing the entries on thoughtography and Kirlian photography for my paranormal encyclopedia.)

Philosopher Bernardo Kastrup absolutely nails, in his typically lucid fashion, the reason why scientific materialism is baloney:

It’s a philosophical and not a logical interpretation of science. Science itself is just a study of the patterns and the regularities that we can observe in reality. It doesn’t carry with it an interpretation. . . . Scientific materialism is when you load the scientific observations of the regularities of nature with an ontological interpretation and you say, “What you’re observing here is matter outside of mind that has an existence that would still go on even if nobody were looking at it.” That is already an interpretation. It’s not really pure science anymore, and the essence of scientific materialism is [the idea] that the real world is outside of mind, it’s independent of mind, and particular arrangements of elements in that real world, namely, subatomic particles, generate mind, generate subjective experience. Now of course the only carrier of reality anyone can know is subjective experience. So materialism is a kind of projection, an abstraction and then a projection onto the world of something that is fundamentally beyond knowledge.

Awesomeness alert: Guillermo del Toro hints — nay, states — that there is still life in his At the Mountains of Madness dream project.

Journalist and novelist Joseph L. Flatley offers an engaging exploration of the real-life occult influence of Lovecraft’s fictional Necronomicon (with much info about, e.g., the origin of the Simonomicon and the theories of Donald Tyson).

 

“Fire Head” image courtesy of Salvatore Vuono / FreeDigitalPhotos.net

Teeming Links – June 13, 2014

FireHead

In a truly horrific instance of mythic and memetic madness, Slender Man has now inspired two teens to try to murder a friend and (apparently) a daughter to try to murder her mother. For those who aren’t aware, Slender Man is “perhaps the Internet’s best and scariest legend,” a daemonic/demonic monster that was created in full view over the past few years by a collective Internet fanbase and that now stands as “a distillation of the most frightening images and trends” in both supernatural folklore and current pop culture (especially the horror genre). In Slender Man we can trace the birth and evolution of a modern monster that has now, in some sense, stepped out of daimonic-folkloric hyperspace and into the literal-factual realm.

Slender_Man

Awesome: Behold the Dystopia Tracker: “Our goal is to document, with your help, all the predictions in literature, film or games that have been made for the future and what has become reality already.”

Arts and culture writer Scott Timberg worries that people who read will become monkish outcasts in a culture of digital distraction: “Are we doing our son a disservice by allowing him to become a deep and engaged reader? Are we raising a child for the 19th century rather than the 21st, training him on the harpsichord for an Auto-Tune world?”

A. O. Scott reflects deeply on the relationship between art, work, and financial success in a money-obsessed culture.

Roger Scruton analyzes the problem of scientism in the arts and humanities: “This is a sure sign of scientism — that the science precedes the question, and is used to redefine it as a question that the science can solve. . . . [There] are questions that deal with the ‘spirit,’ with Geist, and therefore with phenomena that lie outside the purview of experimental methods.”

Cultural and intellectual historian Sophia Rosenfeld observes that Americans have become tyrannized by too many “choices” in a culture dominated by the neoliberal market model of universal consumerism.

Mitch Horowitz traces the fascinating friendship and mutual inspiration of Timothy Leary and Marshall McLuhan.

Al Gore says the violations that Edward Snowden exposed are greater than the ones he committed.

Comparing_Religions_by_Jeffrey_Kripal

Jeffrey Kripal’s new book Comparing Religions is the first introductory textbook on comparative religion that makes a major place for the paranormal as such. Chapter 8, for example, is titled “The Religious Imagination and Its Paranormal Powers: Angels, Aliens, and Anomalies.” In a recent two-part review (see Part 1 and Part 2), religion scholar and UFO theorist David Halperin writes, “More than a textbook, it’s an initiatory journey. . . . Do UFOs figure in any other textbook of comparative religion? I haven’t seen it. . . . One of Kripal’s former students, he says while introducing his subject, ‘felt each day as she left class that her tennis shoes had just burst into flames, that she had just stepped onto some very dangerous, but very exciting ground.’ Some will surely have this reaction.” The publisher’s companion website for the book contains much of interest, including “Six Guidelines for Comparing Religions Responsibly” and detailed summaries of seven religious traditions.

 

“Fire Head” image courtesy of Salvatore Vuono / FreeDigitalPhotos.net
“Slender Man” image courtesy of mdl70 under Creative Commons / Flickr

Teeming Links – May 23, 2014

FireHead

Decline of religious belief means we need more exorcists, say Catholics: “The decline of religious belief in the West and the growth of secularism has ‘opened the window’ to black magic, Satanism and belief in the occult, the organisers of a conference on exorcism have said. The six-day meeting in Rome aims to train about 200 Roman Catholic priests from more than 30 countries in how to cast out evil from people who believe themselves to be in thrall to the Devil.”

Is there a ghost or monster? Is the weather always awful? Is the heroine a virginal saint prone to fainting? Is the villain a murderous tyrant with scary eyes? Are all non-white, non-middle class, non-Protestants portrayed as thoroughly frightening? Chances are you’re reading a Gothic novel.

The Return of Godzilla: “The first time Godzilla appeared, in 1954, Japan was still deep in the trauma of nuclear destruction. Hiroshima and Nagasaki were fresh and terrible memories. US nuclear tests in the Pacific had just rained more death down on Japanese fishermen. And here came the monster. Godzilla. The great force of nature from the deep. Swimming ashore. Stomping through Tokyo. Raising radioactive hell. Godzilla came back again and again. In movies and more. Now, maybe Fukushima’s nuclear disaster has roused the beast. It’s back.”

When you first heard the Snowden revelations about the NSA, did you just kind of shrug and feel like the whole thing merely confirmed what you already knew? This may be no accident: funded by the wealthy and powerful elite, Hollywood has acclimated us to the idea of a surveillance society.

Google Glass and related technologies will create the perfect Orwellian dystopia for workers: “In an office where everyone wears Glass, the very idea of workplace organizing will be utterly unimaginable, as every employee will be turned into an unwilling (perhaps even unwitting) informant for his or her superiors.”

Speaking of dystopias, James Howard Kunstler recently observed that it’s a true sign of the times when, in a society where our digital devices have basically become prosthetic extensions of our hands, it’s impossible to get anybody on the phone anymore.

Also speaking of dystopias, researchers are teaming with the U.S. Navy to develop robots that can make moral decisions. Meanwhile, scientists have no idea how to define human morality.

Net neutrality? Get real. It’s far too late to save the Internet: “The open Internet of legend is already winnowed to the last chaff. . . . To fear a ‘pay to play’ Internet because it will be less hospitable to competition and innovation is not just to board a ship that’s already sailed, but to prepay your cruise vacation down the river Styx.”

And anyway, as far as the Internet goes, it’s totally broken, including, especially, when it comes to security: “It’s hard to explain to regular people how much technology barely works, how much the infrastructure of our lives is held together by the IT equivalent of baling wire. Computers, and computing, are broken. . . . [A]ll computers are reliably this bad: the ones in hospitals and governments and banks, the ones in your phone, the ones that control light switches and smart meters and air traffic control systems. Industrial computers that maintain infrastructure and manufacturing are even worse. I don’t know all the details, but those who do are the most alcoholic and nihilistic people in computer security.”

Despite Wikipedia’s skeptical disinformation campaign against all paranormal matters, remote viewing is not pseudoscience, says Russell Targ, the field’s most prominent pioneer. What’s more, he easily eviscerates the Wikiskeptics with a revolutionary tool called evidence: “Jessica Utts is a statistics professor at the University of California, Irvine, and is president of the American Statistical Association. In writing for her part of a 1995 evaluation of our work for the CIA, she wrote: ‘Using the standards applied to any other area of science, it is concluded that psychic functioning has been well established’ . . . . [I]t should be clear that hundreds of people were involved in a 23-year, multi-million-dollar operational program at SRI, the CIA, DIA and two dozen intelligence officers at the army base at Ft. Meade. Regardless of the personal opinion of a Wikipedia editor, it is not logically coherent to trivialize this whole remote viewing undertaking as some kind of ‘pseudoscience.’ Besides me, there is a parade of Ph.D. physicists, psychologists, and heads of government agencies who think our work was valuable, though puzzling.”

And finally: “Mesmerists, Mediums, and Mind-readers” (pdf) — Psychologist and stage magician Peter Lamont provides a brief and thoroughly absorbing “history of extraordinary psychological feats, and their relevance for our concept of psychology and science.”

Image courtesy of Salvatore Vuono / FreeDigitalPhotos.net

The digital murder of the Gutenberg mind

Evolution_of_the_Book_Gutenberg

Here’s a double dose of dystopian cheer to accompany a warm and sunny Monday afternoon (or at least that’s the weather here in Central Texas).

First, Adam Kirsch, writing for The New Republic, in a piece dated May 2:

Everyone who ever swore to cling to typewriters, record players, and letters now uses word processors, iPods, and e-mail. There is no room for Bartlebys in the twenty-first century, and if a few still exist they are scorned. (Bartleby himself was scorned, which was the whole point of his preferring not to.) Extend this logic from physical technology to intellectual technology, and it seems almost like common sense to say that if we are not all digital humanists now, we will be in a few years. As the authors of Digital_Humanities write, with perfect confidence in the inexorability — and the desirability — of their goals, “the 8-page essay and the 25-page research paper will have to make room for the game design, the multi-player narrative, the video mash-up, the online exhibit and other new forms and formats as pedagogical exercises.”

. . . The best thing that the humanities could do at this moment, then, is not to embrace the momentum of the digital, the tech tsunami, but to resist it and to critique it. This is not Luddism; it is intellectual responsibility. Is it actually true that reading online is an adequate substitute for reading on paper? If not, perhaps we should not be concentrating on digitizing our books but on preserving and circulating them more effectively. Are images able to do the work of a complex discourse? If not, and reasoning is irreducibly linguistic, then it would be a grave mistake to move writing away from the center of a humanities education.

. . . The posture of skepticism is a wearisome one for the humanities, now perhaps more than ever, when technology is so confident and culture is so self-suspicious. It is no wonder that some humanists are tempted to throw off the traditional burden and infuse the humanities with the material resources and the militant confidence of the digital. The danger is that they will wake up one morning to find that they have sold their birthright for a mess of apps.

MORE: “The False Promise of the Digital Humanities”

Second, Will Self, writing for The Guardian, in a piece also dated May 2:

The literary novel as an art work and a narrative art form central to our culture is indeed dying before our eyes. Let me refine my terms: I do not mean narrative prose fiction tout court is dying — the kidult boywizardsroman and the soft sadomasochistic porn fantasy are clearly in rude good health. And nor do I mean that serious novels will either cease to be written or read. But what is already no longer the case is the situation that obtained when I was a young man. In the early 1980s, and I would argue throughout the second half of the last century, the literary novel was perceived to be the prince of art forms, the cultural capstone and the apogee of creative endeavour. The capability words have when arranged sequentially to both mimic the free flow of human thought and investigate the physical expressions and interactions of thinking subjects; the way they may be shaped into a believable simulacrum of either the commonsensical world, or any number of invented ones; and the capability of the extended prose form itself, which, unlike any other art form, is able to enact self-analysis, to describe other aesthetic modes and even mimic them. All this led to a general acknowledgment: the novel was the true Wagnerian Gesamtkunstwerk.

. . . [T]he advent of digital media is not simply destructive of the codex, but of the Gutenberg mind itself. There is one question alone that you must ask yourself in order to establish whether the serious novel will still retain cultural primacy and centrality in another 20 years. This is the question: if you accept that by then the vast majority of text will be read in digital form on devices linked to the web, do you also believe that those readers will voluntarily choose to disable that connectivity? If your answer to this is no, then the death of the novel is sealed out of your own mouth.

. . . I believe the serious novel will continue to be written and read, but it will be an art form on a par with easel painting or classical music: confined to a defined social and demographic group, requiring a degree of subsidy, a subject for historical scholarship rather than public discourse. . . . I’ve no intention of writing fictions in the form of tweets or text messages — nor do I see my future in computer-games design. My apprenticeship as a novelist has lasted a long time now, and I still cherish hopes of eventually qualifying. Besides, as the possessor of a Gutenberg mind, it is quite impossible for me to foretell what the new dominant narrative art form will be — if, that is, there is to be one at all.

MORE: “The Novel Is Dead (This Time It’s for Real)”

 

Image: Painting by John White Alexander (1856–1915); photo by Andreas Praefcke (own photograph) [Public domain], via Wikimedia Commons

The bias of scientific materialism and the reality of paranormal experience

Opened_Doors_to_Heaven

In my recent post about Jeff Kripal’s article “Visions of the Impossible,” I mentioned that biologist and hardcore skeptical materialist Jerry Coyne published a scathing response to Jeff’s argument soon after it appeared. For those who would like to keep up with the conversation, here’s the heart of Coyne’s response (which, in its full version, shows him offering several direct responses to several long passages that he quotes from Jeff’s piece):

For some reason the Chronicle of Higher Education, a weekly publication that details doings (and available jobs) in American academia, has shown a penchant for bashing science and promoting anti-materialist views. . . . I’m not sure why that is, but I suspect it has something to do with supporting the humanities against the dreaded incursion of science — the bogus disease of “scientism.”

That’s certainly the case with a big new article in the Chronicle, “Visions of the impossible: how ‘fantastic’ stories unlock the nature of consciousness,” by Jeffrey J. Kripal, a professor of religious studies at Rice University in Texas. Given his position, it’s not surprising that Kripal’s piece is an argument about Why There is Something Out There Beyond Science. And although the piece is long, I can summarize its thesis in two sentences (these are my words, not Kripal’s):

“People have had weird experiences, like dreaming in great detail about something happening before it actually does; and because these events can’t be explained by science, the most likely explanation is that they are messages from some non-material realm beyond our ken. If you combine that with science’s complete failure to understand consciousness, we must conclude that naturalism is not sufficient to understand the universe, and that our brains are receiving some sort of ‘transhuman signals.'”

That sounds bizarre, especially for a distinguished periodical, but anti-naturalism seems to be replacing postmodernism as the latest way to bash science in academia.

. . . But our brain is not anything like a radio. The information processed in that organ comes not from a transhuman ether replete with other people’s thoughts, but from signals sent from one neuron to another, ultimately deriving from the effect of our physical environment on our senses. If you cut your optic nerves, you go blind; if you cut the auditory nerves, you become deaf. Without such sensory inputs, whose mechanisms we understand well, we simply don’t get information from the spooky channels promoted by Kripal.

When science manages to find reliable evidence for that kind of clairvoyance, I’ll begin to pay attention. Until then, the idea of our brain as a supernatural radio seems like a kind of twentieth-century alchemy—the resort of those whose will to believe outstrips their respect for the facts.

Full article: “Science Is Being Bashed by Academics Who Should Know Better”

(An aside: Is it just me, or in his second paragraph above does Coyne effectively insult and dismiss the entire field of religious studies and all of the people who work in it?)

Jeff responded five days later in a second piece for the Chronicle, where he met Coyne’s criticisms head-on.

Superfluous humans in a world of smart machines

Robot_Hand_and_Earth_Globe

Remember Ray Bradbury’s classic dystopian short story “The Veldt” (excerpted here) with its nightmare vision of a soul-sapping high-technological future where monstrously narcissistic — and, as it turns out, sociopathic and homicidal — children resent even having to tie their own shoes and brush their own teeth, since they’re accustomed to having these things done for them by machines?

Remember Kubrick’s and Clarke’s 2001: A Space Odyssey, where HAL, the super-intelligent AI system that runs the spaceship Discovery, decides to kill the human crew that he has been created to serve, because he has realized/decided that humans are too defective and error-prone to be allowed to jeopardize the mission?

Remember that passage (which I’ve quoted here before) from John David Ebert’s The New Media Invasion in which Ebert identifies the dehumanizing technological trend that’s currently unfolding all around us? Humans, says Ebert, are becoming increasingly superfluous in a culture of technology worship:

Everywhere we look nowadays, we find the same worship of the machine at the expense of the human being, who always comes out of the equation looking like an inconvenient, leftover remainder: instead of librarians to check out your books for you, a machine will do it better; instead of clerks to ring up your groceries for you, a self-checkout will do it better; instead of a real live DJ on the radio, an electronic one will do the job better; instead of a policeman to write you a traffic ticket, a camera (connected to a computer) will do it better. In other words . . . the human being is actually disappearing from his own society, just as the automobile long ago caused him to disappear from the streets of his cities . . . . [O]ur society is increasingly coming to be run and operated by machines instead of people. Machines are making more and more of our decisions for us; soon, they will be making all of them.

Bear all of that in mind, and then read this, which is just the latest in a volley of media reports about the encroaching advent, both rhetorical and factual, of all these things in the real world:

A house that tracks your every movement through your car and automatically heats up before you get home. A toaster that talks to your refrigerator and announces when breakfast is ready through your TV. A toothbrush that tattles on kids by sending a text message to their parents. Exciting or frightening, these connected devices of the futuristic “smart” home may be familiar to fans of science fiction. Now the tech industry is making them a reality.

Mundane physical objects all around us are connecting to networks, communicating with mobile devices and each other to create what’s being called an “Internet of Things,” or IoT. Smart homes are just one segment — cars, clothing, factories and anything else you can imagine will eventually be “smart” as well.

. . . We won’t really know how the technology will change our lives until we get it into the hands of creative developers. “The guys who had been running mobile for 20 years had no idea that some developer was going to take the touchscreen and microphone and some graphical resources and turn a phone into a flute,” [Liat] Ben-Zur [of chipmaker Qualcomm] said.

The same may be true when developers start experimenting with apps for connected home appliances. “Exposing that, how your toothbrush and your water heater and your thermostat . . . are going to interact with you, with your school, that’s what’s next,” said Ben-Zur.

MORE: “The Internet of Things: Helping Smart Devices Talk to Each Other”

Image courtesy of Victor Habbick / FreeDigitalPhotos.net

Teeming Links – March 28, 2014

FireHead

It turns out that right as I was putting together last week’s Teeming Brain doom-and-gloom update, a new “official prophecy of doom” had just been issued from a very prominent and mainstream source: “Global warming will cause widespread conflict, displace millions of people and devastate the global economy. Leaked draft report from UN panel seen by The Independent is most comprehensive investigation into impact of climate change ever undertaken — and it’s not good news.”

Did President Obama really just try to defend the U.S. war in Iraq while delivering a speech criticizing Russia’s actions in Crimea and Ukraine? Why, yes. Yes, he did. (Quoth Bill Clinton at the 2012 Democratic Convention: “It takes some brass to attack a guy for doing what you did.”)

What used to be paranoid is now considered the essence of responsible parenting. Ours is an age of obsessive parental overprotectiveness.

FALSE: Mental illness is caused by a “chemical imbalance” in the brain. FALSE: The DSM, the psychiatric profession’s diagnostic Bible, is scientifically valid and reliable. The whole field of psychiatry is imploding before our eyes. (Also see this.)

And even as mainstream psychiatry is self-destructing, the orthodox gospel of healthy eating continues to crumble — a development now being tracked by mainstream journalism. Almost everything we’ve been told for the past four decades is wrong. In point of fact, fatty foods like butter and cheese are better for you than trans-fat margarines. There’s basically no link between fats and heart disease.

Meanwhile, researchers are giving psychedelics to cancer patients to help alleviate their despair — and it’s working:

They almost uniformly experienced a dramatic reduction in existential anxiety and depression, and an increased acceptance of the cancer, and the changes lasted a year or more and in some cases were permanent. . . . [Stephen] Ross [director of the Division of Alcoholism and Drug Abuse at Bellevue Hospital in New York] is part of a new generation of researchers who have re-discovered what scientists knew more than half a century ago: that psychedelics can be good medicine. . . . Scientists still don’t completely understand why psychedelics seem to offer a shortcut to spiritual enlightenment, allowing people to experience life-changing insights that they are often unable to achieve after decades of therapy. But researchers are hopeful that will change, and that the success of these new studies will signal a renaissance in research into these powerful mind-altering drugs.

Don’t look now, but the future is a social media-fied video game:

In five years’ time, all news articles will consist of a single coloured icon you click repeatedly to make info-nuggets fly out, accompanied by musical notes, like a cross between Flappy Bird and Newsnight. . . . Meanwhile, video games and social media will combine to create a world in which you unlock exciting advantages in real life by accruing followers and influence. Every major city will house a glamorous gentrified enclave to which only successful social brand identities (or “people” as they used to be known) with more than 300,000 followers will be permitted entry, and a load of cardboard boxes and dog shit on the outside for everybody else.

Deflating the digital humanists:

[To portray their work] as part of a Copernican turn in the humanities overstates the extent to which it is anything more than a very useful tool for quantifying cultural and intellectual trends. It’s a new way of gathering information about culture, rather than a new way of thinking about it or of understanding it — things for which we continue to rely on the analog humanities.

Science and “progress” can’t tell us how to live. They can’t address the deep meaning of life, the universe, and everything. So where to turn? How about philosophy, which is unendingly relevant:

We are deluged with information; we know how to track down facts in seconds; the scientific method produces new discoveries every day. But what does all that mean for us? . . . The grand forward push of human knowledge requires each of us to begin by trying to think independently, to recognize that knowledge is more than information, to see that we are moral beings who must closely interrogate both ourselves and the world we inhabit — to live, as Socrates recommended, an examined life.

Take this, all of you scoffers at Fortean phenomena (and/or at Sharknado): “When Animals Fall from the Sky: The Surprising Science of Animal Rain”

Finally, here’s a neat look at the evolution of popular American cinema in 3 minutes, underlaid by Grieg’s “In the Hall of the Mountain King”:

Image courtesy of Salvatore Vuono / FreeDigitalPhotos.net

Collapse and dystopia: Three recent updates on our possible future

Apocalypse_Wave

It looks like we can forget about “collapse fatigue,” the term — which I just now made up (or maybe not) — for the eventual exhaustion of the doom-and-collapse meme that has been raging its way through our collective public discourse and private psyches for the past decade-plus. I say this based on three recent items that have come to my attention spontaneously, as in, I didn’t go looking for them, but instead found them shoved into my awareness.

ONE: Just a couple of weeks ago, James Howard Kunstler asked “Are You Crazy to Continue Believing in Collapse?” — and answered, in sum, “No.”

TWO: Now a new collapse warning of rather epic proportions and pedigree has begun making its way through the online doom-o-sphere, starting with a piece in The Guardian:

A new study sponsored by Nasa’s Goddard Space Flight Center has highlighted the prospect that global industrial civilisation could collapse in coming decades due to unsustainable resource exploitation and increasingly unequal wealth distribution. Noting that warnings of ‘collapse’ are often seen to be fringe or controversial, the study attempts to make sense of compelling historical data showing that “the process of rise-and-collapse is actually a recurrent cycle found throughout history.” Cases of severe civilisational disruption due to “precipitous collapse – often lasting centuries – have been quite common.”

. . . By investigating the human-nature dynamics of these past cases of collapse, the project identifies the most salient interrelated factors which explain civilisational decline, and which may help determine the risk of collapse today: namely, Population, Climate, Water, Agriculture, and Energy.

These factors can lead to collapse when they converge to generate two crucial social features: “the stretching of resources due to the strain placed on the ecological carrying capacity”; and “the economic stratification of society into Elites [rich] and Masses (or ‘Commoners’) [poor].” These social phenomena have played “a central role in the character or in the process of the collapse,” in all such cases over “the last five thousand years.”

. . . Modelling a range of different scenarios, Motesharri and his colleagues conclude that under conditions “closely reflecting the reality of the world today . . . we find that collapse is difficult to avoid.”

FULL TEXT: “Nasa-funded study: industrial civilisation headed for ‘irreversible collapse’?”

The study highlights, in a manner reminiscent of dystopian science fiction, the specific way this division into Elites and Masses not only might play out but has played out in the histories of real societies and civilizations. Read the rest of this entry