Blog Archives

Teeming Links – May 23, 2014


Decline of religious belief means we need more exorcists, say Catholics: “The decline of religious belief in the West and the growth of secularism has ‘opened the window’ to black magic, Satanism and belief in the occult, the organisers of a conference on exorcism have said. The six-day meeting in Rome aims to train about 200 Roman Catholic priests from more than 30 countries in how to cast out evil from people who believe themselves to be in thrall to the Devil.”

Is there a ghost or monster? Is the weather always awful? Is the heroine a virginal saint prone to fainting? Is the villain a murderous tyrant with scary eyes? Are all non-white, non-middle class, non-Protestants portrayed as thoroughly frightening? Chances are you’re reading a Gothic novel.

The Return of Godzilla: “The first time Godzilla appeared, in 1954, Japan was still deep in the trauma of nuclear destruction. Hiroshima and Nagasaki were fresh and terrible memories. US nuclear tests in the Pacific had just rained more death down on Japanese fishermen. And here came the monster. Godzilla. The great force of nature from the deep. Swimming ashore. Stomping through Tokyo. Raising radioactive hell. Godzilla came back again and again. In movies and more. Now, maybe Fukushima’s nuclear disaster has roused the beast. It’s back.”

When you first heard the Snowden revelations about the NSA, did you just kind of shrug and feel like the whole thing merely confirmed what you already knew? This may be no accident: funded by the wealthy and powerful elite, Hollywood has acclimated us to the idea of a surveillance society.

Google Glass and related technologies will create the perfect Orwellian dystopia for workers: “In an office where everyone wears Glass, the very idea of workplace organizing will be utterly unimaginable, as every employee will be turned into an unwilling (perhaps even unwitting) informant for his or her superiors.”

Speaking of dystopias, James Howard Kunstler recently observed that it’s a true sign of the times when, in a society where our digital devices have basically become prosthetic extensions of our hands, it’s impossible to get anybody on the phone anymore.

Also speaking of dystopias, researchers are teaming with the U.S. Navy to develop robots that can make moral decisions. Meanwhile, scientists have no idea how to define human morality.

Net neutrality? Get real. It’s far too late to save the Internet: “The open Internet of legend is already winnowed to the last chaff. . . . To fear a ‘pay to play’ Internet because it will be less hospitable to competition and innovation is not just to board a ship that’s already sailed, but to prepay your cruise vacation down the river Styx.”

And anyway, as far as the Internet goes, it’s totally broken, including, especially, when it comes to security: “It’s hard to explain to regular people how much technology barely works, how much the infrastructure of our lives is held together by the IT equivalent of baling wire. Computers, and computing, are broken. . . . [A]ll computers are reliably this bad: the ones in hospitals and governments and banks, the ones in your phone, the ones that control light switches and smart meters and air traffic control systems. Industrial computers that maintain infrastructure and manufacturing are even worse. I don’t know all the details, but those who do are the most alcoholic and nihilistic people in computer security.”

Despite Wikipedia’s skeptical disinformation campaign against all paranormal matters, remote viewing is not pseudoscience, says Russell Targ, the field’s most prominent pioneer. What’s more, he easily eviscerates the Wikiskeptics with a revolutionary tool called evidence: “Jessica Utts is a statistics Professor at the University of California, Irvine, and is president of the American Statistical Association. In writing for her part of a 1995 evaluation of our work for the CIA, she wrote: ‘Using the standards applied to any other area of science, it is concluded that psychic functioning has been well established’ . . . . [I]t should be clear that hundreds of people were involved in a 23 year, multi-million dollar operational program at SRI, the CIA, DIA and two dozen intelligence officers at the army base at Ft. Meade. Regardless of the personal opinion of a Wikipedia editor, it is not logically coherent to trivialize this whole remote viewing undertaking as some kind of ‘pseudoscience.’ Besides me, there is a parade of Ph.D. physicists, psychologists, and heads of government agencies who think our work was valuable, though puzzling.”

And finally: “Mesmerists, Mediums, and Mind-readers” (pdf) — Psychologist and stage magician Peter Lamont provides a brief and thoroughly absorbing “history of extraordinary psychological feats, and their relevance for our concept of psychology and science.”

Image courtesy of Salvatore Vuono / FreeDigitalPhotos.net

The digital murder of the Gutenberg mind

[Image: Evolution of the Book (Gutenberg)]

Here’s a double dose of dystopian cheer to accompany a warm and sunny Monday afternoon (or at least that’s the weather here in Central Texas).

First, Adam Kirsch, writing for The New Republic, in a piece dated May 2:

Everyone who ever swore to cling to typewriters, record players, and letters now uses word processors, iPods, and e-mail. There is no room for Bartlebys in the twenty-first century, and if a few still exist they are scorned. (Bartleby himself was scorned, which was the whole point of his preferring not to.) Extend this logic from physical technology to intellectual technology, and it seems almost like common sense to say that if we are not all digital humanists now, we will be in a few years. As the authors of Digital_Humanities write, with perfect confidence in the inexorability — and the desirability — of their goals, “the 8-page essay and the 25-page research paper will have to make room for the game design, the multi-player narrative, the video mash-up, the online exhibit and other new forms and formats as pedagogical exercises.”

. . . The best thing that the humanities could do at this moment, then, is not to embrace the momentum of the digital, the tech tsunami, but to resist it and to critique it. This is not Luddism; it is intellectual responsibility. Is it actually true that reading online is an adequate substitute for reading on paper? If not, perhaps we should not be concentrating on digitizing our books but on preserving and circulating them more effectively. Are images able to do the work of a complex discourse? If not, and reasoning is irreducibly linguistic, then it would be a grave mistake to move writing away from the center of a humanities education.

. . . The posture of skepticism is a wearisome one for the humanities, now perhaps more than ever, when technology is so confident and culture is so self-suspicious. It is no wonder that some humanists are tempted to throw off the traditional burden and infuse the humanities with the material resources and the militant confidence of the digital. The danger is that they will wake up one morning to find that they have sold their birthright for a mess of apps.

MORE: “The False Promise of the Digital Humanities”

Second, Will Self, writing for The Guardian, in a piece also dated May 2:

The literary novel as an art work and a narrative art form central to our culture is indeed dying before our eyes. Let me refine my terms: I do not mean narrative prose fiction tout court is dying — the kidult boywizardsroman and the soft sadomasochistic porn fantasy are clearly in rude good health. And nor do I mean that serious novels will either cease to be written or read. But what is already no longer the case is the situation that obtained when I was a young man. In the early 1980s, and I would argue throughout the second half of the last century, the literary novel was perceived to be the prince of art forms, the cultural capstone and the apogee of creative endeavour. The capability words have when arranged sequentially to both mimic the free flow of human thought and investigate the physical expressions and interactions of thinking subjects; the way they may be shaped into a believable simulacrum of either the commonsensical world, or any number of invented ones; and the capability of the extended prose form itself, which, unlike any other art form, is able to enact self-analysis, to describe other aesthetic modes and even mimic them. All this led to a general acknowledgment: the novel was the true Wagnerian Gesamtkunstwerk.

. . . [T]he advent of digital media is not simply destructive of the codex, but of the Gutenberg mind itself. There is one question alone that you must ask yourself in order to establish whether the serious novel will still retain cultural primacy and centrality in another 20 years. This is the question: if you accept that by then the vast majority of text will be read in digital form on devices linked to the web, do you also believe that those readers will voluntarily choose to disable that connectivity? If your answer to this is no, then the death of the novel is sealed out of your own mouth.

. . . I believe the serious novel will continue to be written and read, but it will be an art form on a par with easel painting or classical music: confined to a defined social and demographic group, requiring a degree of subsidy, a subject for historical scholarship rather than public discourse. . . . I’ve no intention of writing fictions in the form of tweets or text messages — nor do I see my future in computer-games design. My apprenticeship as a novelist has lasted a long time now, and I still cherish hopes of eventually qualifying. Besides, as the possessor of a Gutenberg mind, it is quite impossible for me to foretell what the new dominant narrative art form will be — if, that is, there is to be one at all.

MORE: “The Novel Is Dead (This Time It’s for Real)”

Image: Painting: John White Alexander (1856–1915); Photo: Andreas Praefcke (own work) [Public domain], via Wikimedia Commons

The bias of scientific materialism and the reality of paranormal experience

[Image: Opened Doors to Heaven]

In my recent post about Jeff Kripal’s article “Visions of the Impossible,” I mentioned that biologist and hardcore skeptical materialist Jerry Coyne published a scathing response to Jeff’s argument soon after it appeared. For those who would like to keep up with the conversation, here’s the heart of Coyne’s response (the full version offers direct, point-by-point replies to the long passages it quotes from Jeff’s piece):

For some reason the Chronicle of Higher Education, a weekly publication that details doings (and available jobs) in American academia, has shown a penchant for bashing science and promoting anti-materialist views. . . . I’m not sure why that is, but I suspect it has something to do with supporting the humanities against the dreaded incursion of science — the bogus disease of “scientism.”

That’s certainly the case with a big new article in the Chronicle, “Visions of the impossible: how ‘fantastic’ stories unlock the nature of consciousness,” by Jeffrey J. Kripal, a professor of religious studies at Rice University in Texas. Given his position, it’s not surprising that Kripal’s piece is an argument about Why There is Something Out There Beyond Science. And although the piece is long, I can summarize its thesis in two sentences (these are my words, not Kripal’s):

“People have had weird experiences, like dreaming in great detail about something happening before it actually does; and because these events can’t be explained by science, the most likely explanation is that they are messages from some non-material realm beyond our ken. If you combine that with science’s complete failure to understand consciousness, we must conclude that naturalism is not sufficient to understand the universe, and that our brains are receiving some sort of ‘transhuman signals.'”

That sounds bizarre, especially for a distinguished periodical, but anti-naturalism seems to be replacing postmodernism as the latest way to bash science in academia.

. . . But our brain is not anything like a radio. The information processed in that organ comes not from a transhuman ether replete with other people’s thoughts, but from signals sent from one neuron to another, ultimately deriving from the effect of our physical environment on our senses. If you cut your optic nerves, you go blind; if you cut the auditory nerves, you become deaf. Without such sensory inputs, whose mechanisms we understand well, we simply don’t get information from the spooky channels promoted by Kripal.

When science manages to find reliable evidence for that kind of clairvoyance, I’ll begin to pay attention. Until then, the idea of our brain as a supernatural radio seems like a kind of twentieth-century alchemy—the resort of those whose will to believe outstrips their respect for the facts.

Full article: “Science Is Being Bashed by Academic Who Should Know Better”

(An aside: Is it just me, or in his second paragraph above does Coyne effectively insult and dismiss the entire field of religious studies and all the people who work in it?)

Jeff responded five days later in a second piece for the Chronicle, where he met Coyne’s criticisms head-on.

Superfluous humans in a world of smart machines

[Image: Robot Hand and Earth Globe]

Remember Ray Bradbury’s classic dystopian short story “The Veldt” (excerpted here) with its nightmare vision of a soul-sapping high-technological future where monstrously narcissistic — and, as it turns out, sociopathic and homicidal — children resent even having to tie their own shoes and brush their own teeth, since they’re accustomed to having these things done for them by machines?

Remember Kubrick’s and Clarke’s 2001: A Space Odyssey, where HAL, the super-intelligent AI system that runs the spaceship Discovery, decides to kill the human crew that he has been created to serve, because he has realized/decided that humans are too defective and error-prone to be allowed to jeopardize the mission?

Remember that passage (which I’ve quoted here before) from John David Ebert’s The New Media Invasion in which Ebert identifies the dehumanizing technological trend that’s currently unfolding all around us? Humans, says Ebert, are becoming increasingly superfluous in a culture of technology worship:

Everywhere we look nowadays, we find the same worship of the machine at the expense of the human being, who always comes out of the equation looking like an inconvenient, leftover remainder: instead of librarians to check out your books for you, a machine will do it better; instead of clerks to ring up your groceries for you, a self-checkout will do it better; instead of a real live DJ on the radio, an electronic one will do the job better; instead of a policeman to write you a traffic ticket, a camera (connected to a computer) will do it better. In other words . . . the human being is actually disappearing from his own society, just as the automobile long ago caused him to disappear from the streets of his cities . . . . [O]ur society is increasingly coming to be run and operated by machines instead of people. Machines are making more and more of our decisions for us; soon, they will be making all of them.

Bear all of that in mind, and then read this, which is just the latest in a volley of media reports about the encroaching advent, both rhetorical and factual, of all these things in the real world:

A house that tracks your every movement through your car and automatically heats up before you get home. A toaster that talks to your refrigerator and announces when breakfast is ready through your TV. A toothbrush that tattles on kids by sending a text message to their parents. Exciting or frightening, these connected devices of the futuristic “smart” home may be familiar to fans of science fiction. Now the tech industry is making them a reality.

Mundane physical objects all around us are connecting to networks, communicating with mobile devices and each other to create what’s being called an “Internet of Things,” or IoT. Smart homes are just one segment — cars, clothing, factories and anything else you can imagine will eventually be “smart” as well.

. . . We won’t really know how the technology will change our lives until we get it into the hands of creative developers. “The guys who had been running mobile for 20 years had no idea that some developer was going to take the touchscreen and microphone and some graphical resources and turn a phone into a flute,” [Liat] Ben-Zur [of chipmaker Qualcomm] said.

The same may be true when developers start experimenting with apps for connected home appliances. “Exposing that, how your toothbrush and your water heater and your thermostat . . . are going to interact with you, with your school, that’s what’s next,” said Ben-Zur.

MORE: “The Internet of Things: Helping Smart Devices Talk to Each Other”
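For the technically curious, the “talking to each other” in that passage is, at bottom, publish/subscribe messaging: a device publishes an event to a named topic, and anything subscribed to that topic reacts. Here is a minimal, self-contained Python sketch of the pattern. All of the names and topics are invented for illustration, and an in-process dictionary stands in for the network broker a real product would use (most commercial IoT devices speak a pub/sub protocol such as MQTT):

```python
# Illustrative sketch only: a toy in-memory "message bus" standing in for a
# real IoT broker. Topics, device names, and the MessageBus class are all
# hypothetical inventions for this example.

from collections import defaultdict


class MessageBus:
    """In-memory stand-in for an IoT message broker."""

    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of handler callbacks

    def subscribe(self, topic, handler):
        """Register a callback to run whenever a message arrives on this topic."""
        self._subscribers[topic].append(handler)

    def publish(self, topic, payload):
        """Deliver a message to every subscriber of the topic, in order."""
        for handler in self._subscribers[topic]:
            handler(topic, payload)


bus = MessageBus()


# One device publishes events to a named topic...
def toothbrush_report(kid, brushed):
    bus.publish("home/bathroom/toothbrush", {"kid": kid, "brushed": brushed})


# ...and any number of other devices or services subscribe and react.
def parent_notifier(topic, payload):
    if not payload["brushed"]:
        print(f"Text to parents: {payload['kid']} skipped brushing tonight.")


def event_logger(topic, payload):
    print(f"[{topic}] {payload}")


bus.subscribe("home/bathroom/toothbrush", parent_notifier)
bus.subscribe("home/bathroom/toothbrush", event_logger)

toothbrush_report("Alex", brushed=False)
# Prints:
#   Text to parents: Alex skipped brushing tonight.
#   [home/bathroom/toothbrush] {'kid': 'Alex', 'brushed': False}
```

Swap the in-memory bus for a real network broker and the tattling toothbrush from the article is just one more subscriber away.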

Image courtesy of Victor Habbick / FreeDigitalPhotos.net

Teeming Links – March 28, 2014


It turns out that right as I was putting together last week’s Teeming Brain doom-and-gloom update, a new “official prophecy of doom” had just been issued from a very prominent and mainstream source: “Global warming will cause widespread conflict, displace millions of people and devastate the global economy. Leaked draft report from UN panel seen by The Independent is most comprehensive investigation into impact of climate change ever undertaken — and it’s not good news.”

Did President Obama really just try to defend the U.S. war in Iraq while delivering a speech criticizing Russia’s actions in Crimea and Ukraine? Why, yes. Yes, he did. (Quoth Bill Clinton at the 2012 Democratic Convention: “It takes some brass to attack a guy for doing what you did.”)

What used to be paranoid is now considered the essence of responsible parenting. Ours is an age of obsessive parental overprotectiveness.

FALSE: Mental illness is caused by a “chemical imbalance” in the brain. FALSE: The DSM, the psychiatric profession’s diagnostic Bible, is scientifically valid and reliable. The whole field of psychiatry is imploding before our eyes. (Also see this.)

And even as mainstream psychiatry is self-destructing, the orthodox gospel of healthy eating continues to crumble — a development now being tracked by mainstream journalism. Almost everything we’ve been told for the past four decades is wrong. In point of fact, fatty foods like butter and cheese are better for you than trans-fat margarines. There’s basically no link between fats and heart disease.

Meanwhile, researchers are giving psychedelics to cancer patients to help alleviate their despair — and it’s working:

They almost uniformly experienced a dramatic reduction in existential anxiety and depression, and an increased acceptance of the cancer, and the changes lasted a year or more and in some cases were permanent. . . . [Stephen] Ross [director of the Division of Alcoholism and Drug Abuse at Bellevue Hospital in New York] is part of a new generation of researchers who have re-discovered what scientists knew more than half a century ago: that psychedelics can be good medicine. . . . Scientists still don’t completely understand why psychedelics seem to offer a shortcut to spiritual enlightenment, allowing people to experience life-changing insights that they are often unable to achieve after decades of therapy. But researchers are hopeful that will change, and that the success of these new studies will signal a renaissance in research into these powerful mind-altering drugs.

Don’t look now, but the future is a social media-fied video game:

In five years’ time, all news articles will consist of a single coloured icon you click repeatedly to make info-nuggets fly out, accompanied by musical notes, like a cross between Flappy Bird and Newsnight. . . . Meanwhile, video games and social media will combine to create a world in which you unlock exciting advantages in real life by accruing followers and influence. Every major city will house a glamorous gentrified enclave to which only successful social brand identities (or “people” as they used to be known) with more than 300,000 followers will be permitted entry, and a load of cardboard boxes and dog shit on the outside for everybody else.

Deflating the digital humanists:

[To portray their work] as part of a Copernican turn in the humanities overstates the extent to which it is anything more than a very useful tool for quantifying cultural and intellectual trends. It’s a new way of gathering information about culture, rather than a new way of thinking about it or of understanding it — things for which we continue to rely on the analog humanities.

Science and “progress” can’t tell us how to live. They can’t address the deep meaning of life, the universe, and everything. So where to turn? How about philosophy, which is unendingly relevant:

We are deluged with information; we know how to track down facts in seconds; the scientific method produces new discoveries every day. But what does all that mean for us? . . . The grand forward push of human knowledge requires each of us to begin by trying to think independently, to recognize that knowledge is more than information, to see that we are moral beings who must closely interrogate both ourselves and the world we inhabit — to live, as Socrates recommended, an examined life.

Take this, all of you scoffers at Fortean phenomena (and/or at Sharknado): “When Animals Fall from the Sky: The Surprising Science of Animal Rain”

Finally, here’s a neat look at the evolution of popular American cinema in 3 minutes, underlaid by Grieg’s “In the Hall of the Mountain King.”

Image courtesy of Salvatore Vuono / FreeDigitalPhotos.net

Collapse and dystopia: Three recent updates on our possible future

[Image: Apocalypse Wave]

It looks like we can forget about “collapse fatigue,” the term — which I just now made up (or maybe not) — for the eventual exhaustion of the doom-and-collapse meme that has been raging its way through our collective public discourse and private psyches for the past decade-plus. I say this based on three recent items that have come to my attention spontaneously, as in, I didn’t go looking for them, but instead found them shoved into my awareness.

ONE: Just a couple of weeks after James Howard Kunstler asked “Are You Crazy to Continue Believing in Collapse?” — and answered, in sum, “No” — we now see that

TWO: a new collapse warning of rather epic proportions and pedigree has begun making its way through the online doom-o-sphere, starting with a piece in The Guardian:

A new study sponsored by Nasa’s Goddard Space Flight Center has highlighted the prospect that global industrial civilisation could collapse in coming decades due to unsustainable resource exploitation and increasingly unequal wealth distribution. Noting that warnings of ‘collapse’ are often seen to be fringe or controversial, the study attempts to make sense of compelling historical data showing that “the process of rise-and-collapse is actually a recurrent cycle found throughout history.” Cases of severe civilisational disruption due to “precipitous collapse – often lasting centuries – have been quite common.”

. . . By investigating the human-nature dynamics of these past cases of collapse, the project identifies the most salient interrelated factors which explain civilisational decline, and which may help determine the risk of collapse today: namely, Population, Climate, Water, Agriculture, and Energy.

These factors can lead to collapse when they converge to generate two crucial social features: “the stretching of resources due to the strain placed on the ecological carrying capacity”; and “the economic stratification of society into Elites [rich] and Masses (or “Commoners”) [poor].” These social phenomena have played “a central role in the character or in the process of the collapse,” in all such cases over “the last five thousand years.”

. . . Modelling a range of different scenarios, Motesharri and his colleagues conclude that under conditions “closely reflecting the reality of the world today . . . we find that collapse is difficult to avoid.”

FULL TEXT: “Nasa-funded study: industrial civilisation headed for ‘irreversible collapse’?”

The study highlights, in a manner reminiscent of dystopian science fiction, the specific way this division into Elites and Masses not only might play out but has played out in the histories of real societies and civilizations.

Jacques Ellul’s nightmare vision of a technological dystopia

[Image: The Technological Society by Jacques Ellul]

It’s lovely to see one of my formative philosophical influences, and a man whose dystopian critique of technology is largely unknown to the populace at large these days — although it has deeply influenced such iconic cultural texts as Koyaanisqatsi — getting some mainstream attention (in The Boston Globe, two years ago):

Imagine for a moment that pretty much everything you think about technology is wrong. That the devices you believed are your friends are in fact your enemies. That they are involved in a vast conspiracy to colonize your mind and steal your soul. That their ultimate aim is to turn you into one of them: a machine.

It’s a staple of science fiction plots, and perhaps the fever dream of anyone who’s struggled too long with a crashing computer. But that nightmare vision is also a serious intellectual proposition, the legacy of a French social theorist who argued that the takeover by machines is actually happening, and that it’s much further along than we think. His name was Jacques Ellul, and a small but devoted group of followers consider him a genius.

To celebrate the centenary of his birth, a group of Ellul scholars will be gathering today at a conference to be held at Wheaton College near Chicago. The conference title: “Prophet in the Technological Wilderness.”

Ellul, who died in 1994, was the author of a series of books on the philosophy of technology, beginning with The Technological Society, published in France in 1954 and in English a decade later. His central argument is that we’re mistaken in thinking of technology as simply a bunch of different machines. In truth, Ellul contended, technology should be seen as a unified entity, an overwhelming force that has already escaped our control. That force is turning the world around us into something cold and mechanical, and — whether we realize it or not — transforming human beings along with it.

In an era of rampant technological enthusiasm, this is not a popular message, which is one reason Ellul isn’t well known. It doesn’t help that he refused to offer ready-made solutions for the problems he identified. His followers will tell you that neither of these things mean he wasn’t right; if nothing else, they say, Ellul provides one of the clearest existing analyses of what we’re up against. It’s not his fault it isn’t a pretty picture.

. . . Technology moves forward because we let it, he believed, and we let it because we worship it. “Technology becomes our fate only when we treat it as sacred,” says Darrell J. Fasching, a professor emeritus of religious studies at the University of South Florida. “And we tend to do that a lot.”

. . . “Ellul never opposed all participation in technology,” [says David Gill, founding president of the International Jacques Ellul Society and a professor of ethics at the Gordon-Conwell Theological Seminary]. “He didn’t live in the woods, he lived in a nice house with electric lights. He didn’t drive, but his wife did, and he rode in a car. But he knew how to create limits — he was able to say ‘no’ to technology. So using the Internet isn’t a contradiction. The point is that we have to say that there are limits.”

FULL STORY: “Jacques Ellul, technology doomsayer before his time”

Dystopian fiction is barely keeping pace with bio-engineering reality

[Image: MaddAddam by Margaret Atwood]

From a review essay on Margaret Atwood’s new novel MaddAddam, which completes her apocalyptic-dystopian trilogy that began in 2003 with Oryx and Crake:

You can take your pick of Cassandras: Michael Crichton, Mary Shelley, whoever made Gattaca. Literature and pop culture never stop obsessing about the bastard spawn of technology and biology, although movies love to have it both ways, wallowing happily in high-tech gadgetry even as they deplore its effects.

Feverish as all this artistic angst is, what’s remarkable is that it barely keeps pace with reality. We are hurtling ever faster toward a point of no return. Consider that, just earlier this year, MIT researchers managed to implant false memories in mice. Or that the now-common procedure of preimplantation genetic diagnosis (PGD) lets would-be parents in fertility treatment test their multiple embryos for defects and discard the embryos they don’t want. One of these days, we may also be able to slow down aging by stopping the degradation of telomeres. (Telomeres are the caps on the ends of chromosomes that keep them from fraying.)

. . . Given how close reality has come to surpassing imagination, what do the Atwoods of the world have to offer? Only what good novelists have always offered: a sense of the tragic, a respect for the power of malevolence, a grasp of how things go awry. In her most recent works, a trilogy in the anti-utopian tradition of Brave New World and 1984 that she began with Oryx and Crake in 2003 and ended this September with MaddAddam, transhumanism meets capitalism. In place of Orwell’s totalitarian state, Atwood gives us an all-powerful genetic-engineering industry. Biotech corporations have superseded governments and turned criminal. Since they are so good at keeping people healthy, they have to come up with new profit centers, so they add viruses to their vitamins.

— Judith Shulevitz, “Margaret Atwood: Our Most Important Prophet of Doom,” The New Republic, September 25, 2013

Also see the September 20 radio interview with Atwood (nearly an hour long, downloadable or streamable) on NPR’s On Point:

Margaret Atwood writes “speculative fiction” — but don’t call it science fiction, she says. It could all happen. And maybe it is. Her latest novel is the culmination of a mind-bending trilogy story of the end of the world that seems all too hideously possible. The world, debauched and wrecked by human over-reach. A designer plague has wiped out almost all of old humanity. Gene-altered pigs and a successor race of leaf-eating humanoids are all over. A new Genesis story is unfolding. For a new world. Up next On Point: novelist Margaret Atwood, and after us.

— “Margaret Atwood Will Make You Afraid of Her Tomorrow,” On Point, NPR, September 20, 2013

Dystopia now: We’re living in (and living out) a real-life “Harrison Bergeron” scenario

By Butenkova Olga (Own work) [CC0], via Wikimedia Commons

Rebecca Solnit, writing in London Review of Books:

In or around June 1995 human character changed again. Or rather, it began to undergo a metamorphosis that is still not complete, but is profound — and troubling, not least because it is hardly noted. When I think about, say, 1995, or whenever the last moment was before most of us were on the internet and had mobile phones, it seems like a hundred years ago.

. . . Previous technologies have expanded communication. But the last round may be contracting it. The eloquence of letters has turned into the unnuanced spareness of texts; the intimacy of phone conversations has turned into the missed signals of mobile phone chat. I think of that lost world, the way we lived before these new networking technologies, as having two poles: solitude and communion. The new chatter puts us somewhere in between, assuaging fears of being alone without risking real connection. It is a shallow between two deep zones, a safe spot between the dangers of contact with ourselves, with others.

I live in the heart of it, and it’s normal to walk through a crowd — on a train, or a group of young people waiting to eat in a restaurant — in which everyone is staring at the tiny screens in their hands. It seems less likely that each of the kids waiting for the table for eight has an urgent matter at hand than that this is the habitual orientation of their consciousness. At times I feel as though I’m in a bad science fiction movie where everyone takes orders from tiny boxes that link them to alien overlords. Which is what corporations are anyway, and mobile phones decoupled from corporations are not exactly common.

. . . A short story that comes back to me over and over again is Kurt Vonnegut’s ‘Harrison Bergeron’, or one small bit of it. Since all men and women aren’t exactly created equal, in this dystopian bit of science fiction a future America makes them equal by force: ballerinas wear weights so they won’t be more graceful than anyone else, and really smart people wear earpieces that produce bursts of noise every few minutes to interrupt their thought processes. They are ‘required by law to wear it at all times. It was tuned to a government transmitter. Every twenty seconds or so, the transmitter would send out some sharp noise to keep people like George from taking unfair advantage of their brains.’ For the smartest person in Vonnegut’s story, the radio transmitter isn’t enough: ‘Instead of a little ear radio for a mental handicap, he wore a tremendous pair of earphones, and spectacles with thick wavy lenses. The spectacles were intended to make him not only half blind, but to give him whanging headaches besides.’

We have all signed up to wear those earpieces, a future form of new media that will chop our consciousnesses into small dice. Google has made real the interruptors that Vonnegut thought of as a fantasy evil for his dystopian 2081.

MORE: “Diary: In the Day of the Postman

Teeming Links – August 30, 2013

Image courtesy of Salvatore Vuono / FreeDigitalPhotos.net

Today’s opening word is actually double: two opening words. The first is from John Michael Greer, writing with his typically casual and powerful lucidity. The second is from international studies expert Charles Hill, who writes with equal power. They’re lengthy, so please feel free to skip on down to the list of links. But I think you’ll find something interesting if you first read these excerpts, and ruminate on them, and see if you can spot a deep connection between them.

First, from Mr. Greer:

Plunge into the heart of the fracking storm . . . and you’ll find yourself face to face with a foredoomed attempt to maintain one of the core beliefs of the civil religion of progress in the teeth of all the evidence. The stakes here go far beyond making a bunch of financiers their umpteenth million, or providing believers in the myth of progress with a familiar ritual drama to bolster their faith; they cut straight to the heart of that faith, and thus to some of the most fundamental presuppositions that are guiding today’s industrial societies along their road to history’s scrapheap.

. . . The implication that has to be faced is that the age of petroleum, and everything that unfolded from it, was exactly the same sort of temporary condition as the age of antibiotics and the Green Revolution. Believers in the religion of progress like to think that Man conquered distance and made the world smaller by inventing internal combustion engines, aircraft, and an assortment of other ways to burn plenty of petroleum products. What actually happened, though, was that drilling rigs and a few other technologies gave our species a temporary boost of cheap liquid fuel to play with, and we proceeded to waste most of it on the assumption that Nature’s energy resources had been conquered and could be expected to fork over another cheap abundant energy source as soon as we wanted one.

. . . [T]he fact that Wall Street office fauna are shoveling smoke about, ahem, “limitless amounts of oil and natural gas” from fracked wells, may make them their umpteenth million and keep the clueless neatly sedated for a few more years, but it’s not going to do a thing to change the hard facts of the predicament that’s closing around us all.

— John Michael Greer, “Terms of Surrender,” The Archdruid Report, August 28, 2013

Second, from Dr. Hill:

This vast societal transformation might be called “The Great Virtue Shift.” Almost every act regarded in the mid-20th century as a vice was, by the opening of the 21st century, considered a virtue. As gambling, obscenity, pornography, drugs, divorce, homosexuality, abortion and sneering disaffection became The New Virtue, government at all levels began to move in on the action, starting with casinos and currently involving, in several states and the District of Columbia, an officially approved and bureaucratically managed narcotics trade.

The Great Virtue Shift has produced among its practitioners the appearance of profound moral concern, caring and legislated activism on behalf of the neediest cases and most immiserated populations at home and around the world. To this may be added the panoply of social agenda issues designed to ignite resentment and righteous indignation among the new “proletarian” elite. All this works to satisfy the cultural elite’s desire to feel morally superior about itself regarding collective moral issues of large magnitude even as they, as individuals, engage in outsized self-indulgent personal behavior.

. . . There is a logic chain at work here, too: a lack of self-limitation on individual liberty will produce excess and coarseness; virtue will retreat and, as it does, hypocritical moralizing about society’s deficiencies will increase. Widening irresponsibility coupled with public pressure for behavior modification will mount and be acted upon by government. The consequential loss of liberty scarcely will be noticed by the mass of people now indulging themselves, as Tocqueville predicted, in the “small and vulgar pleasures with which they fill their souls.” We will not as a result be ruled by tyrants but by schoolmasters in suits with law degrees, and be consoled in the knowledge that we ourselves elected them.

To retain liberty, or by now to repossess it, Americans must re-educate themselves in what has been made of Burke’s precept: “Liberty must be limited in order to be possessed.” Walt Whitman re-formulated this as, “The shallow consider liberty a release from all law, from every constraint. The wise man sees in it, on the contrary, the potent Law of Laws.” Learning what liberty is and what it requires of us is the only bulwark, ultimately, against American decadence. Pay no heed to the determinists: The choice is ours to make.

— Charles Hill, “On Decadence,” The American Interest, September/October 2013

If you made a Venn diagram out of Hill’s and Greer’s respective ruminations, and if you meditated for a while on the shared middle ground between them, you might find something that would insightfully illuminate a lot of the material below.

* * *

Is America Addicted to War? (Foreign Policy, April 2011)
This exploration of “the top 5 reasons why we keep getting into foolish fights,” written by Harvard international affairs professor Stephen M. Walt in response to the United States’ military intervention in Libya’s civil war, is obviously and pointedly relevant to what’s going on right now with the Syria situation. “Why does this keep happening? Why do such different presidents keep doing such similar things? How can an electorate that seemed sick of war in 2008 watch passively while one war escalates in 2009 and another one gets launched in 2011? How can two political parties that are locked in a nasty partisan fight over every nickel in the government budget sit blithely by and watch a president start running up a $100 million per day tab in this latest adventure? What is going on here?”

The real threat to our way of life? Not terrorists or faraway dictators, but our own politicians and securocrats (The Guardian)
“Convinced national security is for ever at risk, western governments mimic the fanaticism they claim to despise.”

The Leveraged Buyout of America (The Web of Debt Blog)
“Giant bank holding companies now own airports, toll roads, and ports; control power plants; and store and hoard vast quantities of commodities of all sorts. They are systematically buying up or gaining control of the essential lifelines of the economy. How have they pulled this off, and where have they gotten the money?”

Academy Fight Song (The Baffler)
This may be the most exhaustive, devastating, damning, dystopian, and dead-on essay-length critique of higher education in America that I’ve ever read. “Virtually every aspect of the higher-ed dream has been colonized by monopolies, cartels, and other unrestrained predators. . . . What actually will happen to higher ed, when the breaking point comes, will be an extension of what has already happened, what money wants to see happen. Another market-driven disaster will be understood as a disaster of socialism, requiring an ever deeper penetration of the university by market rationality.”

Why Teach English? (Adam Gopnik for The New Yorker)
“No sane person proposes or has ever proposed an entirely utilitarian, production-oriented view of human purpose. We cannot merely produce goods and services as efficiently as we can, sell them to each other as cheaply as possible, and die. . . . No civilization we think worth studying, or whose relics we think worth visiting, existed without what amounts to an English department — texts that mattered, people who argued about them as if they mattered, and a sense of shame among the wealthy if they couldn’t talk about them, at least a little, too. It’s what we call civilization.”

The Humanities Studies Debate (On Point with Tom Ashbrook)
A well-mounted, hour-long NPR radio debate. “Should American colleges and college students throw their resources, their minds, their futures, into the ancient pillars of learning — philosophy, language, literature, history, the arts. Or are those somehow less relevant, less urgent studies today in a hyper-competitive global economy? Defenders of the humanities say this is the very foundation of human insight. To study, as Socrates said, ‘the way one should live.’ Critics say: ‘Crunch some numbers. Get a job.’”

Paper Versus Pixel (Nicholas Carr for Nautilus)
“On the occasion of the inaugural Nautilus Quarterly, we asked Nicholas Carr to survey the prospects for a print publication. Here he shows why asking if digital publications will supplant printed ones is the wrong question. ‘We were probably mistaken to think of words on screens as substitutes for words on paper’ [says Carr]. ‘They seem to be different things, suited to different kinds of reading and providing different sorts of aesthetic and intellectual experiences.'”

Japan Opens ‘Fasting Camps’ To Wean Kids Off Of Excessive Internet Usage (International Business Times)
“A government study found that up to 15 percent of Japanese students spend as much as five hours online every day and even more time on the internet on weekends. As a result, the Tokyo government’s education ministry will introduce ‘web fasting camps’ to help young people disconnect from their PCs, laptops, mobile phones and hand-held devices.”

Cancer’s Primeval Power and Murderous Purpose (Bloomberg)
“It is a fundamental biological phenomenon. A single cell ‘decides’ (for lack of a better word) to strike off on its own. Mutation by mutation, it evolves — like a monster in the ecosystem of your body. Cancer is an occupying force with a will of its own. . . . What from the body’s point of view are dangerous mutations are, for the tumor, advantageous adaptations. . . . Susan Sontag called cancer ‘a demonic pregnancy,’ ‘a fetus with its own will.’ That is more than an arresting metaphor.”

New Exhibit Explains Why We’ve Been Fascinated By Witches For More Than 500 Years (The Huffington Post)
“A new exhibition at the Scottish National Gallery of Modern Art, aptly titled ‘Witches & Wicked Bodies,’ is paying homage to art’s heated affair with witches. The show dives into darker depictions of witches hidden in prints, drawings, paintings, sculptures and more, shedding light on attitudes perpetuated by everyone from Francisco de Goya to Paula Rego.”

Beyond the Veil: Otherworld Experience as Archaeological Research (Prehistoric Shamanism)
“By ignoring trance experience of the otherworld, anthropologists could only understand part of the world shamanic people lived in. As it turned out, the otherworld and the existence of the spirits informed pretty much everything these people did. . . . The otherworld of the spirits that prehistoric people experienced is not made up, or a figment of a deluded mind, but is something wired into the brains of every human.”

William Gaines and the Birth of Horror Comics (Mysterious Universe)
“Through comics, films and television, ‘Tales From The Crypt’ and EC Comics have proven to be an enduring pop culture franchise and one that’s dear to the heart of many horror fans. Its legacy continues to manifest itself through the innumerable writers, directors and artists whose childhoods were shaped by nights reading those gloriously gruesome early comics by flashlight under the blankets.”