Category Archives: Science & Technology

Science or sacrilege? The trouble with mummies


The mummified body of a Pre-dynastic Egyptian man known as Gebelein Man (formerly called Ginger) in the British Museum

Editing the mummy encyclopedia over the past year and a half has left me with a still-active internal radar that scans the media incessantly for mummy-related news, and a recent (May 20) piece in The Independent about a new exhibition at the British Museum titled “Ancient Lives, New Discoveries: Eight Mummies, Eight Stories” lit up the screen last week like an approaching aircraft carrier.

The teaser conveys the gist:

A blockbuster exhibition at the British Museum unwraps the mysteries of 5,000-year-old Egyptian mummies. Zoe Pilger is fascinated — but not sure they should be on show at all.

The article itself takes the form of an absorbing report in which Ms. Pilger, in addition to describing the exhibition’s content and execution in vivid detail, briefly summarizes the history of scientific mummy studies and the cultural phenomenon of “mummymania” that was ignited by the opening of Tutankhamun’s tomb in 1922.

She talks about mummy unrollings (the popular practice of unwrapping mummies as a public spectacle in the nineteenth and early twentieth centuries), the cultural myth of a “mummy curse,” the incorporation of the curse motif into Universal Studios’ classic 1932 film The Mummy, the use of imaging technology (specifically, CT scans) to conduct non-invasive examinations of mummies, the ancient naturally mummified Egyptian body known as Gebelein Man, Margaret Murray’s famous unwrapping of an Egyptian mummy in front of a crowd of 500 onlookers at the Manchester Museum, and the troubling ethical questions that surround the act of examining and displaying human remains like this at all.

Each of these issues is also discussed at length and in detail in the mummy encyclopedia, the last of them (the ethical conundrum) in two separate articles: one titled “Displaying Mummies,” by scientist and mummy researcher Heather Gill-Frerking, and the other titled “Collecting Mummies,” by literary scholar Richard Sugg (author of 2011’s Mummies, Cannibals and Vampires: The History of Corpse Medicine from the Renaissance to the Victorians).

I couldn’t help but be impressed at the way Ms. Pilger’s article, in the space of just 1300 words, offers an excellent primer on many crucial issues related to mummies and their scientific and cultural uses. I highly recommend reading the whole thing. Here’s a taste:

The first mummy on display is extraordinary. It seems charged with a supernatural energy and I half expected it to wake up. This young man was in his twenties or early thirties when he curled up into a foetal position and died. He was buried in a cemetery in Gebelein, Upper Egypt, and naturally mummified by the dry, hot sand. His remains are 5,000 years old but his presence is vivid. Most of his skin has been preserved: it covers his delicate bones. His feet are drawn up to his chest and his hands are cupped under his chin, as though pleading. He seems vulnerable, lit in a glass case in a dark room like a relic. He has been transformed into an object and put on display. To look at him provokes a primal feeling of horror. This is death made real.

. . . These bodies were not designed to be seen. There is a tyrannical tendency in Western culture to try to know everything — to decode, demystify, and disenchant even the most sacrosanct of secrets. A fascination with the “magic” of other cultures is coupled with a rationalist incredulity. We don’t believe, and yet we can’t stop investigating — historically, through violent methods. The curators of this exhibition seem aware of this danger. Rather than crude unwrappings as a form of public entertainment, these mummies are explored with scientific rigour and respect. Instead of revulsion, we are encouraged to feel a sense of shared humanity; they are dignified through the small detail of daily life — from the wigs they wore to the beer they drank. However, there is a feeling that they do not belong to us and should not be here.

MORE: “Egyptian mummies: Science or sacrilege?”

Image: Photo of Gebelein Man by Jack1956 at the English language Wikipedia [GFDL (http://www.gnu.org/copyleft/fdl.html) or CC-BY-SA-3.0 (http://creativecommons.org/licenses/by-sa/3.0/)], via Wikimedia Commons

 

‘Mummies around the World’ now available for preorder


I’m pleased to announce that my mummy encyclopedia is now available for preorder from the publisher, and also from Amazon, Barnes & Noble, and elsewhere. The scheduled publication date is November 30.

From the official publisher’s description:

Perfect for school and public libraries, this is the only reference book to combine pop culture with science to uncover the mystery behind mummies and the phenomenon of mummification.

Mortality and death have always fascinated humankind. Civilizations from all over the world have practiced mummification as a means of preserving life after death — a ritual which captures the imagination of scientists, artists, and laypeople alike. This comprehensive encyclopedia focuses on all aspects of mummies: their ancient and modern history; their scientific study; their occurrence around the world; the religious and cultural beliefs surrounding them; and their roles in literary and cinematic entertainment.

Author and horror guru Matt Cardin brings together 130 original articles written by an international roster of leading scientists and scholars to examine the art, science, and religious rituals of mummification throughout history. Through a combination of factual articles and topical essays, this book reviews cultural beliefs about death; the afterlife; and the interment, entombment, and cremation of human corpses in places like Egypt, Europe, Asia, and Central and South America. Additionally, the book covers the phenomenon of natural mummification, where environmental conditions result in the spontaneous preservation of human and animal remains.

Here’s an excerpt (slightly condensed) from my introduction to the book.

The digital murder of the Gutenberg mind


Here’s a double dose of dystopian cheer to accompany a warm and sunny Monday afternoon (or at least that’s the weather here in Central Texas).

First, Adam Kirsch, writing for The New Republic, in a piece dated May 2:

Everyone who ever swore to cling to typewriters, record players, and letters now uses word processors, iPods, and e-mail. There is no room for Bartlebys in the twenty-first century, and if a few still exist they are scorned. (Bartleby himself was scorned, which was the whole point of his preferring not to.) Extend this logic from physical technology to intellectual technology, and it seems almost like common sense to say that if we are not all digital humanists now, we will be in a few years. As the authors of Digital_Humanities write, with perfect confidence in the inexorability — and the desirability — of their goals, “the 8-page essay and the 25-page research paper will have to make room for the game design, the multi-player narrative, the video mash-up, the online exhibit and other new forms and formats as pedagogical exercises.”

. . . The best thing that the humanities could do at this moment, then, is not to embrace the momentum of the digital, the tech tsunami, but to resist it and to critique it. This is not Luddism; it is intellectual responsibility. Is it actually true that reading online is an adequate substitute for reading on paper? If not, perhaps we should not be concentrating on digitizing our books but on preserving and circulating them more effectively. Are images able to do the work of a complex discourse? If not, and reasoning is irreducibly linguistic, then it would be a grave mistake to move writing away from the center of a humanities education.

. . . The posture of skepticism is a wearisome one for the humanities, now perhaps more than ever, when technology is so confident and culture is so self-suspicious. It is no wonder that some humanists are tempted to throw off the traditional burden and infuse the humanities with the material resources and the militant confidence of the digital. The danger is that they will wake up one morning to find that they have sold their birthright for a mess of apps.

MORE: “The False Promise of the Digital Humanities”

Second, Will Self, writing for The Guardian, in a piece also dated May 2:

The literary novel as an art work and a narrative art form central to our culture is indeed dying before our eyes. Let me refine my terms: I do not mean narrative prose fiction tout court is dying — the kidult boywizardsroman and the soft sadomasochistic porn fantasy are clearly in rude good health. And nor do I mean that serious novels will either cease to be written or read. But what is already no longer the case is the situation that obtained when I was a young man. In the early 1980s, and I would argue throughout the second half of the last century, the literary novel was perceived to be the prince of art forms, the cultural capstone and the apogee of creative endeavour. The capability words have when arranged sequentially to both mimic the free flow of human thought and investigate the physical expressions and interactions of thinking subjects; the way they may be shaped into a believable simulacrum of either the commonsensical world, or any number of invented ones; and the capability of the extended prose form itself, which, unlike any other art form, is able to enact self-analysis, to describe other aesthetic modes and even mimic them. All this led to a general acknowledgment: the novel was the true Wagnerian Gesamtkunstwerk.

. . . [T]he advent of digital media is not simply destructive of the codex, but of the Gutenberg mind itself. There is one question alone that you must ask yourself in order to establish whether the serious novel will still retain cultural primacy and centrality in another 20 years. This is the question: if you accept that by then the vast majority of text will be read in digital form on devices linked to the web, do you also believe that those readers will voluntarily choose to disable that connectivity? If your answer to this is no, then the death of the novel is sealed out of your own mouth.

. . . I believe the serious novel will continue to be written and read, but it will be an art form on a par with easel painting or classical music: confined to a defined social and demographic group, requiring a degree of subsidy, a subject for historical scholarship rather than public discourse. . . . I’ve no intention of writing fictions in the form of tweets or text messages — nor do I see my future in computer-games design. My apprenticeship as a novelist has lasted a long time now, and I still cherish hopes of eventually qualifying. Besides, as the possessor of a Gutenberg mind, it is quite impossible for me to foretell what the new dominant narrative art form will be — if, that is, there is to be one at all.

MORE: “The Novel Is Dead (This Time It’s for Real)”

 

Image: Painting by John White Alexander (1856–1915); photo by Andreas Praefcke (own photograph) [Public domain], via Wikimedia Commons

The bias of scientific materialism and the reality of paranormal experience


In my recent post about Jeff Kripal’s article “Visions of the Impossible,” I mentioned that biologist and hardcore skeptical materialist Jerry Coyne published a scathing response to Jeff’s argument soon after it appeared. For those who would like to keep up with the conversation, here’s the heart of Coyne’s response (which, in its full version, shows him offering several direct responses to several long passages that he quotes from Jeff’s piece):

For some reason the Chronicle of Higher Education, a weekly publication that details doings (and available jobs) in American academia, has shown a penchant for bashing science and promoting anti-materialist views. . . . I’m not sure why that is, but I suspect it has something to do with supporting the humanities against the dreaded incursion of science — the bogus disease of “scientism.”

That’s certainly the case with a big new article in the Chronicle, “Visions of the impossible: how ‘fantastic’ stories unlock the nature of consciousness,” by Jeffrey J. Kripal, a professor of religious studies at Rice University in Texas. Given his position, it’s not surprising that Kripal’s piece is an argument about Why There is Something Out There Beyond Science. And although the piece is long, I can summarize its thesis in two sentences (these are my words, not Kripal’s):

“People have had weird experiences, like dreaming in great detail about something happening before it actually does; and because these events can’t be explained by science, the most likely explanation is that they are messages from some non-material realm beyond our ken. If you combine that with science’s complete failure to understand consciousness, we must conclude that naturalism is not sufficient to understand the universe, and that our brains are receiving some sort of ‘transhuman signals.'”

That sounds bizarre, especially for a distinguished periodical, but anti-naturalism seems to be replacing postmodernism as the latest way to bash science in academia.

. . . But our brain is not anything like a radio. The information processed in that organ comes not from a transhuman ether replete with other people’s thoughts, but from signals sent from one neuron to another, ultimately deriving from the effect of our physical environment on our senses. If you cut your optic nerves, you go blind; if you cut the auditory nerves, you become deaf. Without such sensory inputs, whose mechanisms we understand well, we simply don’t get information from the spooky channels promoted by Kripal.

When science manages to find reliable evidence for that kind of clairvoyance, I’ll begin to pay attention. Until then, the idea of our brain as a supernatural radio seems like a kind of twentieth-century alchemy—the resort of those whose will to believe outstrips their respect for the facts.

Full article: “Science Is Being Bashed by Academic Who Should Know Better”

(An aside: Is it just me, or does Coyne, in his second paragraph above, effectively insult and dismiss the entire field of religious studies and all of the people who work in it?)

Jeff responded five days later in a second piece for the Chronicle, where he met Coyne’s criticisms head-on.

Life guidance from Edward O. Wilson: ‘Search until you find a passion and go all out to excel in its expression’


Edward O. Wilson, 2003

Edward O. Wilson is of course most famous as the seminal thinker, author, and scientist behind the field of sociobiology, which he defined in his 1975 book Sociobiology: The New Synthesis as the “systematic study of the biological basis of all social behavior.” Although many valid criticisms have been made, and remain to be made, of the scientific reductionism built into sociobiology, this doesn’t mean Wilson isn’t a fascinating and inspiring figure in his own way, and these qualities come out beautifully in a recent interview he gave to the Harvard Gazette.

The conversation ends on a high note as Wilson reflects on his lifelong excitement over the field of biology and offers some piercingly excellent advice — framed in terms of science but applicable on a wider basis — for students and young people in search of a life calling:

Q: What is most exciting about your field right now?

A: I haven’t changed since I was a 17-year-old entering the University of Alabama. I’m still basically a boy who’s excited by what’s going on. We are on a little-known planet. We have knowledge of two million species, but for the vast majority we know only the name and a little bit of the anatomy. We don’t know anything at all about their biology. There are conservatively at least eight million species in all, and it’s probably much more than that because of the bacteria and archaea and microorganisms we’re just beginning to explore. The number of species remaining to be discovered could easily go into the tens of millions.

. . . Q: What lessons can a young student starting out today, looking at your career and thinking, “I want to make an impact like that” — what lessons can he or she extract from your life?

A: It almost sounds trite, you hear it so often, but you don’t see enough of it in college-age students, and that is to acquire a passion. You probably already have one, but if you haven’t got one, search until you find a passion and go all out to excel in its expression. With reference to biology and science, do the opposite of the military dictum and march away from the sound of guns. Don’t get too enamored by what’s happening right at this moment and science heroes doing great things currently. Learn from them, but think further down the line: Move to an area where you can be a pioneer.

MORE: “For E. O. Wilson, Wonders Never Cease”

 

Photo by Jim Harrison (PLoS) [CC-BY-2.5 (http://creativecommons.org/licenses/by/2.5)], via Wikimedia Commons

Scientism, the fantastic, and the nature of consciousness


Religion scholar Jeffrey Kripal is one of the most lucid and brilliant voices in the current cultural conversation about the relationship between science and the paranormal, and about the rehabilitation of the latter as an important concept and category after a century of scorn, derision, and dismissal by the gatekeepers of mainstream cultural and intellectual respectability. (And yes, we’ve referenced his work many times here at The Teeming Brain.)

Recently, The Chronicle Review, from The Chronicle of Higher Education, published a superb essay by him that has become a lightning rod for both passionate attack and equally passionate defense. It has even brought a strong response — a scornful one, of course — from no less a defender of scientistic orthodoxy than Jerry Coyne. I’ll say more about these things in another post later this week, but for now here’s a representative excerpt that makes two things abundantly clear: first, why this essay serves as a wonderful condensation of and/or introduction to Jeff’s essential 2010 book Authors of the Impossible: The Paranormal and the Sacred and its semi-sequel, 2011’s Mutants and Mystics: Science Fiction, Superhero Comics, and the Paranormal; and second, why it’s so significant that something like this would be published in a venue like The Chronicle Review. The intellectual orthodoxy of the day is clearly undergoing a radical transformation when a respected religion scholar at a respected university (Jeff currently holds the J. Newton Rayzor Chair in Philosophy and Religious Thought at Rice University) can say things like this in a publication like that:

Because we’ve invested our energy, time, and money in particle physics, we are finding out all sorts of impossible things. But we will not invest those resources in the study of anomalous states of cognition and consciousness, and so we continue to work with the most banal models of mind — materialist and mechanistic ones. While it is true that some brain research has gone beyond assuming that “mind equals brain” and that the psyche works like, or is, a computer, we are still afraid of the likelihood that we are every bit as bizarre as the quantum world, and that we possess fantastic capacities that we have allowed ourselves to imagine only in science fiction, fantasy literature, and comic books.

. . . In the rules of this materialist game, the scholar of religion can never take seriously what makes an experience or expression religious, since that would involve some truly fantastic vision of human nature and destiny, some transhuman divinization, some mental telegraphy, dreamlike soul, clairvoyant seer, or cosmic consciousness. All of that is taken off the table, in principle, as inappropriate to the academic project. And then we are told that there is nothing “religious” about religion, which, of course, is true, since we have just discounted all of that other stuff.

Our present flatland models have rendered human nature something like the protagonist Scott Carey in the film The Incredible Shrinking Man (1957). With every passing decade, human nature gets tinier and tinier and less and less significant. In a few more years, maybe we’ll just blip out of existence (like poor Scott at the end of the film), reduced to nothing more than cognitive modules, replicating DNA, quantum-sensitive microtubules in the synapses of the brain, or whatever. We are constantly reminded of the “death of the subject” and told repeatedly that we are basically walking corpses with computers on top — in effect, technological zombies, moist robots, meat puppets. We are in the ridiculous situation of having conscious intellectuals tell us that consciousness does not really exist as such, that there is nothing to it except cognitive grids, software loops, and warm brain matter. If this were not so patently absurd and depressing, it would be funny.

. . . We now have two models of the brain and its relationship to mind, an Aristotelian one and a Platonic one, both of which fit the neuroscientific data well enough: the reigning production model (mind equals brain), and the much older but now suppressed transmission or filter model (mind is experienced through or mediated, shaped, reduced, or translated by brain but exists in its own right “outside” the skull cavity).

. . . There are . . . countless . . . clues in the history of religions that rule the radio theory in, and that suggest, though hardly prove, that the human brain may function as a super-evolved neurological radio or television and, in rare but revealing moments when the channel suddenly “switches,” as an imperfect receiver of some transhuman signal that simply does not play by the rules as we know them.

Although it relies on an imperfect technological metaphor, the beauty of the radio or transmission model is that it is symmetrical, intellectually generous, and — above all — capable of demonstrating what we actually see in the historical data, when we really look.

MORE: “Visions of the Impossible”

 

Image courtesy of Dan / FreeDigitalPhotos.net

Superfluous humans in a world of smart machines


Remember Ray Bradbury’s classic dystopian short story “The Veldt” (excerpted here) with its nightmare vision of a soul-sapping high-technological future where monstrously narcissistic — and, as it turns out, sociopathic and homicidal — children resent even having to tie their own shoes and brush their own teeth, since they’re accustomed to having these things done for them by machines?

Remember Kubrick’s and Clarke’s 2001: A Space Odyssey, where HAL, the super-intelligent AI system that runs the spaceship Discovery, decides to kill the human crew that he has been created to serve, because he has realized/decided that humans are too defective and error-prone to be allowed to jeopardize the mission?

Remember that passage (which I’ve quoted here before) from John David Ebert’s The New Media Invasion in which Ebert identifies the dehumanizing technological trend that’s currently unfolding all around us? Humans, says Ebert, are becoming increasingly superfluous in a culture of technology worship:

Everywhere we look nowadays, we find the same worship of the machine at the expense of the human being, who always comes out of the equation looking like an inconvenient, leftover remainder: instead of librarians to check out your books for you, a machine will do it better; instead of clerks to ring up your groceries for you, a self-checkout will do it better; instead of a real live DJ on the radio, an electronic one will do the job better; instead of a policeman to write you a traffic ticket, a camera (connected to a computer) will do it better. In other words . . . the human being is actually disappearing from his own society, just as the automobile long ago caused him to disappear from the streets of his cities . . . . [O]ur society is increasingly coming to be run and operated by machines instead of people. Machines are making more and more of our decisions for us; soon, they will be making all of them.

Bear all of that in mind, and then read this, which is just the latest in a volley of media reports about the encroaching advent, both rhetorical and factual, of all these things in the real world:

A house that tracks your every movement through your car and automatically heats up before you get home. A toaster that talks to your refrigerator and announces when breakfast is ready through your TV. A toothbrush that tattles on kids by sending a text message to their parents. Exciting or frightening, these connected devices of the futuristic “smart” home may be familiar to fans of science fiction. Now the tech industry is making them a reality.

Mundane physical objects all around us are connecting to networks, communicating with mobile devices and each other to create what’s being called an “Internet of Things,” or IoT. Smart homes are just one segment — cars, clothing, factories and anything else you can imagine will eventually be “smart” as well.

. . . We won’t really know how the technology will change our lives until we get it into the hands of creative developers. “The guys who had been running mobile for 20 years had no idea that some developer was going to take the touchscreen and microphone and some graphical resources and turn a phone into a flute,” [Liat] Ben-Zur [of chipmaker Qualcomm] said.

The same may be true when developers start experimenting with apps for connected home appliances. “Exposing that, how your toothbrush and your water heater and your thermostat . . . are going to interact with you, with your school, that’s what’s next,” said Ben-Zur.

MORE: “The Internet of Things: Helping Smart Devices Talk to Each Other”

Image courtesy of Victor Habbick / FreeDigitalPhotos.net

Rebranding Giordano Bruno: How the new ‘Cosmos’ spins the history of religion and science

The updated/remade version of the classic Carl Sagan series Cosmos has been drawing lots of attention in the past few weeks, both positive and negative, and one of the areas that has come under the most scrutiny is the show’s inaccurate portrayal of Giordano Bruno, the sixteenth-century philosopher, occultist, mystic, and proto-scientist whose life and death (he was burned at the stake for heresy in 1600) have been exalted to legendary status in the Western cultural narrative of the war between religion and science. (This is despite the fact that the man’s name and memory have remained relatively obscure in mainstream popular awareness.)

Bruno held and taught a heliocentric view of the universe whose scope exceeded even Galileo’s attempt to build on the Copernican model, and the story that is commonly told today — including by the new Cosmos — is that he was a martyr for science in an age of benighted and militant ignorance, when religious authorities waged a merciless campaign against freedom of thought.

Many observers have weighed in on the problems with this approach to Bruno in the past few weeks. The chatter has been extensive enough that it has even drawn a response from one of the series’ co-writers.

One entry in the conversation that I find to be especially astute and important comes from the pen/word processor of Daily Beast writer and editor David Sessions, who argues that the Cosmos portrayal underscores our tendency to rewrite the past to conform to currently fashionable biases, ideologies, and cultural narratives — in this case, the very narrative of the “war between religion and science” itself, with religion framed as the villain and science as the hero:

Bruno, according to Cosmos, wandered around Europe, arguing passionately but fruitlessly for his new explanation of the universe, only to be mocked, impoverished, and eventually imprisoned and executed. Catholic authorities are depicted as cartoon ghouls, and introduced with sinister theme music. [Host Neil deGrasse] Tyson explains that the church’s modus operandi was to “investigate and torment anyone who voiced views that differed from theirs.”

What Cosmos doesn’t mention is that Bruno’s conflict with the Catholic Church was theological, not scientific, even if it did involve his wild — and occasionally correct — guesses about the universe. As Discover magazine’s Corey Powell pointed out, the philosophers of the 16th century weren’t anything like scientists in the modern sense. Bruno, for instance, was a “pandeist,” which is the belief that God had transformed himself into all matter and ceased to exist as a distinct entity in himself. He believed in all sorts of magic and spirits, and extrapolated those views far beyond his ideas about the infinity of the universe. In contrast to contemporaries who drew more modest conclusions from their similar ideas, Bruno agitated for an elaborate counter-theology, and was (unlike the poor, humble outcast portrayed in Cosmos) supported by powerful royal benefactors. The church didn’t even have a position on whether the Earth orbited the sun, and didn’t bring it up at Bruno’s trial. While the early-modern religious persecution certainly can’t be denied, Bruno was killed because he flamboyantly denied basic tenets of the Catholic faith, not because religious authorities were out to suppress all “freedom of thought.”

Cosmos’ treatment of Bruno as a “martyr for science” is just a small example of a kind of cultural myth we tell ourselves about the development of modern society, one that’s almost completely divorced from the messy reality. It’s a story of an upward march from ignorance and darkness, where bold, rebel intellectuals like Bruno faced down the tyrannical dogma of religion and eventually gave us secularism, democracy, and prosperity. Iconoclastic individuals are our heroes, and big, bad institutions — monarchies, patriarchies, churches — are the villains. In the process, our fascinating, convoluted history gets flattened into a kind of secular Bible story to remind us why individual freedom and “separation of church and state” are the most important things for us to believe in.

The real path to our modern selves is much more complicated — so complicated that academic historians still endlessly debate how it happened.

. . . [T]hat Cosmos added an unnecessary and skewed version of Bruno — especially one skewed in this particular way — is a good miniature lesson about our tendency to turn the past into propaganda for our preferred view of the present. There are cultural, religious, and even political reasons that the story of scientific progress and political enlightenment are [sic] so attractive, and filter down even into our children’s entertainment. It allows us to see ourselves as the apex of history, the culmination of an inevitable, upward surge of improvement. It reassures us that our political values are righteous, and reminds us who the enemies are. The messy, complex, non-linear movement of actual history, by contrast, is unsettling, humbling — even terrifying.

MORE: “How ‘Cosmos’ Bungles the History of Religion and Science”

For more on the subtle history of the relationship between religion and science, and also the whitewashed/propagandistic mainstream secular narrative about it, I recommend David Metcalfe’s Teeming Brain column De Umbris Idearum, whose title is in fact drawn from the work of Giordano Bruno. See especially “Humility and Silence: Where True Science and True Spirituality Meet” and “Science, Philosophy, Theology: If the Mirrors We Make Are Monstrous, So Too Are We.”

Jacques Ellul’s nightmare vision of a technological dystopia


It’s lovely to see one of my formative philosophical influences, and a man whose dystopian critique of technology is largely unknown to the populace at large these days — although it has deeply influenced such iconic cultural texts as Koyaanisqatsi — getting some mainstream attention (in The Boston Globe, two years ago):

Imagine for a moment that pretty much everything you think about technology is wrong. That the devices you believed are your friends are in fact your enemies. That they are involved in a vast conspiracy to colonize your mind and steal your soul. That their ultimate aim is to turn you into one of them: a machine.

It’s a staple of science fiction plots, and perhaps the fever dream of anyone who’s struggled too long with a crashing computer. But that nightmare vision is also a serious intellectual proposition, the legacy of a French social theorist who argued that the takeover by machines is actually happening, and that it’s much further along than we think. His name was Jacques Ellul, and a small but devoted group of followers consider him a genius.

To celebrate the centenary of his birth, a group of Ellul scholars will be gathering today at a conference to be held at Wheaton College near Chicago. The conference title: “Prophet in the Technological Wilderness.”

Ellul, who died in 1994, was the author of a series of books on the philosophy of technology, beginning with The Technological Society, published in France in 1954 and in English a decade later. His central argument is that we’re mistaken in thinking of technology as simply a bunch of different machines. In truth, Ellul contended, technology should be seen as a unified entity, an overwhelming force that has already escaped our control. That force is turning the world around us into something cold and mechanical, and — whether we realize it or not — transforming human beings along with it.

In an era of rampant technological enthusiasm, this is not a popular message, which is one reason Ellul isn’t well known. It doesn’t help that he refused to offer ready-made solutions for the problems he identified. His followers will tell you that neither of these things mean he wasn’t right; if nothing else, they say, Ellul provides one of the clearest existing analyses of what we’re up against. It’s not his fault it isn’t a pretty picture.

. . . Technology moves forward because we let it, he believed, and we let it because we worship it. “Technology becomes our fate only when we treat it as sacred,” says Darrell J. Fasching, a professor emeritus of religious studies at the University of South Florida. “And we tend to do that a lot.”

. . . “Ellul never opposed all participation in technology,” [says David Gill, founding president of the International Jacques Ellul Society and a professor of ethics at the Gordon-Conwell Theological Seminary]. “He didn’t live in the woods, he lived in a nice house with electric lights. He didn’t drive, but his wife did, and he rode in a car. But he knew how to create limits — he was able to say ‘no’ to technology. So using the Internet isn’t a contradiction. The point is that we have to say that there are limits.”

FULL STORY: “Jacques Ellul, technology doomsayer before his time”

Teeming Links – October 4, 2013

Image courtesy of Salvatore Vuono / FreeDigitalPhotos.net

To preface today’s (short but dense) collection of recommended and necessary reading, here’s a lengthy opening word about the ultimate closing word — which is to say, several excerpts from a recent article about the upsurge of apocalyptic themes in American entertainment. As we all know, there’s been a flood of articles and essays about this phenomenon in the past couple of years, many of them mentioned and linked to here. This one, which was published in print in The New York Times Magazine under the title “A Culture That Lurches About Within the Shadow of Its Own Extinction,” makes some particularly astute and interesting points, and does a particularly effective — and pithy — job of locating our apocalypse obsession within the wider history of that term’s signification. It also offers a credible and sobering reflection on what this obsession might portend:

As a form of disposable entertainment, the apocalypse market is booming. The question is why. The obvious answer is that these narratives tap into anxieties, conscious and otherwise, about the damage we’re doing to our species and to the planet. They allow us to safely fantasize about what might be required of us to survive.

Of course, people have been running around screaming about the end of the world for as long as we’ve been around to take notes. But in the past, the purpose of these stories was essentially prophetic. They were intended to bring man into accord with the will of God, or at least his own conscience. The newest wave of apocalyptic visions, whether they’re intended to make us laugh or shriek, are nearly all driven by acts of sadistic violence. Rather than inspiring audiences to reckon with the sources of our potential planetary ruin, they proceed from the notion that the apocalypse will usher in an era of sanctified Darwinism: survival of the most weaponized.

. . . The word “apocalypse” did not always signify the end of the world. Its original Greek meaning was an unveiling, or a revelation, as of God’s will. . . . In this sense, apocalyptic literature can be seen as a subset of prophetic writing. The crucial difference is that prophets like Isaiah and Jeremiah lived among the people they preached to and promised them specific forms of deliverance in return for repentance. Apocalyptic writers despised the fallen world around them, or at least deemed it beyond repair, and thus looked to a future in which paradise for a select few was reached only by upheaval.

. . . It’s impossible to read Revelation today without viewing it as a kind of unintended template for the frantic and ornate mayhem that marks so many modern renditions of the apocalypse. The phantasmagoric battles rage on, in the heavens and on Earth, and Satan’s army grows ever more vivid and grotesque courtesy of C.G.I. These diversions offer no coherent moral agenda. They are what the English novelist and critic D. H. Lawrence called — in his book about Revelation — “death products,” elaborate revenge fantasies driven by “flamboyant hate and simple lust . . . for the end of the world.”

. . . It’s only natural that the apocalyptic canon has radically expanded in the past few decades. Never has our species been so besieged by doomsday scenarios. If our ancestors channeled their collective death instinct into religious myth, we now face a raft of scientific data that suggest the end might be truly nigh.

. . . Popular culture has moved beyond the prophetic phase represented by science fiction writers like Clarke or Ray Bradbury. We are deep into what D. H. Lawrence might have called the death-product era. For most of us, though, our obsession with the end times doesn’t arise from religious faith anymore. It is a secular impulse that marks a chilling regression.

Imagine, if you will, that a race of superior beings discovers Earth 10,000 years from now, or even 10 centuries, a world no longer inhabited by humans. In surveying the remains of our civilization, what would they make of a species so intellectually advanced as to understand the precise threats posed to its survival and yet so immature as to ignore these threats? And what of the vast troves they would find containing elaborate and childish simulations of our destruction?

It is entirely possible that they would look upon these artifacts not as harmless entertainments but dark prophecy.

— Steve Almond, “The Apocalypse Market Is Booming,” The New York Times, September 27, 2013

* * *

Civilization Has Lasted 5,000 Years. How About 5 Million? Scholars Ask (The Chronicle of Higher Education)
“On a recent humid day at the Library of Congress, a collection of astronomers, humanists, and writers set their eyes on a deadline well past the next debt showdown, or even the next election. They had a more distant horizon in mind. They had gathered in this 213-year-old institution to debate whether human civilization, which has had a good run over the past 5,000 years, could persist for longer than a geological blip in the Earth’s 4.5-billion-year history. . . . David Grinspoon, the day’s host and the inaugural chair of astrobiology at the Library of Congress’s John W. Kluge Center . . . had lured a panel of luminaries to wrestle with a loaded question: ‘Will we survive our world-changing technologies?'” (This article is located behind a subscriber paywall. Well worth seeking out if you can find access through a library or elsewhere.)

Fukushima Unit 4 Has Shown Signs of Collapsing (Disinfo)
In case you haven’t been following the news, there’s chatter emerging from various quarters that says the nuclear disaster situation at Fukushima may in fact be developing into, or perhaps already has developed into, the worst such disaster in history. Some people are talking in terms of an actual threat to human survival. Sensationalistic doom-mongering or authentic cause for concern? Click through, read this item at Disinfo (which links to several different news items and analyses), and mull it over for yourself. Also see the disturbing roundup of links about this subject over at The Daily Grail.

Intimations of Mortality (The Chronicle of Higher Education)
“Nothing seems to help. Not even writing about death decreases the fear of it. . . . Perhaps all thought and all art ultimately find their source in intimations of mortality.” A deeply absorbing and moving essay by Princeton literature professor emeritus Victor Brombert, who fought in World War II and has been intimately acquainted with death since childhood. Adapted from his new book Musings on Mortality: From Tolstoy to Primo Levi.

The Horror, the Horror: Thirty-eight centuries of supernatural lit. (Michael Dirda for The Weekly Standard)
Michael Dirda reviews and discusses S. T. Joshi’s two-volume study Unutterable Horror: A History of Supernatural Fiction. Simply delightful, and filled with valuable observations and asides from Michael himself, as in this: “Nothing human is alien to supernatural fiction. Transgressive by definition, it ventures into the dark corners within all of us, probing our sexuality, religious beliefs, and family relationships, uncovering shameful yearnings and anxieties, questioning the meaning of life and death, even speculating about the nature of the cosmos. It’s no surprise that almost every canonical writer one can think of has occasionally, or more than occasionally, dabbled in ghostly fiction: Charles Dickens, Henry James, Somerset Maugham, Elizabeth Bowen, John Cheever, even Russell Kirk, to name just a few outstanding examples. The genre’s best stories are, after all, more than divertissements. They are works of art that make us think about who and what we are. And, yes, they are also scary. Sometimes really scary.”

Frankenstein: Birth of a Monster (BBC)
Full 2003 documentary, very nicely done. From the BBC’s description page: “In life, as in literature, Mary Shelley’s famous monster, Frankenstein, overshadows its creator. The story of Frankenstein has become a modern myth, one which has developed a life of its own, mutating with every re-telling. . . . Using Mary’s own words and accounts from the people who knew her, and dramatic reconstructions of events in Mary’s life and from her famous novel, Frankenstein: Birth of a Monster tells the true story of Frankenstein’s monster and the remarkable woman who created him.”