Religion scholar Jeffrey Kripal is one of the most lucid and brilliant voices in the current cultural conversation about the relationship between science and the paranormal, and about the rehabilitation of the latter as an important concept and category after a century of scorn, derision, and dismissal by the gatekeepers of mainstream cultural and intellectual respectability. (And yes, we’ve referenced his work many times here at The Teeming Brain.)
Recently, The Chronicle Review, from The Chronicle of Higher Education, published a superb essay by him that has become a lightning rod for both passionate attack and equally passionate defense. It has even brought a strong response — a scornful one, of course — from no less a defender of scientistic orthodoxy than Jerry Coyne. I’ll say more about these things in another post later this week, but for now here’s a representative excerpt that makes two things abundantly clear: first, why this essay serves as a wonderful condensation of and/or introduction to Jeff’s essential 2010 book Authors of the Impossible: The Paranormal and the Sacred and its semi-sequel, 2011’s Mutants and Mystics: Science Fiction, Superhero Comics, and the Paranormal; and second, why it’s so significant that something like this would be published in a venue like The Chronicle Review. The intellectual orthodoxy of the day is clearly undergoing a radical transformation when a respected religion scholar at a respected university (Jeff currently holds the J. Newton Rayzor Chair in Philosophy and Religious Thought at Rice University) can say things like this in a publication like that:
Because we’ve invested our energy, time, and money in particle physics, we are finding out all sorts of impossible things. But we will not invest those resources in the study of anomalous states of cognition and consciousness, and so we continue to work with the most banal models of mind — materialist and mechanistic ones. While it is true that some brain research has gone beyond assuming that “mind equals brain” and that the psyche works like, or is, a computer, we are still afraid of the likelihood that we are every bit as bizarre as the quantum world, and that we possess fantastic capacities that we have allowed ourselves to imagine only in science fiction, fantasy literature, and comic books.
. . . In the rules of this materialist game, the scholar of religion can never take seriously what makes an experience or expression religious, since that would involve some truly fantastic vision of human nature and destiny, some transhuman divinization, some mental telegraphy, dreamlike soul, clairvoyant seer, or cosmic consciousness. All of that is taken off the table, in principle, as inappropriate to the academic project. And then we are told that there is nothing “religious” about religion, which, of course, is true, since we have just discounted all of that other stuff.
Our present flatland models have rendered human nature something like the protagonist Scott Carey in the film The Incredible Shrinking Man (1957). With every passing decade, human nature gets tinier and tinier and less and less significant. In a few more years, maybe we’ll just blip out of existence (like poor Scott at the end of the film), reduced to nothing more than cognitive modules, replicating DNA, quantum-sensitive microtubules in the synapses of the brain, or whatever. We are constantly reminded of the “death of the subject” and told repeatedly that we are basically walking corpses with computers on top — in effect, technological zombies, moist robots, meat puppets. We are in the ridiculous situation of having conscious intellectuals tell us that consciousness does not really exist as such, that there is nothing to it except cognitive grids, software loops, and warm brain matter. If this were not so patently absurd and depressing, it would be funny.
. . . We now have two models of the brain and its relationship to mind, an Aristotelian one and a Platonic one, both of which fit the neuroscientific data well enough: the reigning production model (mind equals brain), and the much older but now suppressed transmission or filter model (mind is experienced through or mediated, shaped, reduced, or translated by brain but exists in its own right “outside” the skull cavity).
. . . There are . . . countless . . . clues in the history of religions that rule the radio theory in, and that suggest, though hardly prove, that the human brain may function as a super-evolved neurological radio or television and, in rare but revealing moments when the channel suddenly “switches,” as an imperfect receiver of some transhuman signal that simply does not play by the rules as we know them.
Although it relies on an imperfect technological metaphor, the beauty of the radio or transmission model is that it is symmetrical, intellectually generous, and — above all — capable of demonstrating what we actually see in the historical data, when we really look.
MORE: “Visions of the Impossible”
Remember Ray Bradbury’s classic dystopian short story “The Veldt” (excerpted here) with its nightmare vision of a soul-sapping high-technological future where monstrously narcissistic — and, as it turns out, sociopathic and homicidal — children resent even having to tie their own shoes and brush their own teeth, since they’re accustomed to having these things done for them by machines?
Remember Kubrick’s and Clarke’s 2001: A Space Odyssey, where HAL, the super-intelligent AI system that runs the spaceship Discovery, decides to kill the human crew that he has been created to serve, because he has realized/decided that humans are too defective and error-prone to be allowed to jeopardize the mission?
Remember that passage (which I’ve quoted here before) from John David Ebert’s The New Media Invasion in which Ebert identifies the dehumanizing technological trend that’s currently unfolding all around us? Humans, says Ebert, are becoming increasingly superfluous in a culture of technology worship:
Everywhere we look nowadays, we find the same worship of the machine at the expense of the human being, who always comes out of the equation looking like an inconvenient, leftover remainder: instead of librarians to check out your books for you, a machine will do it better; instead of clerks to ring up your groceries for you, a self-checkout will do it better; instead of a real live DJ on the radio, an electronic one will do the job better; instead of a policeman to write you a traffic ticket, a camera (connected to a computer) will do it better. In other words . . . the human being is actually disappearing from his own society, just as the automobile long ago caused him to disappear from the streets of his cities . . . . [O]ur society is increasingly coming to be run and operated by machines instead of people. Machines are making more and more of our decisions for us; soon, they will be making all of them.
Bear all of that in mind, and then read this, which is just the latest in a volley of media reports about the encroaching advent, both rhetorical and factual, of all these things in the real world:
A house that tracks your every movement through your car and automatically heats up before you get home. A toaster that talks to your refrigerator and announces when breakfast is ready through your TV. A toothbrush that tattles on kids by sending a text message to their parents. Exciting or frightening, these connected devices of the futuristic “smart” home may be familiar to fans of science fiction. Now the tech industry is making them a reality.
Mundane physical objects all around us are connecting to networks, communicating with mobile devices and each other to create what’s being called an “Internet of Things,” or IoT. Smart homes are just one segment — cars, clothing, factories and anything else you can imagine will eventually be “smart” as well.
. . . We won’t really know how the technology will change our lives until we get it into the hands of creative developers. “The guys who had been running mobile for 20 years had no idea that some developer was going to take the touchscreen and microphone and some graphical resources and turn a phone into a flute,” [Liat] Ben-Zur [of chipmaker Qualcomm] said.
The same may be true when developers start experimenting with apps for connected home appliances. “Exposing that, how your toothbrush and your water heater and your thermostat . . . are going to interact with you, with your school, that’s what’s next,” said Ben-Zur.
The updated/remade version of the classic Carl Sagan series Cosmos has been drawing lots of attention in the past few weeks, both positive and negative, and one of the areas that has come under the most scrutiny is the show’s inaccurate portrayal of Giordano Bruno, the sixteenth-century philosopher, occultist, mystic, and proto-scientist whose life and death (he was burned at the stake for heresy in 1600) have been exalted to legendary status in the Western cultural narrative of the war between religion and science. (This is despite the fact that the man’s name and memory have remained relatively obscure in mainstream popular awareness.)
Bruno held and taught a heliocentric view of the universe whose scope exceeded even Galileo’s attempt to build on the Copernican model, and the story that is commonly told today — including by the new Cosmos — is that he was a martyr for science in an age of benighted and militant ignorance, when religious authorities waged a merciless campaign against freedom of thought.
Many observers have weighed in on the problems with this approach to Bruno in the past few weeks. The chatter has been extensive enough that it has even drawn a response from one of the series’ co-writers.
One entry in the conversation that I find to be especially astute and important comes from the pen/word processor of Daily Beast writer and editor David Sessions, who argues that the Cosmos portrayal underscores our tendency to rewrite the past to conform to currently fashionable biases, ideologies, and cultural narratives — in this case, the very narrative of the “war between religion and science” itself, with religion framed as the villain and science as the hero:
Bruno, according to Cosmos, wandered around Europe, arguing passionately but fruitlessly for his new explanation of the universe, only to be mocked, impoverished, and eventually imprisoned and executed. Catholic authorities are depicted as cartoon ghouls, and introduced with sinister theme music. [Host Neil deGrasse] Tyson explains that the church’s modus operandi was to “investigate and torment anyone who voiced views that differed from theirs.”
What Cosmos doesn’t mention is that Bruno’s conflict with the Catholic Church was theological, not scientific, even if it did involve his wild — and occasionally correct — guesses about the universe. As Discover magazine’s Corey Powell pointed out, the philosophers of the 16th century weren’t anything like scientists in the modern sense. Bruno, for instance, was a “pandeist,” which is the belief that God had transformed himself into all matter and ceased to exist as a distinct entity in himself. He believed in all sorts of magic and spirits, and extrapolated those views far beyond his ideas about the infinity of the universe. In contrast to contemporaries who drew more modest conclusions from their similar ideas, Bruno agitated for an elaborate counter-theology, and was (unlike the poor, humble outcast portrayed in Cosmos) supported by powerful royal benefactors. The church didn’t even have a position on whether the Earth orbited the sun, and didn’t bring it up at Bruno’s trial. While the early-modern religious persecution certainly can’t be denied, Bruno was killed because he flamboyantly denied basic tenets of the Catholic faith, not because religious authorities were out to suppress all “freedom of thought.”
Cosmos’ treatment of Bruno as a “martyr for science” is just a small example of a kind of cultural myth we tell ourselves about the development of modern society, one that’s almost completely divorced from the messy reality. It’s a story of an upward march from ignorance and darkness, where bold, rebel intellectuals like Bruno faced down the tyrannical dogma of religion and eventually gave us secularism, democracy, and prosperity. Iconoclastic individuals are our heroes, and big, bad institutions — monarchies, patriarchies, churches — are the villains. In the process, our fascinating, convoluted history gets flattened into a kind of secular Bible story to remind us why individual freedom and “separation of church and state” are the most important things for us to believe in.
The real path to our modern selves is much more complicated — so complicated that academic historians still endlessly debate how it happened.
. . . [T]hat Cosmos added an unnecessary and skewed version of Bruno — especially one skewed in this particular way — is a good miniature lesson about our tendency to turn the past into propaganda for our preferred view of the present. There are cultural, religious, and even political reasons that the story of scientific progress and political enlightenment are [sic] so attractive, and filter down even into our children’s entertainment. It allows us to see ourselves as the apex of history, the culmination of an inevitable, upward surge of improvement. It reassures us that our political values are righteous, and reminds us who the enemies are. The messy, complex, non-linear movement of actual history, by contrast, is unsettling, humbling — even terrifying.
For more on the subtle history of the relationship between religion and science, and also the whitewashed/propagandistic mainstream secular narrative about it, I recommend David Metcalfe’s Teeming Brain column De Umbris Idearum, whose title is in fact drawn from the work of Giordano Bruno. See especially “Humility and Silence: Where True Science and True Spirituality Meet” and “Science, Philosophy, Theology: If the Mirrors We Make Are Monstrous, So Too Are We.”
It’s lovely to see one of my formative philosophical influences, and a man whose dystopian critique of technology is largely unknown to the populace at large these days — although it has deeply influenced such iconic cultural texts as Koyaanisqatsi — getting some mainstream attention (in The Boston Globe, two years ago):
Imagine for a moment that pretty much everything you think about technology is wrong. That the devices you believed are your friends are in fact your enemies. That they are involved in a vast conspiracy to colonize your mind and steal your soul. That their ultimate aim is to turn you into one of them: a machine.
It’s a staple of science fiction plots, and perhaps the fever dream of anyone who’s struggled too long with a crashing computer. But that nightmare vision is also a serious intellectual proposition, the legacy of a French social theorist who argued that the takeover by machines is actually happening, and that it’s much further along than we think. His name was Jacques Ellul, and a small but devoted group of followers consider him a genius.
To celebrate the centenary of his birth, a group of Ellul scholars will be gathering today at a conference to be held at Wheaton College near Chicago. The conference title: “Prophet in the Technological Wilderness.”
Ellul, who died in 1994, was the author of a series of books on the philosophy of technology, beginning with The Technological Society, published in France in 1954 and in English a decade later. His central argument is that we’re mistaken in thinking of technology as simply a bunch of different machines. In truth, Ellul contended, technology should be seen as a unified entity, an overwhelming force that has already escaped our control. That force is turning the world around us into something cold and mechanical, and — whether we realize it or not — transforming human beings along with it.
In an era of rampant technological enthusiasm, this is not a popular message, which is one reason Ellul isn’t well known. It doesn’t help that he refused to offer ready-made solutions for the problems he identified. His followers will tell you that neither of these things means he wasn’t right; if nothing else, they say, Ellul provides one of the clearest existing analyses of what we’re up against. It’s not his fault it isn’t a pretty picture.
. . . Technology moves forward because we let it, he believed, and we let it because we worship it. “Technology becomes our fate only when we treat it as sacred,” says Darrell J. Fasching, a professor emeritus of religious studies at the University of South Florida. “And we tend to do that a lot.”
. . . “Ellul never opposed all participation in technology,” [says David Gill, founding president of the International Jacques Ellul Society and a professor of ethics at the Gordon-Conwell Theological Seminary]. “He didn’t live in the woods, he lived in a nice house with electric lights. He didn’t drive, but his wife did, and he rode in a car. But he knew how to create limits — he was able to say ‘no’ to technology. So using the Internet isn’t a contradiction. The point is that we have to say that there are limits.”
FULL STORY: “Jacques Ellul, technology doomsayer before his time”
You can take your pick of Cassandras: Michael Crichton, Mary Shelley, whoever made Gattaca. Literature and pop culture never stop obsessing about the bastard spawn of technology and biology, although movies love to have it both ways, wallowing happily in high-tech gadgetry even as they deplore its effects.
Feverish as all this artistic angst is, what’s remarkable is that it barely keeps pace with reality. We are hurtling ever faster toward a point of no return. Consider that, just earlier this year, MIT researchers managed to implant false memories in mice. Or that the now-common procedure of preimplantation genetic diagnosis (PGD) lets would-be parents in fertility treatment test their multiple embryos for defects and discard the embryos they don’t want. One of these days, we may also be able to slow down aging by stopping the degradation of telomeres. (Telomeres are the caps on the ends of chromosomes that keep them from fraying.)
. . . Given how close reality has come to surpassing imagination, what do the Atwoods of the world have to offer? Only what good novelists have always offered: a sense of the tragic, a respect for the power of malevolence, a grasp of how things go awry. In her most recent works, a trilogy in the anti-utopian tradition of Brave New World and 1984 that she began with Oryx and Crake in 2003 and ended this September with MaddAddam, transhumanism meets capitalism. In place of Orwell’s totalitarian state, Atwood gives us an all-powerful genetic-engineering industry. Biotech corporations have superseded governments and turned criminal. Since they are so good at keeping people healthy, they have to come up with new profit centers, so they add viruses to their vitamins.
— Judith Shulevitz, “Margaret Atwood: Our Most Important Prophet of Doom,” The New Republic, September 25, 2013
Also see the September 20 radio interview with Atwood (nearly an hour long, downloadable or streamable) on NPR’s On Point:
Margaret Atwood writes “speculative fiction” — but don’t call it science fiction, she says. It could all happen. And maybe it is. Her latest novel is the culmination of a mind-bending trilogy story of the end of the world that seems all too hideously possible. The world, debauched and wrecked by human over-reach. A designer plague has wiped out almost all of old humanity. Gene-altered pigs and a successor race of leaf-eating humanoids are all over. A new Genesis story is unfolding. For a new world. Up next On Point: novelist Margaret Atwood, and after us.
— “Margaret Atwood Will Make You Afraid of Her Tomorrow,” On Point, NPR, September 20, 2013
Thessaly la Force: It struck me when you said we must “trust the peripheral vision of our mind.” It seems like a muscle in your body that you have to develop by training some other part of you.
Marilynne Robinson: One reaches for analogies. I think it’s probably a lot like meditation — which I have never practiced. But from what I understand, it is a capacity that develops itself and that people who practice it successfully have access to aspects of consciousness that they would not otherwise have. They find these large and authoritative experiences. I think that, by the same discipline of introspection, you have access to a much greater part of your awareness than you would otherwise. Things come to mind. Your mind makes selections — this deeper mind — on other terms than your front-office mind. You will remember that once, in some time, in some place, you saw a person standing alone, and their posture suggested to you an enormous narrative around them. And you never spoke to them, you don’t know them, you were never within ten feet of them. But at the same time, you discover that your mind privileges them over something like the Tour d’Eiffel. There’s a very pleasant consequence of that, which is the most ordinary experience can be the most valuable experience. If you’re philosophically attentive you don’t need to seek these things out.
. . . [I]t’s finding access into your life more deeply than you would otherwise. Consider this incredibly brief, incredibly strange experience that we have as this hypersensitive creature on a tiny planet in the middle of somewhere that looks a lot like nowhere. It’s assigning an appropriate value to the uniqueness of our situation and every individual situation.
. . . I think that we have almost taught ourselves to have a cynical view of other people. So much of the scientism that I complain about is this reductionist notion that people are really very small and simple. That their motives, if you were truly aware of them, would not bring them any credit. That’s so ugly. And so inimical to the best of everything we’ve tried to do as a civilization and so consistent with the worst of everything we’ve ever done as a civilization.
MORE: “A Teacher and Her Student”
There simply are no words. And I mean that literally, as you’re about to see.
When I learned recently of the imminent release of a new film by director Godfrey Reggio, he of Koyaanisqatsi, Powaqqatsi, and Naqoyqatsi fame, I was fairly stunned. That sensation only deepened when I watched the trailers. As I explained here three months ago, Koyaanisqatsi literally changed my life, and more than one person contacted me after I published that post to let me know they feel the very same way.
And now comes Visitors. Like Reggio’s first three films, this one features an original musical score by Philip Glass. Like Naqoyqatsi, it features visual design by filmmaker Jon Kane. It will premiere at the Toronto International Film Festival on September 8, and will be presented there by Steven Soderbergh.
Here is the just-released teaser trailer, followed by an earlier trailer (from 2011) that was released when the project was being developed under the alternate title The Holy See. Even though they’re similar, be sure to watch the second one to its conclusion, which offers a striking “payoff.”
Here’s the film’s official description:
Thirty years after Koyaanisqatsi, Godfrey Reggio — with the support of Philip Glass and Jon Kane — once again leapfrogs over earthbound filmmakers and creates another stunning, wordless portrait of modern life. Presented by Steven Soderbergh in stunning B&W 4K, Visitors reveals humanity’s trancelike relationship with technology, which, when commandeered by extreme emotional states, produces massive effects far beyond the human species. The film is visceral, offering the audience an experience beyond information about the moment in which we live. Comprised of only 74 shots, Visitors takes viewers on a journey to the moon and back to confront them with themselves.
For what it’s worth, I predict a positively mythic impact.