In my recent post about Jeff Kripal’s article “Visions of the Impossible,” I mentioned that biologist and hardcore skeptical materialist Jerry Coyne published a scathing response to Jeff’s argument soon after it appeared. For those who would like to keep up with the conversation, here’s the heart of Coyne’s response (which, in its full version, shows him offering direct responses to several long passages that he quotes from Jeff’s piece):
For some reason the Chronicle of Higher Education, a weekly publication that details doings (and available jobs) in American academia, has shown a penchant for bashing science and promoting anti-materialist views. . . . I’m not sure why that is, but I suspect it has something to do with supporting the humanities against the dreaded incursion of science — the bogus disease of “scientism.”
That’s certainly the case with a big new article in the Chronicle, “Visions of the impossible: how ‘fantastic’ stories unlock the nature of consciousness,” by Jeffrey J. Kripal, a professor of religious studies at Rice University in Texas. Given his position, it’s not surprising that Kripal’s piece is an argument about Why There is Something Out There Beyond Science. And although the piece is long, I can summarize its thesis in two sentences (these are my words, not Kripal’s):
“People have had weird experiences, like dreaming in great detail about something happening before it actually does; and because these events can’t be explained by science, the most likely explanation is that they are messages from some non-material realm beyond our ken. If you combine that with science’s complete failure to understand consciousness, we must conclude that naturalism is not sufficient to understand the universe, and that our brains are receiving some sort of ‘transhuman signals.'”
That sounds bizarre, especially for a distinguished periodical, but anti-naturalism seems to be replacing postmodernism as the latest way to bash science in academia.
. . . But our brain is not anything like a radio. The information processed in that organ comes not from a transhuman ether replete with other people’s thoughts, but from signals sent from one neuron to another, ultimately deriving from the effect of our physical environment on our senses. If you cut your optic nerves, you go blind; if you cut the auditory nerves, you become deaf. Without such sensory inputs, whose mechanisms we understand well, we simply don’t get information from the spooky channels promoted by Kripal.
When science manages to find reliable evidence for that kind of clairvoyance, I’ll begin to pay attention. Until then, the idea of our brain as a supernatural radio seems like a kind of twentieth-century alchemy—the resort of those whose will to believe outstrips their respect for the facts.
Full article: “Science Is Being Bashed by Academic Who Should Know Better”
(An aside: Is it just me, or in his second paragraph above does Coyne effectively insult and dismiss the entire field of religious studies and all of the people who work in it?)
Jeff responded five days later in a second piece for the Chronicle, where he met Coyne’s criticisms head-on.
Religion scholar Jeffrey Kripal is one of the most lucid and brilliant voices in the current cultural conversation about the relationship between science and the paranormal, and about the rehabilitation of the latter as an important concept and category after a century of scorn, derision, and dismissal by the gatekeepers of mainstream cultural and intellectual respectability. (And yes, we’ve referenced his work many times here at The Teeming Brain.)
Recently, The Chronicle Review, from The Chronicle of Higher Education, published a superb essay by him that has become a lightning rod for both passionate attack and equally passionate defense. It has even brought a strong response — a scornful one, of course — from no less a defender of scientistic orthodoxy than Jerry Coyne. I’ll say more about these things in another post later this week, but for now here’s a representative excerpt that makes two things abundantly clear: first, why this essay serves as a wonderful condensation of and/or introduction to Jeff’s essential 2010 book Authors of the Impossible: The Paranormal and the Sacred and its semi-sequel, 2011’s Mutants and Mystics: Science Fiction, Superhero Comics, and the Paranormal; and second, why it’s so significant that something like this would be published in a venue like The Chronicle Review. The intellectual orthodoxy of the day is clearly undergoing a radical transformation when a respected religion scholar at a respected university (Jeff currently holds the J. Newton Rayzor Chair in Philosophy and Religious Thought at Rice University) can say things like this in a publication like that:
Because we’ve invested our energy, time, and money in particle physics, we are finding out all sorts of impossible things. But we will not invest those resources in the study of anomalous states of cognition and consciousness, and so we continue to work with the most banal models of mind — materialist and mechanistic ones. While it is true that some brain research has gone beyond assuming that “mind equals brain” and that the psyche works like, or is, a computer, we are still afraid of the likelihood that we are every bit as bizarre as the quantum world, and that we possess fantastic capacities that we have allowed ourselves to imagine only in science fiction, fantasy literature, and comic books.
. . . In the rules of this materialist game, the scholar of religion can never take seriously what makes an experience or expression religious, since that would involve some truly fantastic vision of human nature and destiny, some transhuman divinization, some mental telegraphy, dreamlike soul, clairvoyant seer, or cosmic consciousness. All of that is taken off the table, in principle, as inappropriate to the academic project. And then we are told that there is nothing “religious” about religion, which, of course, is true, since we have just discounted all of that other stuff.
Our present flatland models have rendered human nature something like the protagonist Scott Carey in the film The Incredible Shrinking Man (1957). With every passing decade, human nature gets tinier and tinier and less and less significant. In a few more years, maybe we’ll just blip out of existence (like poor Scott at the end of the film), reduced to nothing more than cognitive modules, replicating DNA, quantum-sensitive microtubules in the synapses of the brain, or whatever. We are constantly reminded of the “death of the subject” and told repeatedly that we are basically walking corpses with computers on top — in effect, technological zombies, moist robots, meat puppets. We are in the ridiculous situation of having conscious intellectuals tell us that consciousness does not really exist as such, that there is nothing to it except cognitive grids, software loops, and warm brain matter. If this were not so patently absurd and depressing, it would be funny.
. . . We now have two models of the brain and its relationship to mind, an Aristotelian one and a Platonic one, both of which fit the neuroscientific data well enough: the reigning production model (mind equals brain), and the much older but now suppressed transmission or filter model (mind is experienced through or mediated, shaped, reduced, or translated by brain but exists in its own right “outside” the skull cavity).
. . . There are . . . countless . . . clues in the history of religions that rule the radio theory in, and that suggest, though hardly prove, that the human brain may function as a super-evolved neurological radio or television and, in rare but revealing moments when the channel suddenly “switches,” as an imperfect receiver of some transhuman signal that simply does not play by the rules as we know them.
Although it relies on an imperfect technological metaphor, the beauty of the radio or transmission model is that it is symmetrical, intellectually generous, and — above all — capable of demonstrating what we actually see in the historical data, when we really look.
MORE: “Visions of the Impossible”
Yesterday Geoffrey Pullum, Gerard Visiting Professor of Cognitive, Linguistic, and Psychological Sciences at Brown University and professor of general linguistics at the University of Edinburgh, penned a blog post for the Lingua Franca blog at The Chronicle of Higher Education about his recent visit to a couple of Lovecraftian sites in Providence. I was pleased to see Lovecraft being brought up like this at the Chronicle, and then I was even more interested when I noticed the tone of both Pullum’s post and some of the comments it had drawn. A lurking disdain for the Old Gent from Providence was on display right from the start, and I felt HPL was taking a subtle, and in some cases overt, drubbing of the type that properly should have been laid to rest with his ascent to canonical status around the turn of the new millennium. I also felt that a misreading of not just his work but his worldview was afoot.
Pullum starts his post on a strikingly negative note by recalling his first boyhood encounter with Lovecraft’s writing and giving it a retroactive trashing before allowing a backhanded compliment:
As a 14-year-old budding collector of supernatural horror fiction, browsing a bookstore in England, I happened upon a paperback collection of stories by H. P. Lovecraft. I opened it and read the first sentence of “The Lurking Fear”:
“There was thunder in the air on the night I went to the deserted mansion atop Tempest Mountain to find the lurking fear.”
That must be one of the worst opening lines in all of horror fiction, I now realize. It reads like an entry in San Jose State’s Bulwer-Lytton Fiction Contest, inspired by the ludicrous opening of the novel Paul Clifford by Edward “It was a dark and stormy night” Bulwer-Lytton. And when I tell you that the last words of Lovecraft’s tale are “They were never heard of again,” you may find it hard to believe that even a 14-year-old would not be sophisticated enough to laugh out loud. Yet somehow, for a boy craving escape from the mundane world of the suburbs south of London, Lovecraft’s overwrought ghastliness rang an eerie distant bell in some haunted mansion of my imagination.
— Geoffrey Pullum, “Lovecraft’s Providence,” Lingua Franca, The Chronicle of Higher Education, September 17, 2012
He goes on to describe how last week, after a day of teaching at Brown, “the fact that I am now living and working in Lovecraft’s beloved home town suddenly struck me as very significant.” Moved by this emotion, and setting out “For some reason I could not name,” he went and visited a couple of the famous Lovecraftian sites and structures in Providence — something I myself did several years ago during my sole (so far) trip to New England.
Like so many of my fellow Gen-X-ers, I led a childhood that was significantly Disneyfied.
The first movie I ever saw, as my parents later told me (since I was far too young at the time to remember it), was Disney’s Cinderella. Beginning at the age of four, I took several trips with my family to Disney World and Disneyland, and also to Epcot Center, whose plastic sci-fi utopia enchanted me. Most Sunday evenings I watched The Wonderful World of Disney on ABC, except when I was obligated to go to church, which wasn’t nearly as fun or interesting as Disney. Far from being a typical passive sponge for the Disney meme, I actively soaked it up.
Based on this manifest interest, one Christmas my family gave me a copy of the massive uber-tome Disney Animation: The Illusion of Life (576 pages, coffee-table sized, lavishly illustrated). Written by two of Disney’s legendary classic-era animators, Frank Thomas and Ollie Johnston, it still stands today as the single most comprehensive, authoritative, and valuable book on its subject. I reveled in it for years.
Along similar lines, I once convinced my parents to respond to a television ad and buy an 8-track of an audio play titled “Disney’s Christmas Carol” — the progenitor of the later short film “Mickey’s Christmas Carol.” It was accompanied by an 8-track of Christmas carols sung by Disney-voiced characters. I listened frequently to both for a couple of years, even when it wasn’t Christmastime.
But then I grew up and, as I saw it, left childish things behind. In college I learned to scoff at the artificiality of Walt Disney’s saccharine, anti-real or hyper-real portrayal of human life, and also his gaudy techno-utopian future vision for the human race, not to mention his virtually totalitarian corporate leadership style and his exploitation of the underside of American proletarian values (anti-intellectualism, consumerism, etc.). This, despite the above-described Disneyfied childhood.
Now a new article in The Chronicle of Higher Education — “Walt Disney, Reanimated” (March 21) by Randy Malamud, an English professor at Georgia State University — is, to put it bluntly, fascinating the hell out of me. Malamud reviews the new Walt Disney Family Museum, which opened in October at the Presidio complex in San Francisco, and finds it a worthy, non-hagiographic presentation and examination of Disney’s life, ethos, and contribution to America’s culture.
He notes the scorn that became prevalent among American academics and intellectuals over a span of decades, and then points to a raft of recent books that have begun to reshape the conversation by taking a more open-minded and less condemnatory approach to Disney — the man, the media empire, the artistic/entertainment legacy, and the cultural force. “If Walt Disney,” writes Malamud,
is a hugely overdetermined figure — and he himself bears considerable responsibility for that — it’s a valuable corrective to have this museum return us to the actual flesh-and-blood man behind the curtain, and back to the work itself. . . . Before visiting, I had wondered if the Disney Museum would be a hagiography, or a glorified gift shop, or a propagandistic reification of the Disney empire. It isn’t any of those things. It’s a collection of ideas and documents, a diverse array of archival, filmic, and pop-cultural texts that historicizes Disney’s work and compels us to think twice about how we appraise it. The museum energizes the fascinatingly charged scholarly debate that the Disney phenomenon has provoked, shaking the worn, staid, sometimes cynical images we have of Disney and his empire, bringing to them renewed color and motion.
I haven’t kept up with any of the cultural and philosophical criticism leveled at Disney over the years. The last I really remember reading anything about it was when I browsed through Umberto Eco’s Travels in Hyperreality nearly two decades ago. But now, for some reason — one that I suspect is tied as much to my innate interest in cultural studies and ideas as it is to the Disneyfication of my childhood — the news that some critics and observers are starting to sing a different tune really snags my attention.
On initial inspection, from the tiny bit of poking around that I’ve now started to do in this area, the observations of these critics appear sound. The scorn has been overbaked and overblown. The Disney wave-and-meme really does represent something that deserves to be engaged with rather than dismissed or used as scholarly cannon fodder, and this is true both because of its inherent qualities and because of the general and pleasantly fresh-smelling fact that, as Malamud points out (drawing on a very worthwhile Chronicle article from last year titled “What’s the Matter with Cultural Studies?”), “scholars should respect and engage with the mass appeal of popular cultural texts rather than dismiss ones deemed politically or aesthetically flawed as evidence of the audience’s false consciousness.”
Thanks for that, Mr. Malamud.
Welcome back into my Fockerian circle of trust, Uncle Walt. It’s nice to remake your acquaintance.
This is the first of a three-part series. (Also see parts 2 and 3.) In this post I’ll simply point to the problem and refer to a couple of recently published pieces that lay it out in bleak detail. In the next two, to be published over the course of this week, I’ll lay out some of my reactions.
* * * * *
America is in the midst of a real economic crisis. That’s not news. What may be news to some (although it probably isn’t) is that America’s colleges and universities are staggering right through the center of it.
According to education consultant and former university professor and administrator Peter A. Facione, America’s higher education institutions are going to have to buckle down and make hard decisions if they want to survive.
Note the stark emphasis: Colleges are in a fight not just to thrive but to survive. That’s how serious the current crisis is, as argued by Facione in “A Straight-Talk Survival Guide for Colleges” (The Chronicle of Higher Education online, March 20, 2009).
He begins by diagnosing the situation in unflinching terms:
It is time for some straight talk, starting with the realization that organizations that can’t or won’t adapt will fail. This recession has caused many of the nation’s largest retailers, banks, airlines, manufacturers, and brokerage houses to do so. Millions of Americans have lost jobs and homes. Why would we think colleges, and those employed by them, would be exempt from the same fate? The market sorts itself out at times like these. Industries realign.
….[H]igher education is part of the larger economic system. There will be casualties, just as commercial businesses will fail and other worthy nonprofit organizations will go broke. If a state’s tax revenues fall by large percentages, given that the priorities of the states are usually public safety, unemployment support, transportation, basic services, and a balanced budget, then something will have to go. Often that something will be support for higher education.
….If you as a college administrator think you are in a sailboat during a gale, you are right.
Then he offers his prescription, which consists of a long list of recommended action steps, attitude shifts, institutional reorganizations, and policy changes for college and university administrators, faculty, and staff to make if they want to survive. These are fairly dramatic and include suspending programs, reducing salaries, imposing freezes on hiring and searches, closing campuses, and more.
Not incidentally, Bloomberg agrees about the severity of the problem, as explained in detail in a May 1 article:
[T]he American system of higher education is in turmoil….Independent colleges that lack a national name or must-have majors are hardest hit. Many gorged on debt for construction, technology and creature comforts. Now, as endowments tumble and bills mount, they’re struggling to attract cash-strapped families who are navigating their own financial woes. Such mid-tier institutions may be forced to change what they do to survive. In the best case, they’ll merge with bigger schools, sell themselves to for-profit organizations or offer vocational training that elite colleges eschew, says Sandy Baum, a senior policy analyst at the College Board. In the worst case, they’ll shutter their doors for good (“Colleges Flunk Economics Test as Harvard Model Destroys Budgets”).
Briefly, and in anticipation of what I’ll talk about in the next post in this series, I’ll say that when I read this kind of thing I can’t help thinking of what the likes of Kunstler, Greer, and Berman have been saying off and on for years about the non-future of the American higher education scene in its current form. I also can’t help noticing that Facione and the Bloomberg reporter naturally think and talk in terms of market conditions, competition, the market “sorting itself out,” and all of that. I know, of course, that it really is necessary to devote attention to this “nuts and bolts” end of things. But I also keep hoping to read something in a mainstream publication like The Chronicle or Bloomberg that doesn’t just talk the same old tired economic language of business as usual but instead recognizes the need for an explosive paradigm shift away from the higher education world’s pervasive current model of college as a purely market-driven enterprise.
But I’ll say more about these matters in the next installment. Check back in a couple of days.
This post is in response to a query somebody made at the Shocklines forum. In various conversations at that board, people have recently been mentioning a supposed surge of anti-intellectualism in America today. One person responded with the following:
I’ve been hearing a lot about this ‘wave of anti-intellectualism’. I’m curious about it.
All artistic ventures aren’t immediately dismissed by the general public. Memento springs to mind; it was certainly a different sort of film, but it also had reasonable legs as a movie which didn’t even break 600 screens, and its DVD sales seemed pretty strong. While it’s undeniably true that the most innovative movies do not have corresponding box office receipts (hey, Shallow Hal beat out Memento by a long shot) it’s also true that this is not a new thing. I don’t recall a time when the most innovative films racked up the best box office.
What is the root of the anti-intellectualism argument?
I could go on and on about this topic all day, and would end up thanking you for the provocation to vent. But I’ll restrain myself, relatively speaking. Apologies in advance if I sound smotheringly didactic at points. I’ve recognized that fact about my writing for years but have thus far been unable to overcome it.
I think the basic idea behind the anti-intellectualism argument has at least two aspects. One is the simple recognition that “dumb is in.” I remember seeing Saturday Night Live’s Tina Fey mention this in an interview a couple of years ago. When the interviewer brought up Ms. Fey’s reputation for intelligence and wit, she jumped on the opportunity to express serious concerns about the fact that in American pop culture, which for several decades has been synonymous with (prepackaged) youth culture, it has become hip to be stupid. She talked about kids, and especially girls, feeling pressured to suppress their intelligence and appear stupid and vapid in order to fit in. And she contrasted this with her parents’ generation, when the counterculture was in full swing and it was hip to be über-intelligent and well-read so that you could effectively criticize the American government or the radical commie sympathizers or whomever, depending on your stance.
So this is the first and easiest-to-get-at arm of the argument, this pointing-out of what might be called the Bill & Ted syndrome, or the Harry & Lloyd syndrome, or the Jesse & Chester syndrome. Especially among the under-thirty crowd, there’s a cultural pressure to act stupid even if you’re not, and this is hostile to intelligence.
The deeper and more extended aspect of the argument represents a kind of medical diagnosis of a peculiarly American pathology that has now infected the rest of the world by means of cultural imperialism — that is, via the aggressive exporting of a lifestyle centered around consumerism and mass media entertainment. The idea is that America is in the throes of a systemic crisis that is largely economic in nature, the effects and implications of which have inevitably spun off into a detrimental effect on the American intellectual character. Then there’s also the related recognition of America’s longstanding bias in favor of what might be called “down home-ism” and against anything perceived as highfalutin, a tendency that has been alternately muted and dominant at various periods in the nation’s history. People who point to current anti-intellectual trends like to say the tendency has now moved dramatically and perhaps definitively to the fore, with youth culture’s “dumb is in” phenomenon representing just the tip of the iceberg.
Please pardon me while I let other people do much of my thinking and speaking. When I first started writing this reply to your query, I was just out of bed and my brain was quite foggy. (I’ve never been able to fathom how or why so many writers find this time of day to be the best for doing their work, since I myself can barely put two words together until mid-morning.) So I’m just going to offer some quotations from, summaries of, and links to a number of books and articles whose ideas have amplified, shaped, and/or coincided with my own.