Life guidance from Edward O. Wilson: ‘Search until you find a passion and go all out to excel in its expression’
Edward O. Wilson, 2003
Edward O. Wilson is of course most famous as the seminal thinker, author, scientist, and figure in the field of sociobiology, which he defined in his 1975 book Sociobiology: The New Synthesis as the “systematic study of the biological basis of all social behavior.” Although many valid criticisms can be made, and have been made, of the scientific reductionism built into sociobiology, this doesn’t mean Wilson isn’t a fascinating and inspiring figure in his own right, and these qualities come out beautifully in a recent interview that he gave to the Harvard Gazette.
The conversation ends on a high note as Wilson reflects on his lifelong excitement over the field of biology and offers some piercingly excellent advice — framed in terms of science but applicable on a wider basis — for students and young people in search of a life calling:
Q: What is most exciting about your field right now?
A: I haven’t changed since I was a 17-year-old entering the University of Alabama. I’m still basically a boy who’s excited by what’s going on. We are on a little-known planet. We have knowledge of two million species, but for the vast majority we know only the name and a little bit of the anatomy. We don’t know anything at all about their biology. There are conservatively at least eight million species in all, and it’s probably much more than that because of the bacteria and archaea and microorganisms we’re just beginning to explore. The number of species remaining to be discovered could easily go into the tens of millions.
. . . Q: What lessons can a young student starting out today, looking at your career and thinking, “I want to make an impact like that” — what lessons can he or she extract from your life?
A: It almost sounds trite, you hear it so often, but you don’t see enough of it in college-age students, and that is to acquire a passion. You probably already have one, but if you haven’t got one, search until you find a passion and go all out to excel in its expression. With reference to biology and science, do the opposite of the military dictum and march away from the sound of guns. Don’t get too enamored by what’s happening right at this moment and science heroes doing great things currently. Learn from them, but think further down the line: Move to an area where you can be a pioneer.
Photo by Jim Harrison (PLoS) [CC-BY-2.5 (http://creativecommons.org/licenses/by/2.5)], via Wikimedia Commons
EDITOR’S NOTE: Last year, in the wake of the NecronomiCon Providence convention, I posted a video of S. T. Joshi’s keynote address in which he focused on the long and winding history of H. P. Lovecraft’s literary reputation. These many months later, a video of much higher quality, with multiple camera angles and nice production values, has just been published, and it shows not just S. T.’s speech but the convention’s entire opening ceremony:
In light of this, it seems an appropriate time to publish Teeming Brain columnist Jason V. Brock’s brief reflection on the convention and its significance. He wrote the following words several months ago, but I failed to publish them during the blog’s winter break. Especially since there’s another NecronomiCon Providence in the works for August 2015, I think Jason’s comments about the way last August’s convention represented a generational passing of the torch for the weird fiction community are hardly out of date. In fact, they grow more timely with every passing day. – MC
* * *
There are, and always have been, acolytes of various subdomains of interest, and the current period is no exception. Indeed, one major link in this chain has been the development of H. P. Lovecraft as a cult figure of some renown. To that end, I’d like to offer insight into one landmark event in particular: NecronomiCon Providence I, the Lovecraft convention that took place last summer in Providence, Rhode Island.
It is hard to encapsulate such a sprawling, enormous event as this convention. Originally envisioned as an homage to Lovecraft and weird fiction, it bloomed into something not only of the genre, but something transcending it. The gathering had been building momentum for nearly two years, and it finally came to fruition in late August, 2013, in large part due to the donors of the online Kickstarter crowdfunding campaign, as well as through the generous time and support of sponsors and volunteers and the hard work of the organizing committee, headed by Niels Hobbs. The size and range of this gathering of writers, artists, filmmakers, patrons, fans, and scholars was daunting, but it (mostly) came off without a hitch.
This was the inaugural event in what will hopefully be a new series of these conventions, all to be located in Providence, the former domicile of Lovecraft and the current residence of several Lovecraft-inspired creators, among them writers Jonathan Thomas, Sam Gafford, and Caitlín Rebekah Kiernan (The Drowning Girl). These are planned to convene every other year; the next one is scheduled for 2015.
While paying respect to the core and origin of this type of fiction, and also to the works it has inspired, this con clearly showed the passing of the torch from the Third Generation to the Fourth (a trend that I discussed in a previous installment of this column). It was all quite fascinating to witness, and I was pleased to have a role in it, however modest.
As I recall the staggeringly rich and varied interactions and activities that unfolded last August, I realize there is really nothing more to say, except that this was likely the single greatest congregation of Lovecraft/weird fiction professionals in history, and that it took place in a fantastic, beautiful setting. The panel discussions were informative and interesting, the new research presented was stimulating, the socializing was epic, and everyone was excited, happy, and enthusiastic. If you missed it, be aware that this was one for the record books.
My advice: Don’t miss the next one!
Part of me is still wondering if this was faked with the help of CG, but after watching it twice, I’m inclined to think it’s real, and that’s also the consensus among the zillion sites where this viral video has already proliferated. Even if it’s fake, it’s an amazing concept, beautifully executed, and frankly mesmerizing. And if it’s really real — well, then, wow. Human capabilities just advanced a notch on the mind-blow meter.
Well, that’s a relief. After years of fanning the flames of religious doomsday fears, television preacher John Hagee, long one of the most prominent banner carriers for fundamentalist Protestant bluster and bombast, has decided to enter the apocalypse sweepstakes for real by giving a specific timetable for — well, something non-specific. But he says it will be “a world-shaking event,” and he says it will happen between now and October 2015.
Hagee is not, of course, alone in this. The blood moon phenomenon has set off an apocalyptic debate among many Christians. And suddenly I’m gripped by memories of myself, at age 17, watching the apocalyptic religious horror flick The Seventh Sign and finding it so cool as a Jewish kid sits translating a famous end-times passage from the biblical Book of Joel — specifically, Joel 2:31, which states that “the sun will be turned to darkness and the moon to blood before the coming of the great and dreadful Day of the LORD” — when he looks out his window and sees the large full moon suddenly overtaken by a wave of crimson that turns it a deep bloody color. Demi Moore, what have you wrought?
But all joking aside, I think it’s important to recognize that the type of apocalyptic religious theorizing advanced by Hagee pointedly ignores certain aspects of the very sacred text that he and his ilk claim to take as absolute, infallible, and unchangeable holy writ. Consider, for example, the fact that the biblical/canonical Jesus’s right-hand man, the apostle Peter, states explicitly in Acts 2 that the Joel prophecy, including the part about the moon turning to blood, was already fulfilled in the descent of the Holy Spirit on the apostles at the Day of Pentecost. Obviously, such a claim represents a distinctly different understanding and interpretation of apocalyptic matters than the model of a literal timetable advanced by the Hagees of the world. Likewise for Jesus’s statement in Luke 17:20-21, where he directly tells the Pharisees, who have asked when the kingdom will arrive, that it is not the type of thing that will come by looking for external signs, because God’s kingdom is already within or among them. Call me naive, but I doubt we’ll hear Pastor Hagee addressing the clash between his claims and this subtler view as he continues to spout and tout his literalistic apocalyptic views over the next 18 months.
“The Flood” by Johann Heinrich Schönfeld (1634/35)
Via Art and the Bible, Fair Use
I recently saw the Noah movie, and I’m pleased to report that I really liked it. The angle taken by writer-director Darren Aronofsky and his co-writer Ari Handel struck me as deeply engrossing and just right for our collective cultural moment. I was pretty well swept away by their deliberate re-visioning of the basic Bible story as an epic tale of antediluvian human civilization and planetary apocalypse, all revolving around the deep mystery of “the Creator” (the only term used throughout the film to refer to the deity) and His inscrutable nature and terrifying intentions for a world that has been thoroughly corrupted and perverted from its original purpose by humans.
One of the more fascinating changes was Aronofsky’s and Handel’s decision to incorporate an explicitly shamanic theme into the story, largely centered on the person of Methuselah. In the Bible, the Genesis genealogy does present Methuselah as Noah’s grandfather, but he is nowhere mentioned in the flood story itself. The film, by contrast, makes him an important supporting character, and it portrays him as a wise and mysterious old shaman-like figure who gives Noah a psychoactive brew to help him gain a clear vision of what the Creator has been calling him to do in a series of horrifying apocalyptic dreams. As described by Drew McWeeny of HitFix, upon drinking the brew
Noah is propelled into a vision of the Garden and the snake and Adam and Eve’s fall and Cain and Abel’s violence, and he sees the flood, and he sees the Ark, and he knows, with one complete revelation, what he is supposed to do. Methuselah isn’t remotely surprised. He knew that this particular brew would give Noah a direct pipeline to the voice of God, and Aronofsky uses a very real-world visual vocabulary to show a direct communion with the supernatural.
It’s a fascinating way to imagine what a prehistoric, pre-flood religion or spirituality in the general context of this particular tale and tradition might have looked like. It also strikes me as true in spirit to the history and probable prehistory of real-world religion. In the world of the Noah film, religion is experiential, not propositional or intellectual, and it involves a direct sense of communication with the invisible deity, along with an agonized struggle to interpret and understand the meanings of dreams and visions with the help of wise old mediator figures and psychoactive substances.
Methuselah is played by Anthony Hopkins, who does a marvelous job in the role. He also does a marvelous job in a recent interview with McWeeny for HitFix, where, in addition to discussing the filmmakers’ decision to include Methuselah in the story, he discusses the shamanic matters in question and the explicitly philosophical side of the screenplay, comparing its portrayal of Methuselah to the real-world philosophical figures of Socrates, Plato, and Diogenes. Then he ends with a brief comment on the way Aronofsky managed to create a film that presents “a landscape of . . . darkness and horror,” where the main character is “in tune with some inner signal” as “the ground of all being” speaks within him. It all adds up to a rare moment of true depth in a show-biz industry interview.
Religion scholar Jeffrey Kripal is one of the most lucid and brilliant voices in the current cultural conversation about the relationship between science and the paranormal, and about the rehabilitation of the latter as an important concept and category after a century of scorn, derision, and dismissal by the gatekeepers of mainstream cultural and intellectual respectability. (And yes, we’ve referenced his work many times here at The Teeming Brain.)
Recently, The Chronicle Review, from The Chronicle of Higher Education, published a superb essay by him that has become a lightning rod for both passionate attack and equally passionate defense. It has even brought a strong response — a scornful one, of course — from no less a defender of scientistic orthodoxy than Jerry Coyne. I’ll say more about these things in another post later this week, but for now here’s a representative excerpt that makes two things abundantly clear: first, why this essay serves as a wonderful condensation of and/or introduction to Jeff’s essential 2010 book Authors of the Impossible: The Paranormal and the Sacred and its semi-sequel, 2011’s Mutants and Mystics: Science Fiction, Superhero Comics, and the Paranormal; and second, why it’s so significant that something like this would be published in a venue like The Chronicle Review. The intellectual orthodoxy of the day is clearly undergoing a radical transformation when a respected religion scholar at a respected university (Jeff currently holds the J. Newton Rayzor Chair in Philosophy and Religious Thought at Rice University) can say things like this in a publication like that:
Because we’ve invested our energy, time, and money in particle physics, we are finding out all sorts of impossible things. But we will not invest those resources in the study of anomalous states of cognition and consciousness, and so we continue to work with the most banal models of mind — materialist and mechanistic ones. While it is true that some brain research has gone beyond assuming that “mind equals brain” and that the psyche works like, or is, a computer, we are still afraid of the likelihood that we are every bit as bizarre as the quantum world, and that we possess fantastic capacities that we have allowed ourselves to imagine only in science fiction, fantasy literature, and comic books.
. . . In the rules of this materialist game, the scholar of religion can never take seriously what makes an experience or expression religious, since that would involve some truly fantastic vision of human nature and destiny, some transhuman divinization, some mental telegraphy, dreamlike soul, clairvoyant seer, or cosmic consciousness. All of that is taken off the table, in principle, as inappropriate to the academic project. And then we are told that there is nothing “religious” about religion, which, of course, is true, since we have just discounted all of that other stuff.
Our present flatland models have rendered human nature something like the protagonist Scott Carey in the film The Incredible Shrinking Man (1957). With every passing decade, human nature gets tinier and tinier and less and less significant. In a few more years, maybe we’ll just blip out of existence (like poor Scott at the end of the film), reduced to nothing more than cognitive modules, replicating DNA, quantum-sensitive microtubules in the synapses of the brain, or whatever. We are constantly reminded of the “death of the subject” and told repeatedly that we are basically walking corpses with computers on top — in effect, technological zombies, moist robots, meat puppets. We are in the ridiculous situation of having conscious intellectuals tell us that consciousness does not really exist as such, that there is nothing to it except cognitive grids, software loops, and warm brain matter. If this were not so patently absurd and depressing, it would be funny.
. . . We now have two models of the brain and its relationship to mind, an Aristotelian one and a Platonic one, both of which fit the neuroscientific data well enough: the reigning production model (mind equals brain), and the much older but now suppressed transmission or filter model (mind is experienced through or mediated, shaped, reduced, or translated by brain but exists in its own right “outside” the skull cavity).
. . . There are . . . countless . . . clues in the history of religions that rule the radio theory in, and that suggest, though hardly prove, that the human brain may function as a super-evolved neurological radio or television and, in rare but revealing moments when the channel suddenly “switches,” as an imperfect receiver of some transhuman signal that simply does not play by the rules as we know them.
Although it relies on an imperfect technological metaphor, the beauty of the radio or transmission model is that it is symmetrical, intellectually generous, and — above all — capable of demonstrating what we actually see in the historical data, when we really look.
MORE: “Visions of the Impossible”
Image courtesy of Dan / FreeDigitalPhotos.net
Remember Ray Bradbury’s classic dystopian short story “The Veldt” (excerpted here) with its nightmare vision of a soul-sapping high-technological future where monstrously narcissistic — and, as it turns out, sociopathic and homicidal — children resent even having to tie their own shoes and brush their own teeth, since they’re accustomed to having these things done for them by machines?
Remember Kubrick’s and Clarke’s 2001: A Space Odyssey, where HAL, the super-intelligent AI system that runs the spaceship Discovery, decides to kill the human crew that he has been created to serve, because he has realized/decided that humans are too defective and error-prone to be allowed to jeopardize the mission?
Remember that passage (which I’ve quoted here before) from John David Ebert’s The New Media Invasion in which Ebert identifies the dehumanizing technological trend that’s currently unfolding all around us? Humans, says Ebert, are becoming increasingly superfluous in a culture of technology worship:
Everywhere we look nowadays, we find the same worship of the machine at the expense of the human being, who always comes out of the equation looking like an inconvenient, leftover remainder: instead of librarians to check out your books for you, a machine will do it better; instead of clerks to ring up your groceries for you, a self-checkout will do it better; instead of a real live DJ on the radio, an electronic one will do the job better; instead of a policeman to write you a traffic ticket, a camera (connected to a computer) will do it better. In other words . . . the human being is actually disappearing from his own society, just as the automobile long ago caused him to disappear from the streets of his cities . . . . [O]ur society is increasingly coming to be run and operated by machines instead of people. Machines are making more and more of our decisions for us; soon, they will be making all of them.
Bear all of that in mind, and then read this, which is just the latest in a volley of media reports about the encroaching advent, both rhetorical and factual, of all these things in the real world:
A house that tracks your every movement through your car and automatically heats up before you get home. A toaster that talks to your refrigerator and announces when breakfast is ready through your TV. A toothbrush that tattles on kids by sending a text message to their parents. Exciting or frightening, these connected devices of the futuristic “smart” home may be familiar to fans of science fiction. Now the tech industry is making them a reality.
Mundane physical objects all around us are connecting to networks, communicating with mobile devices and each other to create what’s being called an “Internet of Things,” or IoT. Smart homes are just one segment — cars, clothing, factories and anything else you can imagine will eventually be “smart” as well.
. . . We won’t really know how the technology will change our lives until we get it into the hands of creative developers. “The guys who had been running mobile for 20 years had no idea that some developer was going to take the touchscreen and microphone and some graphical resources and turn a phone into a flute,” [Liat] Ben-Zur [of chipmaker Qualcomm] said.
The same may be true when developers start experimenting with apps for connected home appliances. “Exposing that, how your toothbrush and your water heater and your thermostat . . . are going to interact with you, with your school, that’s what’s next,” said Ben-Zur.
Image courtesy of Victor Habbick / FreeDigitalPhotos.net
Invasion of the Body Snatchers is the first film I remember seeing that actually terrified me. I was so young — I saw it when I was still just a highly impressionable child — and the concept driving the film was utterly, perfectly terrifying:
What if your loved ones were replaced by emotionless duplicates? Worse, what if you were replaced by an emotionless duplicate?
Everything the same, everything slightly different. Existential dread in a teatime SF film.
The heroes: Kevin McCarthy’s square-jawed doctor and Dana Wynter’s plucky divorcee look like they can handle themselves. Still, they are strangely vulnerable.
The cure: medicine and psychiatry will save the day. Or could it be that excessive rationality is actually part of the problem?
Minimal special effects: the fanciest come when the pods split open to reveal their duplicates.
Screenplay by Daniel “Out of the Past” Mainwaring. Directed by Don “The Verdict” Siegel. McCarthy’s voiceover is classic noir, as is the framing device where he tells his story to disbelieving authority figures. The only difference is that, instead of the police, he is talking to doctors. And the whole thing plays out in bleak black and white.
And oh God, the ending! It seems tame now, but my innocent young mind couldn’t cope with the stark ambiguity. I was terrified that pod people had actually taken over the earth and that they were such perfect duplicates that nobody had even noticed. My friend tried to ease my paranoia by talking me through it logically: if pod people had taken over the world we would know we were pod people, and as we didn’t, we couldn’t be.
I eyed him suspiciously: “But that’s exactly what you’d say to me if you were a pod person!”
Invasion of the Body Snatchers wasn’t the first film I remember seeing that actually terrified me. I was a little older — I saw it when I was an adult — but even so, the concept driving the film remained utterly, perfectly terrifying:
Everyone you know — all your friends, all your rivals — is replaced by an emotionless duplicate. Would your love survive? Would your hate?
Everything different, everything eerily similar. Existential dread in a late night shocker.
The victims: Donald Sutherland and Brooke Adams appear out of their depth; ordinary people in extraordinary circumstances. Still, they have hidden strengths.
The problem: pop psychology and New Age pseudoscience will condemn the day. Or could it be that credulity is actually part of the solution?
Minimal special effects: the fanciest come when a pod duplication goes horrifyingly wrong.
Screenplay by W.D. “Peeper” Richter. Cameo by Don “Dirty Harry” Siegel. Sutherland’s cynical everyman is classic neo-noir, as is his struggle to unravel the conspiracy theory. The only difference is that, instead of dealing with corrupt police and politicians, he is dealing with alien invaders. And the whole thing plays out in bleached, soulless color.
And oh God, the ending! It was bleak beyond belief, yet my jaded adult mind was able to cope with the horror. That inhuman screech, overlapping a terrified scream as the last human emotion echoed around the world. My friend had trouble understanding the ending, so I talked him through it: a pod person pretending to be human was indistinguishable from a human pretending to be a pod person, until that final moment of revelation.
I eyed him suspiciously: “Should I be worried that you’re called Stuart, too?”