The sun was gone. The sky lingered its colors for a time while they sat in the clearing. At last, he heard a whispering. She was getting up. She put out her hand to take his. He stood beside her, and they looked at the woods around them and the distant hills. They began to walk away from the path and the car, away from the highway and the town. A spring moon rose over the land while they were walking.
The breath of nightfall was rising up out of the separate blades of grass, a warm sighing of air, quiet and endless. They reached the top of the hill, and without a word, sat there watching the sky.
He thought to himself that this was impossible; that such things did not happen. He wondered who she was, and what she was doing here.
Ten miles away, a train whistled in the spring night and went on its way over the dark evening earth, flashing a brief fire. And then, again, he remembered the old story, the old dream. The thing he and his friend had discussed, so many years ago.
There must be one night in your life that you will remember forever. There must be one night for everyone. And if you know that the night is coming on and that this night will be that particular night, then take it and don’t question it and don’t talk about it to anyone ever after that. For if you let it pass it might not come again. Many have let it pass, many have seen it go by and have never seen another like it, when all the circumstances of weather, light, moon and time, of night hill and warm grass and train and town and distance were balanced upon the trembling of a finger. . . .
He woke during the night. She was awake, near him.
“Who are you?” he whispered. She said nothing.
“I could stay another night,” he said.
But he knew that one can never stay another night. One night is the night, and only one. After that, the gods turn their backs.
Her eyes were closed, but she was awake.
“But I don’t know who you are,” he said.
“You could come with me,” he said, “to New York.”
But he knew that she could never be there, or anywhere but here, on this night.
“And I can’t stay here,” he said, knowing that this was the truest and most empty part of all.
He waited for a time and then said again, “Are you real? Are you really real?”
They slept. The moon went down the sky toward morning.
He walked out of the hills and the forest at dawn, to find the car covered with dew. He unlocked it and climbed in behind the wheel, and sat for a moment looking back at the path he had made in the wet grass.
He moved over, preparatory to getting out of the car again. He put his hand on the inside of the door and gazed steadily out. The forest was empty and still. The path was deserted. The highway was motionless and serene. There was no movement anywhere in a thousand miles.
He started the car motor and let it idle. The car was pointed east, where the orange sun was now rising slowly.
“All right,” he said quietly. “Everyone, here I come. What a shame you’re all still alive. What a shame the world isn’t just hills and hills, and nothing else to drive over but hills, and never coming to a town.”
He drove away east, without looking back.
— From Ray Bradbury, “One Night in Your Life,” in The Toynbee Convector
Recently Daryl Bem defended his famous research into precognition in a letter to The Chronicle of Higher Education. More recently, as in this week, Salon published a major piece about Bem and his research that delves deeply into its implications for the whole of contemporary science — especially psychology and the other social sciences (or “social sciences”), but also the wider world of science in general — and shows how Bem’s research, and the reactions to it, have highlighted some very serious problems:
Bem’s 10-year investigation, his nine experiments, his thousand subjects—all of it would have to be taken seriously. He’d shown, with more rigor than anyone ever had before, that it might be possible to see into the future. Bem knew his research would not convince the die-hard skeptics. But he also knew it couldn’t be ignored.
When the study went public, about six months later, some of Bem’s colleagues guessed it was a hoax. Other scholars, those who believed in ESP — theirs is a small but fervent field of study — saw his paper as validation of their work and a chance for mainstream credibility.
But for most observers, at least the mainstream ones, the paper posed a very difficult dilemma. It was both methodologically sound and logically insane. Daryl Bem had seemed to prove that time can flow in two directions — that ESP is real. If you bought into those results, you’d be admitting that much of what you understood about the universe was wrong. If you rejected them, you’d be admitting something almost as momentous: that the standard methods of psychology cannot be trusted, and that much of what gets published in the field — and thus, much of what we think we understand about the mind — could be total bunk.
If one had to choose a single moment that set off the “replication crisis” in psychology — an event that nudged the discipline into its present and anarchic state, where even textbook findings have been cast in doubt — this might be it: the publication, in early 2011, of Daryl Bem’s experiments on second sight.
The replication crisis as it’s understood today may yet prove to be a passing worry or else a mild problem calling for a soft corrective. It might also grow and spread in years to come, flaring from the social sciences into other disciplines, burning trails of cinder through medicine, neuroscience, and chemistry. It’s hard to see into the future. But here’s one thing we can say about the past: The final research project of Bem’s career landed like an ember in the underbrush and set his field ablaze. . . .
When Bem started investigating ESP, he realized the details of his research methods would be scrutinized with far more care than they had been before. In the years since his work was published, those higher standards have increasingly applied to a broad range of research, not just studies of the paranormal. “I get more credit for having started the revolution in questioning mainstream psychological methods than I deserve,” Bem told me. “I was in the right place at the right time. The groundwork was already pre-prepared, and I just made it all startlingly clear.”
Looking back, however, his research offered something more than a vivid illustration of problems in the field of psychology. It opened up a platform for discussion. Bem hadn’t simply published a set of inconceivable findings; he’d done so in a way that explicitly invited introspection. In his paper proving ESP is real, Bem used the word replication 33 times. Even as he made the claim for precognition, he pleaded for its review.
“Credit to Daryl Bem himself,” [University of California-Berkeley business school professor] Leif Nelson told me. “He’s such a smart, interesting man. . . . In that paper, he actively encouraged replication in a way that no one ever does. He said, ‘This is an extraordinary claim, so we need to be open with our procedures.’ . . . It was a prompt for skepticism and action.”
Bem meant to satisfy the skeptics, but in the end he did the opposite: He energized their doubts and helped incite a dawning revolution. Yet again, one of the world’s leading social psychologists had made a lasting contribution and influenced his peers. “I’m sort of proud of that,” Bem conceded at the end of our conversation. “But I’d rather they started to believe in psi as well. I’d rather they remember my work for the ideas.”
Note that the article also contains, in its middle section, a fascinating personal profile and mini-biography of Bem himself, including a recounting of his life-long interest in mentalism, which began in his teen years and persisted into his career in academia:
As a young professor at Carnegie Mellon University, Bem liked to close out each semester by performing as a mentalist. After putting on his show, he’d tell his students that he didn’t really have ESP. In class, he also stressed how easily people can be fooled into believing they’ve witnessed paranormal phenomena.
This brief video essay on the source of our collective craving for “the awful futures of apocalyptic fiction” is really well done: skillfully executed, thought-provoking, and a worthwhile investment of five reflective minutes. Here’s the description:
In the first two decades of the new millennium, stories of the post-apocalypse have permeated pop culture, from books such as Cormac McCarthy’s The Road (2006), Paolo Bacigalupi’s The Windup Girl (2009) and Emily St John Mandel’s Station Eleven (2014) to films and TV programmes such as The Walking Dead (2010-), the Hunger Games series (2012-15) and Mad Max: Fury Road (2015). While post-apocalyptic fictions of previous eras largely served as cautionary tales — against nuclear brinksmanship in On the Beach (1959) or weaponised biology in The Stand (1978) — today’s versions of these tales depict less alterable, more oblique and diffuse visions of our doom. So why can’t we seem to get enough of humanity’s unavoidable collapse and its bleak aftermath?
Dispatches from the Ruins reflects on what these stories — set among crumbling buildings, overgrown lots and barren wastelands — might be telling us about modern fears and fantasies. This Aeon original video is adapted from an Aeon essay by the US writer Frank Bures. Bures is also the author of The Geography of Madness (2016), a book about cultural syndromes across the world. His work has been included in the Best American Travel Writing and appeared in Harper’s, Lapham’s Quarterly and the Washington Post Magazine, among others.
For eight minutes of pure, unadulterated awesome, here’s Doc Severinsen, from his 1970 LP Doc Severinsen’s Closet, performing King Crimson’s “In the Court of the Crimson King.” No, this is not a hallucination, although it may represent some kind of ripple in the Matrix. Many thanks to Richard Metzger at Dangerous Minds for unearthing this, and to Joe Pulver for calling attention to it at Facebook. Metzger accurately conveys the feel of Severinsen’s cover when he describes it as “moving from an almost Morricone-like spaghetti western-sounding beginning” to an “(inspired) James Bond-ish bit (and back again).”
Personally, I have a soft spot for Severinsen not just because he’s a musical genius, and not just because I grew up during the era when he and his band were the house musical act on The Tonight Show during Johnny Carson’s tenure, but because in my former career as a video and media professional I was on the camera crew at The Grand Palace in Branson, Missouri, when Doc and the band came through town for a performance. (What? Doc Severinsen playing in Branson? No, really, there’s even newspaper evidence.) It was fully as cool as one would have hoped, and I even had a chance to chat with the band backstage. There was no “Crimson King” in their set, though. Which is probably for the best, since I strongly suspect this song would have melted the minds of that mostly Southern and Midwestern audience who had come to Branson mainly for country music and a big dose of manufactured nostalgia.
One Nation under Many Gods: In a Fractious and Fractured Political Age, New Age Mysticism Still Unites Americans
A version of the reverse of the Great Seal of the United States printed in a 1909 U.S. Government booklet on the Great Seal. According to Henry A. Wallace, this was the version that caught his eye, leading him to suggest to President Franklin Roosevelt that the design be put on a coin, at which point Roosevelt decided to put it on the back of the dollar bill instead.
A newly published article at Salon by Mitch Horowitz is typically insightful and well-written, and well worth your time. And despite the headline, it’s not really about Steve Bannon. I mean, yes, it does contain the revelation that Horowitz knows Bannon, and that his view of the man diverges sharply from the widespread popular one that reigns in the mass media:
Although the media have characterized Bannon as the Disraeli of the dark side following his rise to power in the Trump administration, I knew him, and still do, as a deeply read and erudite observer of the American religious scene, with a keen appetite for mystical thought.
But the article’s overall topic is much broader, as indicated in the provided editorial teaser: “If you think New Age alternative spirituality is solely the domain of lefty hippies, you don’t know your history.” In just under two thousand words Horowitz discusses such things as the influence of Manly P. Hall on Ronald Reagan, Madame Blavatsky’s promulgation of the idea of “America as the catalyst for a revolution in human potential,” Donald Trump’s association with Norman Vincent Peale, FDR’s decision to put the eye-and-pyramid of the Great Seal of the United States on the dollar bill, Hillary Clinton’s visioneering meetings with Jean Houston (who once told Bill Clinton that he was an “undeveloped shaman,” at which point he got up and walked out), and more. Horowitz’s basic point is that none of this represents a conspiracy, notwithstanding the claims of the paranoid conspiracy theorizing crowd:
Rather than fomenting secrecy or subterfuge, America’s embrace of esotericism is often characterized by a chin-out earnestness, something that many observers and conspiracy-mongers miss. . . . Today, cable television producers and radio hosts often urge me to postulate some kind of occult “pact” between the Bushes and the dark side (cue up Skull and Bones). But such things are fantasy. The truth is, Americans have always been, well, a little strange. As a historian, I feel affection for that aspect of American life. Shadowy figures have long hung around the fringes of power in many nations; but rarely have they done so with the ingenuousness and transparency of those I’ve been considering.
And to cap it off, he ends on a note that is positively eloquent and inspiring:
If there is a central principle in American life, one valued across our political spectrum, it is a belief in the protection of the individual search for meaning. The presence and persistence of esoteric and unusual religious ideas in our political culture, including in its most conservative quarters, serves as evidence that that core principle is still working. In the U.S. military, religiously observant service members and veterans can now choose among more than 65 “emblems of belief,” including pentagrams, druidic symbols and every variety of mystical insignia. We are truly one nation under many gods — a fact that unites us across our fractured political divide.
The following paragraphs are from a talk delivered by Pinboard founder Maciej Cegłowski at the recent Emerging Technologies for the Enterprise conference in Philadelphia. Citing as Exhibit A the colossal train wreck that was the 2016 American presidential election, Cegłowski basically explains how, in the current version of the Internet that has emerged over the past decade-plus, we have collectively created a technology that is perfectly calibrated for undermining Western democratic societies and ideals.
But as incisive as his analysis is, I seriously doubt that his (equally incisive) proposed solutions, described later in the piece, will ever be implemented to any meaningful extent. I mean, if we’re going to employ the explicitly Frankensteinian metaphor of “building a monster,” then it’s important to bear in mind that Victor Frankenstein and his wretched creation did not find their way to anything resembling a happy ending. (And note that Cegłowski himself acknowledges as much at the end of his piece when he closes his discussion of proposed solutions by asserting that “even though we’re likely to fail, all we can do is try.”)
This year especially there’s an uncomfortable feeling in the tech industry that we did something wrong, that in following our credo of “move fast and break things,” some of what we knocked down were the load-bearing walls of our democracy. . . .
A question few are asking is whether the tools of mass surveillance and social control we spent the last decade building could have had anything to do with the debacle of the 2017 [sic] election, or whether destroying local journalism and making national journalism so dependent on our platforms was, in retrospect, a good idea. . . .
We built the commercial internet by mastering techniques of persuasion and surveillance that we’ve extended to billions of people, including essentially the entire population of the Western democracies. But admitting that this tool of social control might be conducive to authoritarianism is not something we’re ready to face. After all, we’re good people. We like freedom. How could we have built tools that subvert it? . . .
The economic basis of the Internet is surveillance. Every interaction with a computing device leaves a data trail, and whole industries exist to consume this data. Unlike dystopian visions from the past, this surveillance is not just being conducted by governments or faceless corporations. Instead, it’s the work of a small number of sympathetic tech companies with likable founders, whose real dream is to build robots and Mars rockets and do cool things that make the world better. Surveillance just pays the bills. . . .
Orwell imagined a world in which the state could shamelessly rewrite the past. The Internet has taught us that people are happy to do this work themselves, provided they have their peer group with them, and a common enemy to unite against. They will happily construct alternative realities for themselves, and adjust them as necessary to fit the changing facts . . . .
A lot of what we call “disruption” in the tech industry has just been killing flawed but established institutions, and mining them for parts. When we do this, we make a dangerous assumption about our ability to undo our own bad decisions, or the time span required to build institutions that match the needs of new realities.
Right now, a small caste of programmers is in charge of the surveillance economy, and has broad latitude to change it. But this situation will not last for long. The kinds of black-box machine learning that have been so successful in the age of mass surveillance are going to become commoditized and will no longer require skilled artisans to deploy. . . .
Unless something happens to mobilize the tech workforce, or unless the advertising bubble finally bursts, we can expect the weird, topsy-turvy status quo of 2017 to solidify into the new reality.
Here in North Texas we’re currently experiencing the warmest start to a year on record. This comes on the heels of the warmest winter in Texas history. A few years ago we had the dramatic wildfire apocalypse — enabled by an epic drought — that engulfed huge portions of the state, and that had me nervously watching huge plumes of smoke billow up from behind the hillside in back of my house. The drought was ended by historic flooding. The same year as the floods, a positively crazy chain of severe spring thunderstorms tore right through the area where my family and I live, spawning a line of repeated tornadoes, one after the other, all afternoon and overnight. That kind of thing has always been more common back in the Missouri Ozarks where I’m from. Nor was the perception of something different down here merely a subjective one; 2015 ended up being a record year for tornadoes in Texas. Last year there was more severe flooding, including right where I live. Thus far, my entire time in Texas has been marked by one natural disaster after another. And to think, one reason my family and I moved down here in the first place was to leave behind the increasingly severe weather in Missouri, especially the brutal winters, where crippling ice storms have become much more frequent during the past ten to fifteen years than they were during my entire previous life up there.
So in light of such things, this meditation in The New York Times Magazine on not just the future but the present reality of climate change really hits home.
The future we’ve been warned about is beginning to saturate the present. We tend to imagine climate change as a destroyer. But it also traffics in disruption, disarray: increasingly frequent and more powerful storms and droughts; heightened flooding; expanded ranges of pests turning forests into fuel for wildfires; stretches of inhospitable heat. So many facets of our existence — agriculture, transportation, cities and the architecture they spawned — were designed to suit specific environments. Now they are being slowly transplanted into different, more volatile ones, without ever actually moving. . . .
We seem able to normalize catastrophes as we absorb them, a phenomenon that points to what Peter Kahn, a professor of psychology at the University of Washington, calls “environmental generational amnesia.” Each generation, Kahn argues, can recognize only the ecological changes its members witness during their lifetimes. . . .
Scenarios that might sound dystopian or satirical as broad-strokes future projections unassumingly materialize as reality. Last year, melting permafrost in Siberia released a strain of anthrax, which had been sealed in a frozen reindeer carcass, sickening 100 people and killing one child. In July 2015, during the hottest month ever recorded on earth (until the following year), and the hottest day ever recorded in England (until the following summer), the Guardian newspaper had to shut down its live-blogging of the heat wave when the servers overheated. And low-lying cities around the world are experiencing increased “clear-sky flooding,” in which streets or entire neighborhoods are washed out temporarily by high tides and storm surges. Parts of Washington now experience flooding 30 days a year, a figure that has roughly quadrupled since 1960. In Wilmington, N.C., the number is 90 days. But scientists and city planners have conjured a term of art that defuses that astonishing reality: “nuisance flooding,” they call it.
Kahn calls our environmental generational amnesia “one of the central psychological problems of our lifetime,” because it obscures the magnitude of so many concrete problems. You can wind up not looking away, exactly, but zoomed in too tightly to see things for what they are. Still, the tide is always rising in the background, swallowing something. And the longer you live, the more anxiously trapped you may feel between the losses already sustained and the ones you see coming. . . .
The future is always somebody else’s present — it will very likely feel as authentic, and only as horrific, as our moment does to us. But the present is also somebody else’s future: We are already standing on someone else’s ludicrous map. Except none of us are in on the joke, and I’m guessing that it won’t feel funny any time soon.
Here’s the ever-reliable Nick Ripatrazone discussing the inspirational influence of Marshall McLuhan on David Cronenberg as the latter was conceiving and making 1983’s Videodrome, which Ripatrazone characterizes — correctly, I think — as “perfect viewing for 2017 — the year a man baptized by television becomes president.” The article also provides an able introduction to McLuhan’s legacy, reputation, and influence.
In his audio commentary for the film, Cronenberg admits that the professor [Brian O’Blivion, who runs the sinister Videodrome broadcast and its attendant “mission” for homeless people, the Cathode Ray Mission] was inspired by the “communications guru” Marshall McLuhan. McLuhan taught at the University of Toronto while Cronenberg attended, but to his “everlasting regret,” he never took a course with the media icon. Cronenberg said that McLuhan’s “influence was felt everywhere at the university” — a mystical-tinged description that McLuhan would have appreciated. . . .
McLuhan was a scholar of James Joyce, a purveyor of print. He documented the advent of the electric eye, but he didn’t desire it. Although he had “nothing but distaste for the process of change,” he said you had to “keep cool during our descent into the maelstrom.” Max can’t keep cool. He is infected by Videodrome; the show’s reality subverts its unreal medium. Max discovers that Professor O’Blivion helped create Videodrome because “he saw it as the next phase in the evolution of man as a technological animal.” Sustained viewing of Videodrome creates tumors and hallucinations. Max is being played by the remaining originators of Videodrome, whose philosophy sounds downright familiar: “North America’s getting soft, and the rest of the world is getting tough. We’re entering savage new times, and we’re going to have to be pure and direct and strong if we’re going to survive them.” Videodrome is a way to identify the derelicts by giving them what they most crave — real violence — and then incapacitate them into submission.
McLuhan’s idea that “mental breakdown is the very common result of uprooting and inundation with new information,” and his simultaneous interest in, and skepticism of, the “electric eye” finds a gory literalism in Cronenberg’s film. Videodrome is what happens when a self-described existentialist atheist channels McLuhan — but makes McLuhan’s Catholic-infused media analysis more secular and raw. Cronenberg was able to foretell our electronic evolution, the quasi-Eucharistic way we “taste and see” the Internet. The film’s gore and gush might now strike us as campy, but Videodrome shows what happens when mind and device become one. “Death is not the end,” one character says, but “the beginning of the new flesh.” We’re already there.
Love this video essay from filmmaker (and former Buddhist Studies scholar) Daniel Clarkson Fisher. Perhaps you will, too. It’s great stuff, excellently conceived and executed. I may not agree with absolutely all of the political statements made in it, but I agree with enough of them. And anyway, it’s about Carpenter’s They Live. So what else matters?
From the included interviews:
Slavoj Žižek: They Live from 1988 is definitely one of the forgotten masterpieces of the Hollywood Left. It tells the story of John Nada — nada, of course, in Spanish, means “nothing,” a pure subject deprived of all substantial content — a homeless worker in L.A. who, drifting around, one day enters an abandoned church and finds there a strange box full of sunglasses. And when he puts one of them on, walking along the L.A. streets, he discovers something weird: that these glasses function like “critique of ideology” glasses. They allow you to see the real message beneath all the propaganda, publicity glitz, posters, and so on.
John Carpenter: I was reflecting on a lot of the values that I saw around me at the time, mainly inspired by Ronald Reagan’s conservative revolution. There was a great deal of obsession with greed and making a lot of money, and some of the values that I grew up with had been pushed aside. So I decided to scream out in the middle of the night and make a statement about that. And They Live is partially a political statement. It’s partially a tract on the world that we live in today. And as a matter of fact, right now it’s even more true than it was then.
Here’s science writer Carrie Arnold, in a newly published article at Aeon titled “Watchers of the Earth,” discussing the possibility that indigenous myths may carry warning signals for natural disasters:
Shortly before 8am on 26 December 2004, the cicadas fell silent and the ground shook in dismay. The Moken, an isolated tribe on the Andaman Islands in the Indian Ocean, knew that the Laboon, the ‘wave that eats people’, had stirred from his ocean lair. The Moken also knew what was next: a towering wall of water washing over their island, cleansing it of all that was evil and impure. To heed the Laboon’s warning signs, elders told their children, run to high ground.
The tiny Andaman and Nicobar Islands were directly in the path of the tsunami generated by the magnitude 9.1 earthquake off the coast of Sumatra. Final totals put the islands’ death toll at 1,879, with another 5,600 people missing. When relief workers finally came ashore, however, they realised that the death toll was skewed. The islanders who had heard the stories about the Laboon or similar mythological figures survived the tsunami essentially unscathed. Most of the casualties occurred in the southern Nicobar Islands. Part of the reason was the area’s geography, which generated a higher wave. But also at the root was the lack of a legacy; many residents in the city of Port Blair were outsiders, leaving them with no indigenous tsunami warning system to guide them to higher ground.
Humanity has always courted disaster. We have lived, died and even thrived alongside vengeful volcanoes and merciless waves. Some disasters arrive without warning, leaving survival to luck. Often, however, there is a small window of time giving people a chance to escape. Learning how to crack open this window can be difficult when a given catastrophe strikes once every few generations. So humans passed down stories through the ages that helped cultures to cope when disaster inevitably struck. These stories were fodder for anthropologists and social scientists, but in the past decade, geologists have begun to pay more attention to how indigenous peoples understood, and prepared for, disaster. These stories, which couched myth in metaphor, could ultimately help scientists prepare for cataclysms to come.
Reading this triggered a flood of associated thoughts this morning, mostly related to things I’ve read elsewhere that resonate with it. Although the basic focus is different, for me this article somewhat recalls a starkly apocalyptic and millenarian passage from the ending of Benjamin Hoff’s The Te of Piglet (1992), a book that many readers found off-putting for its semi-grimness, which represented a departure from the more charmingly whimsical presentation of Taoism that Hoff had adopted in its predecessor, The Tao of Pooh.