Category Archives: Society & Culture
This brief video essay on the source of our collective craving for “the awful futures of apocalyptic fiction” is skillfully executed and thought-provoking, a worthwhile investment of five reflective minutes. Here’s the description:
In the first two decades of the new millennium, stories of the post-apocalypse have permeated pop culture, from books such as Cormac McCarthy’s The Road (2006), Paolo Bacigalupi’s The Windup Girl (2009) and Emily St John Mandel’s Station Eleven (2014) to films and TV programmes such as The Walking Dead (2010-), the Hunger Games series (2012-15) and Mad Max: Fury Road (2015). While post-apocalyptic fictions of previous eras largely served as cautionary tales — against nuclear brinksmanship in On the Beach (1959) or weaponised biology in The Stand (1978) — today’s versions of these tales depict less alterable, more oblique and diffuse visions of our doom. So why can’t we seem to get enough of humanity’s unavoidable collapse and its bleak aftermath?
Dispatches from the Ruins reflects on what these stories — set among crumbling buildings, overgrown lots and barren wastelands — might be telling us about modern fears and fantasies. This Aeon original video is adapted from an Aeon essay by the US writer Frank Bures. Bures is also the author of The Geography of Madness (2016), a book about cultural syndromes across the world. His work has been included in the Best American Travel Writing and appeared in Harper’s, Lapham’s Quarterly and the Washington Post Magazine, among others.
One Nation under Many Gods: In a Fractious and Fractured Political Age, New Age Mysticism Still Unites Americans
A version of the reverse of the Great Seal of the United States printed in a 1909 U.S. Government booklet on the Great Seal. According to Henry A. Wallace, this was the version that caught his eye, leading him to suggest to President Franklin Roosevelt that the design be put on a coin, at which point Roosevelt decided to put it on the back of the dollar bill.
A newly published article at Salon by Mitch Horowitz is typically insightful and well-written, and well worth your time. And despite the headline, it’s not really about Steve Bannon. I mean, yes, it does contain the revelation that Horowitz knows Bannon, and that his view of the man diverges sharply from the widespread popular one that reigns in the mass media:
Although the media have characterized Bannon as the Disraeli of the dark side following his rise to power in the Trump administration, I knew him, and still do, as a deeply read and erudite observer of the American religious scene, with a keen appetite for mystical thought.
But the article’s overall topic is much broader, as indicated in the provided editorial teaser: “If you think New Age alternative spirituality is solely the domain of lefty hippies, you don’t know your history.” In just under two thousand words Horowitz discusses such things as the influence of Manly P. Hall on Ronald Reagan, Madame Blavatsky’s promulgation of the idea of “America as the catalyst for a revolution in human potential,” Donald Trump’s association with Norman Vincent Peale, FDR’s decision to put the eye-and-pyramid of the Great Seal of the United States on the dollar bill, Hillary Clinton’s visioneering meetings with Jean Houston (who once told Bill Clinton that he was an “undeveloped shaman,” at which point he got up and walked out), and more. Horowitz’s basic point is that none of this represents a conspiracy, notwithstanding the claims of the paranoid conspiracy theorizing crowd:
Rather than fomenting secrecy or subterfuge, America’s embrace of esotericism is often characterized by a chin-out earnestness, something that many observers and conspiracy-mongers miss. . . . Today, cable television producers and radio hosts often urge me to postulate some kind of occult “pact” between the Bushes and the dark side (cue up Skull and Bones). But such things are fantasy. The truth is, Americans have always been, well, a little strange. As a historian, I feel affection for that aspect of American life. Shadowy figures have long hung around the fringes of power in many nations; but rarely have they done so with the ingenuousness and transparency of those I’ve been considering.
And to cap it off, he ends on a note that is positively eloquent and inspiring:
If there is a central principle in American life, one valued across our political spectrum, it is a belief in the protection of the individual search for meaning. The presence and persistence of esoteric and unusual religious ideas in our political culture, including in its most conservative quarters, serves as evidence that that core principle is still working. In the U.S. military, religiously observant service members and veterans can now choose among more than 65 “emblems of belief,” including pentagrams, druidic symbols and every variety of mystical insignia. We are truly one nation under many gods — a fact that unites us across our fractured political divide.
The following paragraphs are from a talk delivered by Pinboard founder Maciej Cegłowski at the recent Emerging Technologies for the Enterprise conference in Philadelphia. Citing as Exhibit A the colossal train wreck that was the 2016 American presidential election, Cegłowski basically explains how, in the current version of the Internet that has emerged over the past decade-plus, we have collectively created a technology that is perfectly calibrated for undermining Western democratic societies and ideals.
But as incisive as his analysis is, I seriously doubt that his (equally incisive) proposed solutions, described later in the piece, will ever be implemented to any meaningful extent. I mean, if we’re going to employ the explicitly Frankensteinian metaphor of “building a monster,” then it’s important to bear in mind that Victor Frankenstein and his wretched creation did not find their way to anything resembling a happy ending. (And note that Cegłowski himself acknowledges as much at the end of his piece when he closes his discussion of proposed solutions by asserting that “even though we’re likely to fail, all we can do is try.”)
This year especially there’s an uncomfortable feeling in the tech industry that we did something wrong, that in following our credo of “move fast and break things,” some of what we knocked down were the load-bearing walls of our democracy. . . .
A question few are asking is whether the tools of mass surveillance and social control we spent the last decade building could have had anything to do with the debacle of the 2017 [sic] election, or whether destroying local journalism and making national journalism so dependent on our platforms was, in retrospect, a good idea. . . .
We built the commercial internet by mastering techniques of persuasion and surveillance that we’ve extended to billions of people, including essentially the entire population of the Western democracies. But admitting that this tool of social control might be conducive to authoritarianism is not something we’re ready to face. After all, we’re good people. We like freedom. How could we have built tools that subvert it? . . .
The economic basis of the Internet is surveillance. Every interaction with a computing device leaves a data trail, and whole industries exist to consume this data. Unlike dystopian visions from the past, this surveillance is not just being conducted by governments or faceless corporations. Instead, it’s the work of a small number of sympathetic tech companies with likable founders, whose real dream is to build robots and Mars rockets and do cool things that make the world better. Surveillance just pays the bills. . . .
Orwell imagined a world in which the state could shamelessly rewrite the past. The Internet has taught us that people are happy to do this work themselves, provided they have their peer group with them, and a common enemy to unite against. They will happily construct alternative realities for themselves, and adjust them as necessary to fit the changing facts . . . .
A lot of what we call “disruption” in the tech industry has just been killing flawed but established institutions, and mining them for parts. When we do this, we make a dangerous assumption about our ability to undo our own bad decisions, or the time span required to build institutions that match the needs of new realities.
Right now, a small caste of programmers is in charge of the surveillance economy, and has broad latitude to change it. But this situation will not last for long. The kinds of black-box machine learning that have been so successful in the age of mass surveillance are going to become commoditized and will no longer require skilled artisans to deploy. . . .
Unless something happens to mobilize the tech workforce, or unless the advertising bubble finally bursts, we can expect the weird, topsy-turvy status quo of 2017 to solidify into the new reality.
Love this video essay from filmmaker (and former Buddhist Studies scholar) Daniel Clarkson Fisher. Perhaps you will, too. It’s excellently conceived and executed. I don’t agree with absolutely all of the political statements made in it, but I agree with enough of them. And anyway, it’s about Carpenter’s They Live. So what else matters?
From the included interviews:
Slavoj Žižek: They Live from 1988 is definitely one of the forgotten masterpieces of the Hollywood Left. It tells the story of John Nada — nada, of course, in Spanish, means “nothing,” a pure subject deprived of all substantial content — a homeless worker in L.A. who, drifting around, one day enters an abandoned church and finds there a strange box full of sunglasses. And when he puts one of them on, walking along the L.A. streets, he discovers something weird: that these glasses function like “critique of ideology” glasses. They allow you to see the real message beneath all the propaganda, publicity glitz, posters, and so on.
John Carpenter: I was reflecting on a lot of the values that I saw around me at the time, mainly inspired by Ronald Reagan’s conservative revolution. There was a great deal of obsession with greed and making a lot of money, and some of the values that I grew up with had been pushed aside. So I decided to scream out in the middle of the night and make a statement about that. And They Live is partially a political statement. It’s partially a tract on the world that we live in today. And as a matter of fact, right now it’s even more true than it was then.
Here’s science writer Carrie Arnold, in a newly published article at Aeon titled “Watchers of the Earth,” discussing the possibility that indigenous myths may carry warning signals for natural disasters:
Shortly before 8am on 26 December 2004, the cicadas fell silent and the ground shook in dismay. The Moken, an isolated tribe on the Andaman Islands in the Indian Ocean, knew that the Laboon, the ‘wave that eats people’, had stirred from his ocean lair. The Moken also knew what was next: a towering wall of water washing over their island, cleansing it of all that was evil and impure. To heed the Laboon’s warning signs, elders told their children, run to high ground.
The tiny Andaman and Nicobar Islands were directly in the path of the tsunami generated by the magnitude 9.1 earthquake off the coast of Sumatra. Final totals put the islands’ death toll at 1,879, with another 5,600 people missing. When relief workers finally came ashore, however, they realised that the death toll was skewed. The islanders who had heard the stories about the Laboon or similar mythological figures survived the tsunami essentially unscathed. Most of the casualties occurred in the southern Nicobar Islands. Part of the reason was the area’s geography, which generated a higher wave. But also at the root was the lack of a legacy; many residents in the city of Port Blair were outsiders, leaving them with no indigenous tsunami warning system to guide them to higher ground.
Humanity has always courted disaster. We have lived, died and even thrived alongside vengeful volcanoes and merciless waves. Some disasters arrive without warning, leaving survival to luck. Often, however, there is a small window of time giving people a chance to escape. Learning how to crack open this window can be difficult when a given catastrophe strikes once every few generations. So humans passed down stories through the ages that helped cultures to cope when disaster inevitably struck. These stories were fodder for anthropologists and social scientists, but in the past decade, geologists have begun to pay more attention to how indigenous peoples understood, and prepared for, disaster. These stories, which couched myth in metaphor, could ultimately help scientists prepare for cataclysms to come.
Reading this triggered a flood of associated thoughts this morning, mostly related to things I’ve read elsewhere that resonate with it. Although the basic focus is different, for me this article somewhat recalls a starkly apocalyptic and millenarian passage from the ending of Benjamin Hoff’s The Te of Piglet (1992), a book that many readers found off-putting for its semi-grimness, which represented a departure from the more charmingly whimsical presentation of Taoism that Hoff had adopted in its predecessor, The Tao of Pooh.
The next time somebody tries to recommend a TED talk to me, I may recommend this piece, or else the book it’s excerpted from, Daniel Drezner’s The Ideas Industry: How Pessimists, Partisans, and Plutocrats are Transforming the Marketplace of Ideas. It’s not that there aren’t any worthwhile TED talks, of course. But Drezner’s words hit home in this era of “thought leaders.”
When I refer to “public intellectuals,” I mean experts who are versed and trained enough to be able to comment on a wide range of public policy issues. The public intellectual serves a vital purpose in democratic discourse: exposing shibboleths masquerading as accepted wisdom….
How is a thought leader distinct from a public intellectual? A thought leader is an intellectual evangelist. Thought leaders develop their own singular lens to explain the world, and then proselytize that worldview to anyone within earshot….
Public intellectuals know enough about many things to be able to point out intellectual charlatans. Thought leaders know one big thing and believe that their important idea will change the world.
What is happening is that the marketplace of ideas has turned into the Ideas Industry. The twenty-first century public sphere is bigger, louder, and more lucrative than ever before. A surge of high-level panels, conference circuits, and speaker confabs allows intellectuals to mix with other members of the political, economic, and cultural elite in a way that would have been inconceivable a half century ago….
As America’s elite has gotten richer and richer, they can afford to do anything they want. A century ago, America’s plutocrats converted their wealth into university endowments, think tanks, or philanthropic foundations. Today’s wealthy set up their own intellectual salons and publishing platforms—and they are not hands-off about the intellectual output of their namesakes.
FULL ARTICLE: The Decline of Public Intellectuals
Greetings, Teeming Brainers. I’m just peeking in from the digital wings, amid much ongoing blog silence, to observe that many of the issues and developments — sociocultural, technological, and more — that I began furiously tracking here way back in 2006 are continuing to head in pretty much the same direction. A case in point is provided by the alarming information, presented in a frankly alarmed tone, that appears in this new piece from Scientific American (originally published in SA’s German-language sister publication, Spektrum der Wissenschaft):
Everything started quite harmlessly. Search engines and recommendation platforms began to offer us personalised suggestions for products and services. This information is based on personal and meta-data that has been gathered from previous searches, purchases and mobility behaviour, as well as social interactions. While officially, the identity of the user is protected, it can, in practice, be inferred quite easily. Today, algorithms know pretty well what we do, what we think and how we feel — possibly even better than our friends and family or even ourselves. Often the recommendations we are offered fit so well that the resulting decisions feel as if they were our own, even though they are actually not our decisions. In fact, we are being remotely controlled ever more successfully in this manner. The more is known about us, the less likely our choices are to be free and not predetermined by others.
But it won’t stop there. Some software platforms are moving towards “persuasive computing.” In the future, using sophisticated manipulation technologies, these platforms will be able to steer us through entire courses of action, be it for the execution of complex work processes or to generate free content for Internet platforms, from which corporations earn billions. The trend goes from programming computers to programming people. . . .
[I]t can be said that we are now at a crossroads. Big data, artificial intelligence, cybernetics and behavioral economics are shaping our society — for better or worse. If such widespread technologies are not compatible with our society’s core values, sooner or later they will cause extensive damage. They could lead to an automated society with totalitarian features. In the worst case, a centralized artificial intelligence would control what we know, what we think and how we act. We are at the historic moment, where we have to decide on the right path — a path that allows us all to benefit from the digital revolution.
Oh, and for a concrete illustration of all the above, check this out:
How would behavioural and social control impact our lives? The concept of a Citizen Score, which is now being implemented in China, gives an idea. There, all citizens are rated on a one-dimensional ranking scale. Everything they do gives plus or minus points. This is not only aimed at mass surveillance. The score depends on an individual’s clicks on the Internet and their politically-correct conduct or not, and it determines their credit terms, their access to certain jobs, and travel visas. Therefore, the Citizen Score is about behavioural and social control. Even the behaviour of friends and acquaintances affects this score, i.e. the principle of clan liability is also applied: everyone becomes both a guardian of virtue and a kind of snooping informant, at the same time; unorthodox thinkers are isolated. Were similar principles to spread in democratic countries, it would be ultimately irrelevant whether it was the state or influential companies that set the rules. In both cases, the pillars of democracy would be directly threatened.
FULL ARTICLE: “Will Democracy Survive Big Data and Artificial Intelligence?”
Of course, none of this is real news to anybody who has been paying attention. It’s just something that people like me, and maybe like you, find troubling enough to highlight and comment on. And maybe, in the end, Cipher from The Matrix will turn out to have been right: Maybe ignorance really is bliss. Because from where I’m sitting, there doesn’t appear to be anything one can do to stop this steamrollering, metastasizing, runaway-train-like dystopian trend. Talking about it is just that: talk. Which is one reason why I’ve lost a portion of the will that originally kept me blogging here for so many years. You can only play the role of Cassandra for so long before the intrinsic attraction begins to dissipate.
Riveting and unsettling: Here’s Robert Stolz, Associate Professor of History at the University of Virginia, drawing on a recent interview with nuclear engineer and anti-nuclear activist Dr. Hiroaki Koide to write in The Asia-Pacific Journal about the truly cosmic-horrific implications of radiation exposure in our present nuclear age, as related not just to events like Fukushima and Chernobyl but to the entire unfolding of this new era that began with the extensive nuclear tests conducted in the middle decades of the twentieth century. And he writes in ways that recall the dark musings of, say, Eugene Thacker on the literal unthinkability of the forces we have now unleashed, complete with references to the deep tradition of cosmic and supernatural horror fiction, including a direct quote from Ligotti’s The Conspiracy against the Human Race.
Because of the very nature of radiation, namely its spatial and temporal scales, in many ways we lack a language adequate to a world lorded over by radiation. The literary genre called Cosmic Horror of Algernon Blackwood or H. P. Lovecraft has long attempted to grasp the frightening realities of unleashing a force that operates on such a-human scales and temporalities as plutonium-239 (half-life over 24,000 years) or uranium-235 (half-life over 700 million years). The Horror writer and arch-pessimist Thomas Ligotti perhaps comes closest to describing the implications of unleashing truly astronomical forces into human everyday life when he writes:
“Such is the motif of supernatural horror: Something terrible in its being comes forward and makes its claim as a shareholder in our reality, or what we think is our reality and ours alone. It may be an emissary from the grave, or an esoteric monstrosity. . . . It may be the offspring of a scientific experiment with unintended consequences. . . . Or it may be a world unto itself of pure morbidity, one suffused with a profound sense of doom without a name — Edgar Allan Poe’s world.”
In our present of 2016 the sense of doom does have a name: Hoshanō sekai — Radiation’s World. Radiation’s World announces that the earth — or at least large parts of it — is no longer exclusively ours. We have rendered huge spaces of the planet off-limits for time periods beyond any scale of recorded history. Parallel to but different than the rapacious depletion of the natural world from forests to cod stocks to fossil fuels that took millennia to build up but are consumed in decades, as we mine deeper temporalities in pursuit of open-ended consumption we have also unleashed anti-human temporalities incompatible with continued production or consumption. It is these spaces that are now ruled by radiation and are no longer part of human society. Like the old Horror trope, we have unleashed forces that we cannot contain. But unlike Horror, there is no discrete monster to kill at the end. Pessimism is surely called for.
— Robert Stolz, “Nuclear Disasters: A Much Greater Event Has Already Taken Place,” The Asia-Pacific Journal, Vol. 14, Issue 16, No. 3 (March 5, 2016)
Here are some choice passages from an insight-rich essay by historian James McWilliams at The American Scholar, in which he discusses two major and complementary options for dealing with digital technology’s epochal assault on the stable self: first, take serious and substantial steps to humanize the digital world; second, retain (or return to) a serious relationship with the physical book.
The underlying concern with the Internet is not whether it will fragment our attention spans or mold our minds to the bit-work of modernity. In the end, it will likely do both. The deeper question is what can be done when we realize that we want some control over the exchange between our brains and the Web, that we want to protect our deeper sense of self from digital media’s dominance over modern life. . . .
The essence of our dilemma, one that weighs especially heavily on Generation Xers and millennials, is that the digital world disarms our ability to oppose it while luring us with assurances of convenience. It’s critical not only that we identify this process but also that we fully understand how digital media co-opt our sense of self while inhibiting our ability to reclaim it. . . .
This is not to suggest that we should aim to abolish digital media or disconnect completely — not at all. Instead, we must learn to humanize digital life as actively as we’ve digitized human life.
No one solution can restore equity to the human-digital relationship. Still, whatever means we pursue must be readily available (and cheap) and offer the convenience of information, entertainment, and social engagement while promoting identity-building experiences that anchor the self in society. Plato might not have approved, but the tool that’s best suited to achieve these goals today is an object so simple that I can almost feel the eye-rolls coming in response to such a nostalgic fix for a modern dilemma: the book. Saving the self in the age of the selfie may require nothing more or less complicated than recovering the lost art of serious reading. . . .
[A]s the fog of digital life descends, making us increasingly stressed out and unempathetic, solipsistic yet globally connected, and seeking solutions in the crucible of our own angst, it’s worth reiterating what reading does for the searching self. A physical book, which liberates us from pop-up ads and the temptation to click into oblivion when the prose gets dull, represents everything that an identity requires to discover Heidegger’s nearness amid digital tyranny. It offers immersion into inner experience, engagement in impassioned discussion, humility within a larger community, and the affirmation of an ineluctable quest to experience the consciousness of fellow humans. In this way, books can save us.
Full text: “Saving the Self in the Age of the Selfie”
Joan W. Scott in The Nation:
“Civility” has become a watchword for academic administrators. Earlier this year, Inside Higher Ed released a survey of college and university chief academic officers, which found that “a majority of provosts are concerned about declining faculty civility in American higher education.” Most of these provosts also “believe that civility is a legitimate criterion in hiring and evaluating faculty members,” and most think that faculty incivility is directed primarily at administrators. The survey brought into the open what has perhaps long been an unarticulated requirement for promotion and tenure: a certain kind of deference to those in power.
But what exactly is civility — and is it a prerequisite for a vibrant intellectual climate? As it turns out, the definitions on offer are porous and vague. University of Illinois professor Cary Nelson, who supported the decision not to hire Salaita, sees it as a “reluctance to indulge in mutual hatred,” thereby placing a limit on violence and campus warfare. Others stress courteous and respectful behavior and its concomitants: comfort, safety, and security. The University of Missouri’s “Show Me Respect” project includes a “toolbox” that offers 20 ways to achieve civility (including the reminder to “do unto others as you would have them do unto you”). At the University of Wisconsin, Oshkosh, a 2011 conference offered these words of wisdom: “Academic freedom and free speech require open, safe, civil and collegial campus environments.” And a statement from a University of Maryland discussion paper on civility in 2013 defines it “simply as ‘niceness to others.’… Additionally, the definition may be used broadly to spur discussions on how ‘nice guys and gals finish first’ and how cordiality and kindness can be tracked across campus to ensure faculty, staff, and students are indeed playing nice.”
The attempts to secure the comfort and safety of students — now recognized for their economic value as paying clients who need to be satisfied — are subjugating language and thinking to their own ends. These dictates seem to know no limits and are evident in other policies, such as the call for “trigger warnings” in college classrooms. Professors are being asked by the representatives of some students or groups — and by the anxious deans who rush to satisfy their complaints — to avoid assigning material that might provoke flashbacks or even attention to discomforting violence. The demand for trigger warnings has the same intent as the emphasis on comfort and civility in the Salaita affair and the statement to the UC Berkeley community by Dirks: to stifle thought on the part of both teachers and students who might otherwise express opinions that could make others “uncomfortable.”
All of these efforts presume a certain benign self-evidence for the use of the term “civility.” As the University of Maryland statement puts it, “niceness” is “easily understood by all parties”: We know civility when we see it. Left aside in these invocations are not only interpretive differences among individuals and groups (one man’s or woman’s presumed civility may strike another as uncivil), but also the history of the term. Although, as with any word, the meanings of “civility” have changed, the concept still carries traces of its earlier use. I’d argue further that although the contexts and specific applications have varied over time, the notion of civility consistently establishes relations of power whenever it is invoked. Moreover, it is always the powerful who determine its meaning — one that, whatever its specific content, demeans and delegitimizes those who do not meet its test.
MORE: “The New Thought Police”