The following paragraphs are from a talk delivered by Pinboard founder Maciej Cegłowski at the recent Emerging Technologies for the Enterprise conference in Philadelphia. Citing as Exhibit A the colossal train wreck that was the 2016 American presidential election, Cegłowski basically explains how, in the current version of the Internet that has emerged over the past decade-plus, we have collectively created a technology that is perfectly calibrated for undermining Western democratic societies and ideals.
But as incisive as his analysis is, I seriously doubt that his (equally incisive) proposed solutions, described later in the piece, will ever be implemented to any meaningful extent. I mean, if we’re going to employ the explicitly Frankensteinian metaphor of “building a monster,” then it’s important to bear in mind that Victor Frankenstein and his wretched creation did not find their way to anything resembling a happy ending. (And note that Cegłowski himself acknowledges as much at the end of his piece when he closes his discussion of proposed solutions by asserting that “even though we’re likely to fail, all we can do is try.”)
This year especially there’s an uncomfortable feeling in the tech industry that we did something wrong, that in following our credo of “move fast and break things,” some of what we knocked down were the load-bearing walls of our democracy. . . .
A question few are asking is whether the tools of mass surveillance and social control we spent the last decade building could have had anything to do with the debacle of the 2017 [sic] election, or whether destroying local journalism and making national journalism so dependent on our platforms was, in retrospect, a good idea. . . .
We built the commercial internet by mastering techniques of persuasion and surveillance that we’ve extended to billions of people, including essentially the entire population of the Western democracies. But admitting that this tool of social control might be conducive to authoritarianism is not something we’re ready to face. After all, we’re good people. We like freedom. How could we have built tools that subvert it? . . .
The economic basis of the Internet is surveillance. Every interaction with a computing device leaves a data trail, and whole industries exist to consume this data. Unlike dystopian visions from the past, this surveillance is not just being conducted by governments or faceless corporations. Instead, it’s the work of a small number of sympathetic tech companies with likable founders, whose real dream is to build robots and Mars rockets and do cool things that make the world better. Surveillance just pays the bills. . . .
Orwell imagined a world in which the state could shamelessly rewrite the past. The Internet has taught us that people are happy to do this work themselves, provided they have their peer group with them, and a common enemy to unite against. They will happily construct alternative realities for themselves, and adjust them as necessary to fit the changing facts . . . .
A lot of what we call “disruption” in the tech industry has just been killing flawed but established institutions, and mining them for parts. When we do this, we make a dangerous assumption about our ability to undo our own bad decisions, or the time span required to build institutions that match the needs of new realities.
Right now, a small caste of programmers is in charge of the surveillance economy, and has broad latitude to change it. But this situation will not last for long. The kinds of black-box machine learning that have been so successful in the age of mass surveillance are going to become commoditized and will no longer require skilled artisans to deploy. . . .
Unless something happens to mobilize the tech workforce, or unless the advertising bubble finally bursts, we can expect the weird, topsy-turvy status quo of 2017 to solidify into the new reality.
Here in North Texas we’re currently experiencing the warmest start to a year on record. This comes on the heels of the warmest winter in Texas history. A few years ago we had the dramatic wildfire apocalypse — enabled by an epic drought — that engulfed huge portions of the state, and that had me nervously watching massive plumes of smoke billow up from behind the hillside in back of my house. The drought was ended by historic flooding. The same year as the floods, a positively crazy chain of severe spring thunderstorms tore right through the area where my family and I live, spawning a line of repeated tornadoes, one after the other, all afternoon and overnight. This is something that has always been more common back in the Missouri Ozarks where I’m from. Nor was the perception of something different down here merely a subjective one; 2015 ended up being a record year for tornadoes in Texas. Last year there was more severe flooding, including right where I live. Thus far, my entire time in Texas has been marked by one natural disaster after another. And to think, one reason my family and I moved down here in the first place was to leave behind the increasingly severe weather in Missouri, especially the brutal winters, where crippling ice storms have become much more frequent during the past ten to fifteen years than they were during my entire previous life up there.
So in light of such things, this meditation in The New York Times Magazine on not just the future but the present reality of climate change really hits home.
The future we’ve been warned about is beginning to saturate the present. We tend to imagine climate change as a destroyer. But it also traffics in disruption, disarray: increasingly frequent and more powerful storms and droughts; heightened flooding; expanded ranges of pests turning forests into fuel for wildfires; stretches of inhospitable heat. So many facets of our existence — agriculture, transportation, cities and the architecture they spawned — were designed to suit specific environments. Now they are being slowly transplanted into different, more volatile ones, without ever actually moving. . . .
We seem able to normalize catastrophes as we absorb them, a phenomenon that points to what Peter Kahn, a professor of psychology at the University of Washington, calls “environmental generational amnesia.” Each generation, Kahn argues, can recognize only the ecological changes its members witness during their lifetimes. . . .
Scenarios that might sound dystopian or satirical as broad-strokes future projections unassumingly materialize as reality. Last year, melting permafrost in Siberia released a strain of anthrax, which had been sealed in a frozen reindeer carcass, sickening 100 people and killing one child. In July 2015, during the hottest month ever recorded on earth (until the following year), and the hottest day ever recorded in England (until the following summer), the Guardian newspaper had to shut down its live-blogging of the heat wave when the servers overheated. And low-lying cities around the world are experiencing increased “clear-sky flooding,” in which streets or entire neighborhoods are washed out temporarily by high tides and storm surges. Parts of Washington now experience flooding 30 days a year, a figure that has roughly quadrupled since 1960. In Wilmington, N.C., the number is 90 days. But scientists and city planners have conjured a term of art that defuses that astonishing reality: “nuisance flooding,” they call it.
Kahn calls our environmental generational amnesia “one of the central psychological problems of our lifetime,” because it obscures the magnitude of so many concrete problems. You can wind up not looking away, exactly, but zoomed in too tightly to see things for what they are. Still, the tide is always rising in the background, swallowing something. And the longer you live, the more anxiously trapped you may feel between the losses already sustained and the ones you see coming. . . .
The future is always somebody else’s present — it will very likely feel as authentic, and only as horrific, as our moment does to us. But the present is also somebody else’s future: We are already standing on someone else’s ludicrous map. Except none of us are in on the joke, and I’m guessing that it won’t feel funny any time soon.
Here’s the ever-reliable Nick Ripatrazone discussing the inspirational influence of Marshall McLuhan on David Cronenberg as the latter was conceiving and making 1983’s Videodrome, which Ripatrazone characterizes — correctly, I think — as “perfect viewing for 2017 — the year a man baptized by television becomes president.” The article also provides an able introduction to McLuhan’s legacy, reputation, and influence.
In his audio commentary for the film, Cronenberg admits that the professor [Brian O’Blivion, who runs the sinister Videodrome broadcast and its attendant “mission” for homeless people, the Cathode Ray Mission] was inspired by the “communications guru” Marshall McLuhan. McLuhan taught at the University of Toronto while Cronenberg attended, but to his “everlasting regret,” he never took a course with the media icon. Cronenberg said that McLuhan’s “influence was felt everywhere at the university” — a mystical-tinged description that McLuhan would have appreciated. . . .
McLuhan was a scholar of James Joyce, a purveyor of print. He documented the advent of the electric eye, but he didn’t desire it. Although he had “nothing but distaste for the process of change,” he said you had to “keep cool during our descent into the maelstrom.” Max can’t keep cool. He is infected by Videodrome; the show’s reality subverts its unreal medium. Max discovers that Professor O’Blivion helped create Videodrome because “he saw it as the next phase in the evolution of man as a technological animal.” Sustained viewing of Videodrome creates tumors and hallucinations. Max is being played by the remaining originators of Videodrome, whose philosophy sounds downright familiar: “North America’s getting soft, and the rest of the world is getting tough. We’re entering savage new times, and we’re going to have to be pure and direct and strong if we’re going to survive them.” Videodrome is a way to identify the derelicts by giving them what they most crave — real violence — and then incapacitate them into submission.
McLuhan’s idea that “mental breakdown is the very common result of uprooting and inundation with new information,” and his simultaneous interest in, and skepticism of, the “electric eye” finds a gory literalism in Cronenberg’s film. Videodrome is what happens when a self-described existentialist atheist channels McLuhan — but makes McLuhan’s Catholic-infused media analysis more secular and raw. Cronenberg was able to foretell our electronic evolution, the quasi-Eucharistic way we “taste and see” the Internet. The film’s gore and gush might now strike us as campy, but Videodrome shows what happens when mind and device become one. “Death is not the end,” one character says, but “the beginning of the new flesh.” We’re already there.
Love this video essay from filmmaker (and former Buddhist Studies scholar) Daniel Clarkson Fisher. Perhaps you will, too. It’s great stuff, excellently conceived and executed. Perhaps I don’t agree with absolutely all of the political statements made in it. But I agree with enough of them. And anyway, it’s about Carpenter’s They Live. So what else matters?
From the included interviews:
Slavoj Žižek: They Live from 1988 is definitely one of the forgotten masterpieces of the Hollywood Left. It tells the story of John Nada — nada, of course, in Spanish, means “nothing,” a pure subject deprived of all substantial content — a homeless worker in L.A. who, drifting around, one day enters an abandoned church and finds there a strange box full of sunglasses. And when he puts one of them on, walking along the L.A. streets, he discovers something weird: that these glasses function like “critique of ideology” glasses. They allow you to see the real message beneath all the propaganda, publicity glitz, posters, and so on.
John Carpenter: I was reflecting on a lot of the values that I saw around me at the time, mainly inspired by Ronald Reagan’s conservative revolution. There was a great deal of obsession with greed and making a lot of money, and some of the values that I grew up with had been pushed aside. So I decided to scream out in the middle of the night and make a statement about that. And They Live is partially a political statement. It’s partially a tract on the world that we live in today. And as a matter of fact, right now it’s even more true than it was then.
Here’s science writer Carrie Arnold, in a newly published article at Aeon titled “Watchers of the Earth,” discussing the possibility that indigenous myths may carry warning signals for natural disasters:
Shortly before 8am on 26 December 2004, the cicadas fell silent and the ground shook in dismay. The Moken, an isolated tribe on the Andaman Islands in the Indian Ocean, knew that the Laboon, the ‘wave that eats people’, had stirred from his ocean lair. The Moken also knew what was next: a towering wall of water washing over their island, cleansing it of all that was evil and impure. To heed the Laboon’s warning signs, elders told their children, run to high ground.
The tiny Andaman and Nicobar Islands were directly in the path of the tsunami generated by the magnitude 9.1 earthquake off the coast of Sumatra. Final totals put the islands’ death toll at 1,879, with another 5,600 people missing. When relief workers finally came ashore, however, they realised that the death toll was skewed. The islanders who had heard the stories about the Laboon or similar mythological figures survived the tsunami essentially unscathed. Most of the casualties occurred in the southern Nicobar Islands. Part of the reason was the area’s geography, which generated a higher wave. But also at the root was the lack of a legacy; many residents in the city of Port Blair were outsiders, leaving them with no indigenous tsunami warning system to guide them to higher ground.
Humanity has always courted disaster. We have lived, died and even thrived alongside vengeful volcanoes and merciless waves. Some disasters arrive without warning, leaving survival to luck. Often, however, there is a small window of time giving people a chance to escape. Learning how to crack open this window can be difficult when a given catastrophe strikes once every few generations. So humans passed down stories through the ages that helped cultures to cope when disaster inevitably struck. These stories were fodder for anthropologists and social scientists, but in the past decade, geologists have begun to pay more attention to how indigenous peoples understood, and prepared for, disaster. These stories, which couched myth in metaphor, could ultimately help scientists prepare for cataclysms to come.
Reading this triggered a flood of associated thoughts this morning, mostly related to things I’ve read elsewhere that resonate with it. Although the basic focus is different, for me this article somewhat recalls a starkly apocalyptic and millenarian passage from the ending of Benjamin Hoff’s The Te of Piglet (1992), a book that many readers found off-putting for its semi-grimness, which represented a departure from the more charmingly whimsical presentation of Taoism that Hoff had adopted in its predecessor, The Tao of Pooh.
The next time somebody tries to recommend a TED talk to me, I may recommend this piece, or else the book it’s excerpted from, Daniel Drezner’s The Ideas Industry: How Pessimists, Partisans, and Plutocrats are Transforming the Marketplace of Ideas. It’s not that there aren’t any worthwhile TED talks, of course. But Drezner’s words hit home in this era of “thought leaders.”
When I refer to “public intellectuals,” I mean experts who are versed and trained enough to be able to comment on a wide range of public policy issues. The public intellectual serves a vital purpose in democratic discourse: exposing shibboleths masquerading as accepted wisdom….
How is a thought leader distinct from a public intellectual? A thought leader is an intellectual evangelist. Thought leaders develop their own singular lens to explain the world, and then proselytize that worldview to anyone within earshot….
Public intellectuals know enough about many things to be able to point out intellectual charlatans. Thought leaders know one big thing and believe that their important idea will change the world.
What is happening is that the marketplace of ideas has turned into the Ideas Industry. The twenty-first century public sphere is bigger, louder, and more lucrative than ever before. A surge of high-level panels, conference circuits, and speaker confabs allows intellectuals to mix with other members of the political, economic, and cultural elite in a way that would have been inconceivable a half century ago….
As America’s elite has gotten richer and richer, they can afford to do anything they want. A century ago, America’s plutocrats converted their wealth into university endowments, think tanks, or philanthropic foundations. Today’s wealthy set up their own intellectual salons and publishing platforms—and they are not hands-off about the intellectual output of their namesakes.
FULL ARTICLE: The Decline of Public Intellectuals
Interesting: Last month The Chronicle of Higher Education published an article by Tom Bartlett, their senior science editor, titled “Spoiled Science.” It’s about the way Cornell University’s renowned Food and Brand Lab has taken a credibility hit in the wake of revelations about multiple statistical anomalies that have been discovered in papers co-authored by its director, Brian Wansink, who is also a celebrity scholar due to appearances on the likes of 60 Minutes and Rachael Ray. The heart of the article’s import is laid out in this paragraph:
The slow-motion credibility crisis in social science has taken the shine off a slew of once-brilliant reputations and thrown years of research into doubt. It’s also led to an undercurrent of anxiety among scientists who fear that their labs and their publication records might come under attack from a feisty cadre of freelance critics. The specifics of these skirmishes can seem technical at times, with talk of p-values and sample sizes, but they go straight to the heart of how new knowledge is created and disseminated, and whether some of what we call science really deserves that label.
In the middle of the piece, Bartlett suddenly mentions Daryl Bem’s famous precognition research from a few years ago, and subjects it to a brief but withering moment of scorn:
This isn’t the first time Cornell has had to cope with a blow to its research reputation. In 2011, Daryl Bem, an emeritus professor of psychology, published a paper in which he showed, or seemed to show, that subjects could anticipate pornographic images before they appeared on a computer screen. If true, Bem’s finding would upend what we understand about the nature of time and causation. It would be a big deal. That paper, “Feeling the Future,” was widely ridiculed and failed to replicate, though Bem himself has stood by his results.
Yesterday Bem responded with a letter to the Chronicle titled “In Defense of Research on Precognition,” in which he sets the record straight. He begins by pointing out that his paper was published in the Journal of Personality and Social Psychology, which has a rejection rate of 80 percent, and that the paper was approved by four referees and two editors before publication.
He then points out that Bartlett’s claim about the experiment’s failure to replicate is patently false: “In 2015, three colleagues and I published a follow-up meta-analysis of 90 such experiments conducted by 33 laboratories in 14 countries. The results strongly support my original findings. In particular, the independent replications are robust and highly significant statistically.”
Finally, he shares this salient fact:
Bartlett further asserts that this research was widely ridiculed and constituted a blow to Cornell’s research reputation. But it was Cornell’s own public-affairs office that was proactively instrumental in setting up interviews with the press and other media following the publication of the original article. New Scientist, Discover, Wired, New York Magazine, and Cornell’s own in-house publications all described the research findings seriously and without ridicule.
Me, I’m just fascinated to see mentions of such matters cropping up repeatedly in a place like The Chronicle of Higher Education, whose publication of essays by Jeffrey Kripal on the paranormal I discussed at some length a few years ago. (And of course I’d be lying if I denied that I simply enjoyed reading Bem’s refutation of Bartlett’s belittling.)
It was around 2010 that I first became aware of Jerry Martin’s book in progress titled God: An Autobiography as Told to a Philosopher. I was deep into blogging at the (now-defunct) Demon Muse site at the time, and I was developing A Course in Demonic Creativity from those materials. So thoughts about the experience of perceived communication from an external psychological or spiritual source were very much on my mind. And when I began reading excerpts and even entire chapters from the God book at its website, I was transfixed. The official description that accompanies the eventually published full version of the book indicates why:
The voice announced, “I am God.” For Jerry Martin, that encounter began a personal, intellectual, and spiritual adventure. He had not believed in God. He was a philosopher, trained to be skeptical — to doubt everything. So his first question was: Is this really God talking? There were other urgent questions: What will my wife think? Why would God want to talk to me? Does God want me to do something? He began asking all the questions about life and death and ultimate things to which he — and all of us — have sought answers: Love and loss. Happiness and suffering. Good and evil. Death and the afterlife. The world’s religions. The ways God communicates with us. How to live in harmony with God. God: An Autobiography tells the story of these mind-opening conversations with God.
Jerry L. Martin was raised in a Christian home. By the time he left college, he was not a believer. But he was interested in the big questions and so he studied the great thinkers. He became a philosophy professor and served as head of the philosophy department at the University of Colorado at Boulder and of the National Endowment for the Humanities. In addition to scholarly articles on epistemology, the philosophy of mind, and public policy, he wrote reports on education that received national attention and was invited to testify before Congress. He stepped down from that career to write this book.
So you can understand my interest. I got involved in some of the online communications with Jerry that appreciative readers were conducting through the book site, and it swiftly became evident that he and I shared a similar set of concerns, although my own experiences have imparted a decidedly darker cast to my thoughts and writings about the perception of divine and daemonic communication. Jerry and I also conversed through Facebook (to which I only recently returned, with a new account, after a multi-year hiatus), and I found him to be a very kind and generous-spirited correspondent. When the full, final edition of God: An Autobiography was published last year by Calladium, I was pleased to see it make slow but sure and steady headway as readers began to catch on to its import. Kirkus Reviews weighed in with a sparklingly positive (and nicely informative) response in which they called the book “a captivating religious dialogue for the modern age.” A writer for Reading Religion, the book review publication of the American Academy of Religion, praised God: An Autobiography as “the most path-breaking material for future philosophical and theological reflection I have come across in a long time.”
I concur with both assessments. Today I finally got around to writing my own brief review of the book. I posted it at Amazon a little while ago (making it only the second Amazon review that I have ever written; the first was for The Secret of Ventriloquism by Jon Padgett). Now I’m sharing it with Teeming Brain readers, many of whom I suspect will find it, and the book itself, of interest.
As I have mentioned in the past, my good friend Jon Padgett’s debut horror fiction collection The Secret of Ventriloquism, featuring an introduction by me, is a very special piece of work. It has been gratifying to see how events in the several months since I last talked about it have borne this out. Rue Morgue Magazine selected it as the Best Fiction Book of 2016. Michael Calia praised it in The Wall Street Journal. It has gained additional reviews and enthusiastic endorsements from the likes of Paul Tremblay, who describes it as “a horror revelation,” and Weird Fiction Review, where reviewer Adams Mills asserts that it is “a collection that begs to be read as a whole, and then also to be revisited past the first reading.”
If this whets your appetite, be advised that right now, for a limited time, the Kindle edition can be downloaded for free. (After reading it, I think you might also find that you want to buy a physical copy.) [UPDATE 3/26/17: Alas, this offer has now expired.]
To whet your appetite even further, here’s the complete text of my introduction.
* * *
The Secret of Ventriloquism by Jon Padgett
S. T. Joshi has famously argued that the truly great authors of weird fiction have been great precisely because they use their stories as a vehicle for expressing a coherent worldview. I would here like to advance an alternative thesis. I would like to assert that one of the characteristics of great weird fiction, and most especially weird horror—not the sole characteristic, of course, since weird horror is a multifaceted jewel, but a characteristic that is crucial and irreducible in those works of the weird that lodge in the reader’s mind with unforgettable force and intensity—is a vivid and distinct authorial voice.
Can you imagine Poe’s “The Fall of the House of Usher” without the sonorous narrative voice that speaks from the very first page in tones of absolute gloom and abject dread? Can you imagine Lovecraft’s “The Music of Erich Zann” minus its voice of detached, dreamlike trepidation tinged with cosmic horror, as generated by the author’s distinctive deployment of diction and artistry of prose style? Or Shirley Jackson’s The Haunting of Hill House without the striking establishment of voice in the classic opening paragraph (“No live organism can continue for long to exist sanely under conditions of absolute reality; even larks and katydids are supposed, by some, to dream…”), which then develops over the course of the novel into a sustained tone of mingled dread, loneliness, and melancholy? Or what about Ligotti’s “The Last Feast of Harlequin” without its measured tone of fearful discovery foregrounded against an emotional backdrop of desolate inner wintriness, as delivered in the narrative voice of an unnamed social anthropologist investigating a strange clown festival in an American Midwestern town? Each of these stories would be not just diminished but fundamentally altered—neutered, hamstrung, eviscerated—by the removal of its distinctive voice, which, vitally, is not just the narrative voice of the individual story but the voice of the author expressing itself through the environment of that particular work.
After several months of deliberation and development, I have just launched a brand new version of my author site, www.mattcardin.com. The layout and structure are completely new, with easy navigation, a modern look, and an overall sleeker design. Have a look and let me know what you think.