‘Videodrome’ and Marshall McLuhan: The New Flesh meets the New Media

Here’s the ever-reliable Nick Ripatrazone discussing the inspirational influence of Marshall McLuhan on David Cronenberg as the latter was conceiving and making 1983’s Videodrome, which Ripatrazone characterizes — correctly, I think — as “perfect viewing for 2017 — the year a man baptized by television becomes president.” The article also provides an able introduction to McLuhan’s legacy, reputation, and influence.

In his audio commentary for the film, Cronenberg admits that the professor [Brian O’Blivion, who helped create the sinister Videodrome broadcast and runs its attendant “mission” for homeless people, the Cathode Ray Mission] was inspired by the “communications guru” Marshall McLuhan. McLuhan taught at the University of Toronto while Cronenberg attended, but to his “everlasting regret,” he never took a course with the media icon. Cronenberg said that McLuhan’s “influence was felt everywhere at the university” — a mystical-tinged description that McLuhan would have appreciated. . . .

McLuhan was a scholar of James Joyce, a purveyor of print. He documented the advent of the electric eye, but he didn’t desire it. Although he had “nothing but distaste for the process of change,” he said you had to “keep cool during our descent into the maelstrom.” Max can’t keep cool. He is infected by Videodrome; the show’s reality subverts its unreal medium. Max discovers that Professor O’Blivion helped create Videodrome because “he saw it as the next phase in the evolution of man as a technological animal.” Sustained viewing of Videodrome creates tumors and hallucinations. Max is being played by the remaining originators of Videodrome, whose philosophy sounds downright familiar: “North America’s getting soft, and the rest of the world is getting tough. We’re entering savage new times, and we’re going to have to be pure and direct and strong if we’re going to survive them.” Videodrome is a way to identify the derelicts by giving them what they most crave — real violence — and then incapacitate them into submission.

McLuhan’s idea that “mental breakdown is the very common result of uprooting and inundation with new information,” and his simultaneous interest in, and skepticism of, the “electric eye” finds a gory literalism in Cronenberg’s film. Videodrome is what happens when a self-described existentialist atheist channels McLuhan — but makes McLuhan’s Catholic-infused media analysis more secular and raw. Cronenberg was able to foretell our electronic evolution, the quasi-Eucharistic way we “taste and see” the Internet. The film’s gore and gush might now strike us as campy, but Videodrome shows what happens when mind and device become one. “Death is not the end,” one character says, but “the beginning of the new flesh.” We’re already there.

FULL TEXT: The Video Word Made Flesh: ‘Videodrome’ and Marshall McLuhan

WE SLEEP – John Carpenter’s ‘They Live’ as Prophecy (video essay)

Love this video essay from filmmaker (and former Buddhist Studies scholar) Daniel Clarkson Fisher. Perhaps you will, too. It’s great stuff, excellently conceived and executed. I may not agree with absolutely all of the political statements made in it, but I agree with enough of them. And anyway, it’s about Carpenter’s They Live. So what else matters?

From the included interviews:

Slavoj Žižek: They Live from 1988 is definitely one of the forgotten masterpieces of the Hollywood Left. It tells the story of John Nada — nada, of course, in Spanish, means “nothing,” a pure subject deprived of all substantial content — a homeless worker in L.A. who, drifting around, one day enters an abandoned church and finds there a strange box full of sunglasses. And when he puts one of them on, walking along the L.A. streets, he discovers something weird: that these glasses function like “critique of ideology” glasses. They allow you to see the real message beneath all the propaganda, publicity glitz, posters, and so on.

John Carpenter: I was reflecting on a lot of the values that I saw around me at the time, mainly inspired by Ronald Reagan’s conservative revolution. There was a great deal of obsession with greed and making a lot of money, and some of the values that I grew up with had been pushed aside. So I decided to scream out in the middle of the night and make a statement about that. And They Live is partially a political statement. It’s partially a tract on the world that we live in today. And as a matter of fact, right now it’s even more true than it was then.

Indigenous myths, animal ESP, and portents of apocalyptic transformation

Here’s science writer Carrie Arnold, in a newly published article at Aeon titled “Watchers of the Earth,” discussing the possibility that indigenous myths may carry warning signals for natural disasters:

Shortly before 8am on 26 December 2004, the cicadas fell silent and the ground shook in dismay. The Moken, an isolated tribe on the Andaman Islands in the Indian Ocean, knew that the Laboon, the ‘wave that eats people’, had stirred from his ocean lair. The Moken also knew what was next: a towering wall of water washing over their island, cleansing it of all that was evil and impure. To heed the Laboon’s warning signs, elders told their children, run to high ground.

The tiny Andaman and Nicobar Islands were directly in the path of the tsunami generated by the magnitude 9.1 earthquake off the coast of Sumatra. Final totals put the islands’ death toll at 1,879, with another 5,600 people missing. When relief workers finally came ashore, however, they realised that the death toll was skewed. The islanders who had heard the stories about the Laboon or similar mythological figures survived the tsunami essentially unscathed. Most of the casualties occurred in the southern Nicobar Islands. Part of the reason was the area’s geography, which generated a higher wave. But also at the root was the lack of a legacy; many residents in the city of Port Blair were outsiders, leaving them with no indigenous tsunami warning system to guide them to higher ground.

Humanity has always courted disaster. We have lived, died and even thrived alongside vengeful volcanoes and merciless waves. Some disasters arrive without warning, leaving survival to luck. Often, however, there is a small window of time giving people a chance to escape. Learning how to crack open this window can be difficult when a given catastrophe strikes once every few generations. So humans passed down stories through the ages that helped cultures to cope when disaster inevitably struck. These stories were fodder for anthropologists and social scientists, but in the past decade, geologists have begun to pay more attention to how indigenous peoples understood, and prepared for, disaster. These stories, which couched myth in metaphor, could ultimately help scientists prepare for cataclysms to come.

Reading this triggered a flood of associated thoughts this morning, mostly related to things I’ve read elsewhere that resonate with it. Although the basic focus is different, for me this article somewhat recalls a starkly apocalyptic and millenarian passage from the ending of Benjamin Hoff’s The Te of Piglet (1992), a book that many readers found off-putting for its semi-grimness, which represented a departure from the more charmingly whimsical presentation of Taoism that Hoff had adopted in its predecessor, The Tao of Pooh.

How thought leaders displaced public intellectuals

The next time somebody tries to recommend a TED talk to me, I may recommend this piece, or else the book it’s excerpted from, Daniel Drezner’s The Ideas Industry: How Pessimists, Partisans, and Plutocrats are Transforming the Marketplace of Ideas. It’s not that there aren’t any worthwhile TED talks, of course. But Drezner’s words hit home in this era of “thought leaders.”

When I refer to “public intellectuals,” I mean experts who are versed and trained enough to be able to comment on a wide range of public policy issues. The public intellectual serves a vital purpose in democratic discourse: exposing shibboleths masquerading as accepted wisdom….

How is a thought leader distinct from a public intellectual? A thought leader is an intellectual evangelist. Thought leaders develop their own singular lens to explain the world, and then proselytize that worldview to anyone within earshot….

Public intellectuals know enough about many things to be able to point out intellectual charlatans. Thought leaders know one big thing and believe that their important idea will change the world.

What is happening is that the marketplace of ideas has turned into the Ideas Industry. The twenty-first century public sphere is bigger, louder, and more lucrative than ever before. A surge of high-level panels, conference circuits, and speaker confabs allows intellectuals to mix with other members of the political, economic, and cultural elite in a way that would have been inconceivable a half century ago….

As America’s elite has gotten richer and richer, they can afford to do anything they want. A century ago, America’s plutocrats converted their wealth into university endowments, think tanks, or philanthropic foundations. Today’s wealthy set up their own intellectual salons and publishing platforms—and they are not hands-off about the intellectual output of their namesakes.

Defending precognition research in ‘The Chronicle of Higher Education’

Interesting: Last month The Chronicle of Higher Education published an article by Tom Bartlett, their senior science editor, titled “Spoiled Science.” It’s about the way Cornell University’s renowned Food and Brand Lab has taken a credibility hit in the wake of revelations about multiple statistical anomalies discovered in papers co-authored by its director, Brian Wansink, who is also a celebrity scholar due to appearances on the likes of 60 Minutes and Rachael Ray. The heart of the article’s import is laid out in this paragraph:

The slow-motion credibility crisis in social science has taken the shine off a slew of once-brilliant reputations and thrown years of research into doubt. It’s also led to an undercurrent of anxiety among scientists who fear that their labs and their publication records might come under attack from a feisty cadre of freelance critics. The specifics of these skirmishes can seem technical at times, with talk of p-values and sample sizes, but they go straight to the heart of how new knowledge is created and disseminated, and whether some of what we call science really deserves that label.

In the middle of the piece, Bartlett suddenly mentions Daryl Bem’s famous precognition research from a few years ago, and subjects it to a brief but withering moment of scorn:

This isn’t the first time Cornell has had to cope with a blow to its research reputation. In 2011, Daryl Bem, an emeritus professor of psychology, published a paper in which he showed, or seemed to show, that subjects could anticipate pornographic images before they appeared on a computer screen. If true, Bem’s finding would upend what we understand about the nature of time and causation. It would be a big deal. That paper, “Feeling the Future,” was widely ridiculed and failed to replicate, though Bem himself has stood by his results.

Yesterday Bem responded with a letter to the Chronicle titled “In Defense of Research on Precognition,” in which he sets the record straight. He begins by pointing out that his paper was published in the Journal of Personality and Social Psychology, which has a rejection rate of 80 percent, and that the paper was approved by four referees and two editors before publication.

He then points out that Bartlett’s claim about the experiment’s failure to replicate is patently false: “In 2015, three colleagues and I published a follow-up meta-analysis of 90 such experiments conducted by 33 laboratories in 14 countries. The results strongly support my original findings. In particular, the independent replications are robust and highly significant statistically.”

Finally, he shares this salient fact:

Bartlett further asserts that this research was widely ridiculed and constituted a blow to Cornell’s research reputation. But it was Cornell’s own public-affairs office that was proactively instrumental in setting up interviews with the press and other media following the publication of the original article. New Scientist, Discover, Wired, New York Magazine, and Cornell’s own in-house publications all described the research findings seriously and without ridicule.

Me, I’m just fascinated to see mentions of such matters cropping up repeatedly in a place like The Chronicle of Higher Education, whose publication of essays by Jeffrey Kripal on the paranormal I discussed at some length a few years ago. (And of course I’d be lying if I denied that I simply enjoyed reading Bem’s refutation of Bartlett’s belittling.)

A fascinating and important re-visioning of God and religion: “God: An Autobiography”

It was around 2010 that I first became aware of Jerry Martin’s book in progress titled God: An Autobiography as Told to a Philosopher. I was deep into blogging at the (now-defunct) Demon Muse site at the time, and I was developing A Course in Demonic Creativity from those materials. So thoughts about the experience of perceived communication from an external psychological or spiritual source were very much on my mind. And when I began reading excerpts and even entire chapters from the God book at its website, I was transfixed. The official description that accompanies the eventually published full version of the book indicates why:

The voice announced, “I am God.” For Jerry Martin, that encounter began a personal, intellectual, and spiritual adventure. He had not believed in God. He was a philosopher, trained to be skeptical — to doubt everything. So his first question was: Is this really God talking? There were other urgent questions: What will my wife think? Why would God want to talk to me? Does God want me to do something? He began asking all the questions about life and death and ultimate things to which he — and all of us — have sought answers: Love and loss. Happiness and suffering. Good and evil. Death and the afterlife. The world’s religions. The ways God communicates with us. How to live in harmony with God. God: An Autobiography tells the story of these mind-opening conversations with God.

Jerry L. Martin was raised in a Christian home. By the time he left college, he was not a believer. But he was interested in the big questions and so he studied the great thinkers. He became a philosophy professor and served as head of the philosophy department at the University of Colorado at Boulder and of the National Endowment for the Humanities. In addition to scholarly articles on epistemology, the philosophy of mind, and public policy, he wrote reports on education that received national attention and was invited to testify before Congress. He stepped down from that career to write this book.

So you can understand my interest. I got involved in some of the online communications with Jerry that appreciative readers were conducting through the book site, and it swiftly became evident that he and I shared a similar set of concerns, although my own experiences have imparted a decidedly darker cast to my thoughts and writings about the perception of divine and daemonic communication. Jerry and I also conversed through Facebook (to which I only recently returned, with a new account, after a multi-year hiatus), and I found him to be a very kind and generous-spirited correspondent. When the full, final edition of God: An Autobiography was published last year by Caladium, I was pleased to see it make slow but steady headway as readers began to catch on to its import. Kirkus Reviews weighed in with a sparklingly positive (and nicely informative) response in which they called the book “a captivating religious dialogue for the modern age.” A writer for Reading Religion, the book review publication of the American Academy of Religion, praised God: An Autobiography as “the most path-breaking material for future philosophical and theological reflection I have come across in a long time.”

I concur with both assessments. Today I finally got around to writing my own brief review of the book. I posted it at Amazon a little while ago (making it only the second Amazon review that I have ever written; the first was for The Secret of Ventriloquism by Jon Padgett). Now I’m sharing it with Teeming Brain readers, many of whom I suspect will find it, and the book itself, of interest.

My introduction to Jon Padgett’s ‘The Secret of Ventriloquism’

As I have mentioned in the past, my good friend Jon Padgett’s debut horror fiction collection The Secret of Ventriloquism, featuring an introduction by me, is a very special piece of work. It has been gratifying to see how events in the several months since I last talked about it have borne this out. Rue Morgue Magazine selected it as the Best Fiction Book of 2016. Michael Calia praised it in The Wall Street Journal. It has gained additional reviews and enthusiastic endorsements from the likes of Paul Tremblay, who describes it as “a horror revelation,” and Weird Fiction Review, where reviewer Adam Mills asserts that it is “a collection that begs to be read as a whole, and then also to be revisited past the first reading.”

If this whets your appetite, be advised that right now, for a limited time, the Kindle edition can be downloaded for free. (After reading it, I think you might also find that you want to buy a physical copy.) [UPDATE 3/26/17: Alas, this offer has now expired.]

To whet your appetite even further, here’s the complete text of my introduction.

* * *

Introduction to

The Secret of Ventriloquism by Jon Padgett

S. T. Joshi has famously argued that the truly great authors of weird fiction have been great precisely because they use their stories as a vehicle for expressing a coherent worldview. I would here like to advance an alternative thesis. I would like to assert that one of the characteristics of great weird fiction, and most especially weird horror—not the sole characteristic, of course, since weird horror is a multifaceted jewel, but a characteristic that is crucial and irreducible in those works of the weird that lodge in the reader’s mind with unforgettable force and intensity—is a vivid and distinct authorial voice.

Can you imagine Poe’s “The Fall of the House of Usher” without the sonorous narrative voice that speaks from the very first page in tones of absolute gloom and abject dread? Can you imagine Lovecraft’s “The Music of Erich Zann” minus its voice of detached, dreamlike trepidation tinged with cosmic horror, as generated by the author’s distinctive deployment of diction and artistry of prose style? Or Shirley Jackson’s The Haunting of Hill House without the striking establishment of voice in the classic opening paragraph (“No live organism can continue for long to exist sanely under conditions of absolute reality; even larks and katydids are supposed, by some, to dream…”), which then develops over the course of the novel into a sustained tone of mingled dread, loneliness, and melancholy? Or what about Ligotti’s “The Last Feast of Harlequin” without its measured tone of fearful discovery foregrounded against an emotional backdrop of desolate inner wintriness, as delivered in the narrative voice of an unnamed social anthropologist investigating a strange clown festival in an American Midwestern town? Each of these stories would be not just diminished but fundamentally altered—neutered, hamstrung, eviscerated—by the removal of its distinctive voice, which, vitally, is not just the narrative voice of the individual story but the voice of the author expressing itself through the environment of that particular work.

Revised and Relaunched: MattCardin.com

 

After several months of deliberation and development, I have just launched a brand new version of my author site, www.mattcardin.com. The layout and structure are completely new, with easy navigation, a modern look, and an overall sleeker design. Have a look and let me know what you think.

 

Big Data, Artificial Intelligence, and Dehumanization: Surrendering to the Death of Democracy

 

Greetings, Teeming Brainers. I’m just peeking in from the digital wings, amid much ongoing blog silence, to observe that many of the issues and developments — sociocultural, technological, and more — that I began furiously tracking here way back in 2006 are continuing to head in pretty much the same direction. A case in point is provided by the alarming information, presented in a frankly alarmed tone, that appears in this new piece from Scientific American (originally published in SA’s German-language sister publication, Spektrum der Wissenschaft):

Everything started quite harmlessly. Search engines and recommendation platforms began to offer us personalised suggestions for products and services. This information is based on personal and meta-data that has been gathered from previous searches, purchases and mobility behaviour, as well as social interactions. While officially, the identity of the user is protected, it can, in practice, be inferred quite easily. Today, algorithms know pretty well what we do, what we think and how we feel — possibly even better than our friends and family or even ourselves. Often the recommendations we are offered fit so well that the resulting decisions feel as if they were our own, even though they are actually not our decisions. In fact, we are being remotely controlled ever more successfully in this manner. The more is known about us, the less likely our choices are to be free and not predetermined by others.

But it won’t stop there. Some software platforms are moving towards “persuasive computing.” In the future, using sophisticated manipulation technologies, these platforms will be able to steer us through entire courses of action, be it for the execution of complex work processes or to generate free content for Internet platforms, from which corporations earn billions. The trend goes from programming computers to programming people. . . .

[I]t can be said that we are now at a crossroads. Big data, artificial intelligence, cybernetics and behavioral economics are shaping our society — for better or worse. If such widespread technologies are not compatible with our society’s core values, sooner or later they will cause extensive damage. They could lead to an automated society with totalitarian features. In the worst case, a centralized artificial intelligence would control what we know, what we think and how we act. We are at the historic moment, where we have to decide on the right path — a path that allows us all to benefit from the digital revolution.

Oh, and for a concrete illustration of all the above, check this out:

How would behavioural and social control impact our lives? The concept of a Citizen Score, which is now being implemented in China, gives an idea. There, all citizens are rated on a one-dimensional ranking scale. Everything they do gives plus or minus points. This is not only aimed at mass surveillance. The score depends on an individual’s clicks on the Internet and their politically-correct conduct or not, and it determines their credit terms, their access to certain jobs, and travel visas. Therefore, the Citizen Score is about behavioural and social control. Even the behaviour of friends and acquaintances affects this score, i.e. the principle of clan liability is also applied: everyone becomes both a guardian of virtue and a kind of snooping informant, at the same time; unorthodox thinkers are isolated. Were similar principles to spread in democratic countries, it would be ultimately irrelevant whether it was the state or influential companies that set the rules. In both cases, the pillars of democracy would be directly threatened.

FULL ARTICLE: Will Democracy Survive Big Data and Artificial Intelligence?

Of course, none of this is real news to anybody who has been paying attention. It’s just something that people like me, and maybe like you, find troubling enough to highlight and comment on. And maybe, in the end, Cipher from The Matrix will turn out to have been right: Maybe ignorance really is bliss. Because from where I’m sitting, there doesn’t appear to be anything one can do to stop this steamrollering, metastasizing, runaway train-like dystopian trend. Talking about it is just that: talk. Which is one reason why I’ve lost a portion of the will that originally kept me blogging here for so many years. You can only play the role of Cassandra for so long before the intrinsic attraction begins to dissipate.

The world of tomorrow? You can have it (Hollywood post-apocalyptic supercut)