Blog Archives

The numinous, subversive power of art in an artificial age: Talking with J. F. Martel

 

Image: cover of Reclaiming Art in the Age of Artifice, by J. F. Martel

Now live: my interview with Canadian filmmaker J. F. Martel, author of the just-published — and thoroughly wonderful — Reclaiming Art in the Age of Artifice, which should be of interest to all Teeming Brainers since it comes with glowing blurb recommendations from the likes of Daniel Pinchbeck, Patrick Harpur, Erik Davis, and yours truly.

Here’s a taste of J. F.’s and my conversation:

MATT CARDIN: How would you describe Reclaiming Art in the Age of Artifice to the uninitiated, to someone who comes to it cold and has no idea what it’s about?

J. F. MARTEL: The book is an attempt to defend art against the onslaught of the cultural industries, which today seek to reduce art to a mindless form of entertainment or, at best, a communication tool. In Reclaiming Art I argue that great works of art constitute an expressive response to the radical mystery of existence. They are therefore inherently strange, troubling, and impossible to reduce to a single meaning or message. Much of contemporary culture is organized in such a way as to push this kind of art to the margins while celebrating works that reaffirm prevailing ideologies. In contrast, real works of art are machines for destroying ideologies, first and foremost the ideologies in which they were created.

MC: What exactly do you mean? How do real works of art serve this subversive function?

JFM: A great art work, be it a movie, a novel, or a dance piece, presents the entire world aesthetically — meaning, as a play of forces that have no inherent moral value. Even the personal convictions of the author, however implicit they may be in the work itself, are given over to the aesthetic. By becoming part of an aesthetic universe, they relinquish the claims to truth that they may hold in the author’s mind in the everyday. This, I think, is how a Christian author like Dostoyevsky can write such agnostic novels, and how an atheistic author like Thomas Ligotti can create fictional worlds imbued with a sense of the sacred, however dark or malignant. Nietzsche said that the world can only be justified aesthetically, that is, beyond the good-and-evil binary trap of ideological thinking. The reason for this is that when we tune in to the aesthetic frequency, we see that the forces that make up the world exceed our “human, all too human” conceptualizations.

FULL INTERVIEW: “Reclaiming Art in the Age of Artifice”

Called to academe: The university’s monastic ideal in a neoliberal age


Here’s media studies scholar Siva Vaidhyanathan making the case for recognizing the reality of an academic/scholarly calling — in the authentic religious vocational sense — in the midst of a neoliberal age obsessed with the economic and political concerns of the so-called “real world”:

In the United States, and increasingly in the world at large, we tend to reduce the conversation about the value, role, and scope of the scholarly life to how it serves short-term and personal interests like career preparation or job training. Sometimes we discuss higher education as an economic boon, attracting industry to a particular location or employing thousands in a remote town. Or we probe it as an engine of research and innovation. And sometimes we use academia as a tableau for satire or social criticism when we expose the excesses of the lazy and self-indulgent professoriat or giggle at the paper titles at the annual meeting of the Modern Language Association.

But none of these appraisals of the life of the mind gets at the real heart of the matter: the now quaint-sounding matter of the university’s “mission” — the bigger-picture question of what our institutions of higher learning do for and with the world.

. . . Within every great American university, even MIT, there is a monastery. It’s at its core. Sometimes the campus walls and spires make that ancestry undeniable. More often, the stadiums, sweatshirt stores, laboratories, fraternity houses, and career-placement offices mask the monastery. But it’s still there. European universities emerged from the network of monasteries that had accumulated, preserved, copied, and catalogued texts and scrolls over centuries. The transformation from cloistered monastery to slightly less cloistered university occurred in fits and starts over three centuries. But by the eighteenth century, universities throughout Europe were able to converse about this new thing called science and reflect on the meaning and utility of ancient texts that bore new meaning at the dawn of an industrial age.

Early American colleges and universities were likewise religious institutions built to train clergy to serve a sinful people. Soon they took on an additional role: exposing idle sons of the landed gentry such as James Madison and Thomas Jefferson to dangerous books coming over from Europe.

. . . [But today] When we scholars explain our passions — the deep satisfaction we feel when we help a nineteen-year-old make a connection between the Mahabharata and The Iliad, or when our research challenges the surprising results of some medical experiment that the year before generated unwarranted headlines — many of our listeners roll their eyes like my fellow students did back in that classroom in 1995. How embarrassing that people find deep value in such uncountable things.

It’s been a couple of decades since any American faculty member could engage in the deep pursuit of knowledge untethered from the clock or calendar. But many of us still write for the guild and the guild only, satisfied that someday someone might find the work a valuable part of a body of knowledge. But if that never happens, so be it — it’s all part of the calling’s steep price of admission.

MORE: “Mind Games: Making the Case for an Academic Calling in a Neoliberal Age”

 

Image: “Medieval writing desk” [Public Domain], via Wikimedia Commons

Recommended Reading 36

This week: How entire U.S. towns now rely on food stamps. The regrets of the Iraqi “sledgehammer man,” whose image became famous in Western media when Saddam’s statue fell. The Obama administration’s epic (and hypocritical) focus on secrecy. The demise of Google Reader and what it portends for Net-i-fied life and culture. The sinister rise of an all-pervasive — and unblinkingly embraced — Orwellian Big Brotherism in the age of Big Data, with a focus on Facebook’s “Like” button, Google Glass, and Google’s vision of “a future of frictionless, continuous shopping.” A surge of ghost sightings and spiritual troubles among survivors of Japan’s earthquake and tsunami. The rise of the “Little Free Libraries” movement in America and abroad. Read the rest of this entry

My Own Personal Tesseract: Reflections on ‘A Wrinkle in Time’


 

Although my work as an author has been overwhelmingly centered in realms of darkness and horror, as cross-fertilized by my deep and personal focus on matters of religion, philosophy, and psychology, I have also been a lifelong lover of fantasy and science fiction. So perhaps it’s not surprising that one of the foundational books in my life has been A Wrinkle in Time, which wraps up all of these genres, themes, and concerns inside a story, a writing style, and a sensibility that together epitomize the word “wonderful.” Interestingly, over the past decade-plus of my involvement in professional writing and publishing, I’ve found that many other authors who likewise work in the field labeled “horror” count Wrinkle as one of their most cherished books.

Yesterday I caught wind of the fact that a graphic novel adaptation has just been released, and I did a bit of looking into it, reading several plot summaries and celebrations of the original novel along the way. And, appropriately enough, it all sent my thoughts and emotions soaring backward and forward through time. Read the rest of this entry

Books, solitude, and finding your own reality amid a cultural cacophony

From a lecture titled “Solitude and Leadership,” which William Deresiewicz delivered to the plebe class at the United States Military Academy at West Point in October 2009:

Thinking for yourself means finding yourself, finding your own reality. Here’s the other problem with Facebook and Twitter and even The New York Times. When you expose yourself to those things, especially in the constant way that people do now — older people as well as younger people — you are continuously bombarding yourself with a stream of other people’s thoughts. You are marinating yourself in the conventional wisdom. In other people’s reality: for others, not for yourself. You are creating a cacophony in which it is impossible to hear your own voice, whether it’s yourself you’re thinking about or anything else. That’s what Emerson meant when he said that “he who should inspire and lead his race must be defended from travelling with the souls of other men, from living, breathing, reading, and writing in the daily, time-worn yoke of their opinions.” Notice that he uses the word lead. Leadership means finding a new direction, not simply putting yourself at the front of the herd that’s heading toward the cliff.

So why is reading books any better than reading tweets or wall posts? Well, sometimes it isn’t. Sometimes, you need to put down your book, if only to think about what you’re reading, what you think about what you’re reading. But a book has two advantages over a tweet. First, the person who wrote it thought about it a lot more carefully. The book is the result of his solitude, his attempt to think for himself.

Second, most books are old. This is not a disadvantage: this is precisely what makes them valuable. They stand against the conventional wisdom of today simply because they’re not from today. Even if they merely reflect the conventional wisdom of their own day, they say something different from what you hear all the time. But the great books, the ones you find on a syllabus, the ones people have continued to read, don’t reflect the conventional wisdom of their day. They say things that have the permanent power to disrupt our habits of thought. They were revolutionary in their own time, and they are still revolutionary today.

— William Deresiewicz, “Solitude and Leadership,” The American Scholar, Spring 2010

Image: “Man Reading at Lamplight” by Georg Friedrich Kersting, 1814 [Public domain], via Wikimedia Commons

Our “cognitive surplus” is temporary, just like the fossil fuels that power it

In his 2010 book Cognitive Surplus, released in hardcover with the subtitle “Creativity and Generosity in a Connected Age” and in paperback with the subtitle “How Technology Makes Consumers into Collaborators,” Clay Shirky expanded his reputation as everybody’s favorite digital guru by arguing that “new digital technology” — primarily of the social media sort — “is unleashing a torrent of creative production that will transform our world. For the first time, people are embracing new media that allow them to pool their efforts at vanishingly low cost. The results of this aggregated effort range from mind-expanding reference tools like Wikipedia to life-saving Web sites like Ushahidi.com, which allows Kenyans to report acts of violence in real time. [The book] explores what’s possible when people unite to use their intellect, energy, and time for the greater good.”

Here he is expounding the idea in a popular TED talk:

Shirky can be criticized for undue optimism, since his view of how people tend to use the time and mental energy that technology frees up is quite likely overly rosy. But the fact that such a freeing-up has happened is incontrovertible. And now comes a paper, written by two experts in digital communications and published in one of the longest-running online journals devoted to the Internet itself, arguing that the cognitive surplus is a side effect of our massive exploitation of fossil fuels, and that its fate and future will therefore parallel the arc of fossil fuel-based civilization, which is, in the wide scope of things, a fleeting phase in human history, since “fossil fuels are not forever.” Read the rest of this entry

Recommended Reading 31

This week’s recommended reading includes: a warning about and meditation upon the possible dire consequences of the human species’ spectacular success in dominating the planetary petri dish; a profile of a literary journal devoted to injecting ancient wisdom into the wasteland of the modern cyber-soul; a beautiful explanation and defense of literature’s inherent resistance to being “understood” by algorithmic data analysis; information and opinions on Mind and Cosmos, the new book in which philosopher Thomas Nagel argues for the inadequacy of the standard materialist version of science; a warning and lament about the artistically decrepit state of American cinema; a long 1979 article, written in the immediate aftermath of the original Star Wars movie, that examines both the movie’s seismic cultural impact and its origin in the mind and machinations of George Lucas; notes on a recent lecture given by psychological anthropologist Tanya Luhrmann, famed for her research into the phenomenon of “hearing voices” and its psychological and cultural meanings; and a fascinating New York Times piece about a Greek island where people tend to live longer and healthier lives than anywhere else on the planet. Read the rest of this entry

On living well in Ray Bradbury’s dystopia: Notes toward a monastic response

Morris Berman may not have been the first person to offer simultaneous commentary on American culture and Fahrenheit 451 by observing that the former has basically transformed itself into the dystopian society depicted by the latter. Many people have noted in the decades since Fahrenheit was first published in 1953 that things have been moving eerily and strikingly in the direction Bradbury foresaw. (Or rather, the direction he tried to forestall: “I wasn’t trying to predict the future,” he famously said in a 2003 interview. “I was trying to prevent it.”) But it was Berman who most forcefully affected me with this line of thought when he laid it out in The Twilight of American Culture:

In 1953, Ray Bradbury published Fahrenheit 451 — later made into a movie by Francois Truffaut — which depicts a future society in which intelligence has largely collapsed and the reading of books is forbidden by law. People sit around interacting with screens (referred to as “the family”) and taking tranquilizers. Today, nearly five decades later, isn’t this largely the point at which we have arrived? Do not the data [on the collapse of American intelligence] suggest that most of our neighbors are, in fact, the mindless automatons depicted in Truffaut’s film? True, the story does contain a class of “book people” who hide in the forest and memorize the classics, to pass on to future generations — and this vignette does, in fact, provide a clue as to what just might enable our civilization to eventually recover — but the majority of citizens on the eve of the twenty-first century watch an average of four hours of TV a day, pop Prozac and its derivatives like candy, and perhaps read a Danielle Steel novel once a year

. . . [T]he society depicted in Fahrenheit 451 has banned books and immerses itself instead in video entertainment, a kind of “electronic Zen,” in which history has been forgotten and only the present moment counts . . . [The novel] is extraordinarily prescient. Leaving aside the issue of direct censorship of books — rendered unnecessary by McWorld, as it turns out, because most people don’t read anymore — most of the features of this futuristic society are virtually upon us, or perhaps no more than twenty years away. [1]

Read the rest of this entry

The Internet’s corrosive mental effects: A growing problem requiring a deliberate defensive response

For those of you who, like me, have been interested to hear the background drumbeat of warnings about the mental and neurological effects of the Internet revolution over the past several years — think Nicholas Carr’s “Is Google Making Us Stupid?” and The Shallows, just for starters — a recent, in-depth article about this very subject from Newsweek will make for compelling reading. It’s not exactly a pleasant read, though, because the conclusion it draws from mountains of evidence is deeply disturbing.

Here’s the gist:

Teaser: Tweets, texts, emails, posts. New research says the Internet can make us lonely and depressed — and may even create more extreme forms of mental illness, Tony Dokoupil reports.

Questions about the Internet’s deleterious effects on the mind are at least as old as hyperlinks. But even among Web skeptics, the idea that a new technology might influence how we think and feel — let alone contribute to a great American crack-up — was considered silly and naive, like waving a cane at electric light or blaming the television for kids these days. Instead, the Internet was seen as just another medium, a delivery system, not a diabolical machine. It made people happier and more productive. And where was the proof otherwise?

Read the rest of this entry