Category Archives: Education

Literature makes you weird. Its gift is the uncanny.

In a previous Teeming Brain post (one that has received a steady inflow of visitors ever since I first published it in 2009), I talked about the magical/alchemical power of language in general and poetic language in particular:

[T]here’s a positively magical power in language, particularly in the poetic use of it, since language enables each of us to recreate his or her private thoughts and emotions in somebody else’s headspace and heartspace … It’s a veritably alchemical moment, since the poet acts as a linguistic alchemist who uses language to transmute the reader’s inner state into something else.

— “The Evolution of Consciousness and the Alchemy of Language,” July 2, 2009

In an essay published just today in The Chronicle Review from The Chronicle of Higher Education, Eric G. Wilson, professor of English at Wake Forest University, provides a beautifully realized illustration of and meditation on this same idea:

I…tell my disgruntled students about the first time I read, as an undergraduate, these lines:

There’s a certain Slant of light,
Winter Afternoons—
That oppresses, like the Heft
Of Cathedral Tunes—

I had often witnessed beams of dull December light with a melancholy I didn’t understand. Dickinson’s flash clarified my feelings: In the impoverished glow of the cold time were heavy reminders of brightness I desired but couldn’t possess. But this affliction had fever, intimations of future heat that was luminous, like hymns.

Dickinson’s verse spelled out the abstruse, made the strange familiar. In this new intimacy, however, was a novel astonishment: The chilly light from that day onward exposed the enigmas of longing, both tormenting and radiant. Her poetry left me amazed — caught in wonderment as well as labyrinth.

Other epiphanies followed. What I had taken for granted was shattered; the marvelous erupted amid the fragments. In Whitman I saw ordinary grass morph into the “uncut hair of graves.” In Eliot’s “Prufrock,” I watched twilight transmogrify into “a patient etherized upon a table.” The grass, the evening—in these metaphors, they grew more lucid than before, and more cryptic.

This is all wrapped up inside Wilson’s overarching thesis that the purpose of literature is to “make you weird,” something he says he first realized when he spontaneously blurted it out to a father and son at a weekend college recruiting fair for high school seniors. Tasked with manning the table, he had shown up “irritable, hung over, and resentful”:

A father and son immediately appeared, in virginal Wake Forest T-shirts and blond crew cuts. They smiled at me as if I had just praised their promptness. The younger looked up at dad, and father nodded to son, and son blurted: “Sell me the English major!” Through my brain’s murk, I searched for the hype. Failing to find it, I confessed: “It makes you weird.”

After a confused “OK,” the two looked down, backed away, and were gone. They shouldn’t have been so hasty. I had revealed to them, though I didn’t know it then, the great payoff of literary study: It estranges us from our normal habits of thought and perception, nullifies old conceptual maps, and so propels us into uncharted regions, outlandish and bracing, where we must create, if we are to thrive, coordinates more capacious, more sublime than the ones we already know. The uncanny — not truth, beauty, or goodness — is literature’s boon.

— Eric G. Wilson, “Poetry Makes You Weird,” The Chronicle of Higher Education, December 10, 2012

Trust me: Go and read the whole thing. You’ll be glad you did.

Also worth noting, although unrelated (or perhaps not?), is the fact that Wilson is the author of several books touching on subjects of direct interest to Teeming Brain readers. These include 2008’s Against Happiness: In Praise of Melancholy, in which he argues that, contra America’s addiction to happy talk and positive thinking, “melancholia is necessary to any thriving culture, that it is the muse of great literature, painting, music, and innovation,” and 2012’s Everyone Loves a Good Train Wreck: Why We Can’t Look Away, in which he explores humanity’s apparently ineradicable fascination with evil, morbidity, gore, and horror by “drawing on the findings of biologists, sociologists, psychologists, anthropologists, philosophers, theologians, and artists … [A] lifelong student of the macabre, Wilson believes there’s something nourishing in darkness. ‘To repress death is to lose the feeling of life,’ he writes. ‘A closeness to death discloses our most fertile energies.'”

Image: “Blue Lagoon” by h.koppdelaney under Creative Commons

Alan Watts on choosing your life’s work: Forget the money, follow your deep desire

What do you desire? What makes you itch? What sort of a situation would you like? … [When counseling graduating students who ask for career advice,] I always ask the question, “What would you do if money were no object? How would you really enjoy spending your life?” … If you say that getting the money is the most important thing, you will spend your life completely wasting your time. You will be doing things you don’t like doing in order to go on living — that is, to go on doing things you don’t like doing. Which is stupid. Better to have a short life that is full of what you like doing than a long life spent in a miserable way … See, what we’re doing is we’re bringing up children and educating them to live the same sort of lives we’re living, in order that they may justify themselves and find satisfaction in life by bringing up their children, to bring up their children, to do the same thing … And so, therefore, it’s so important to consider this question: What do I desire?

Hat tip to Brain Pickings

Resist Dystopia: Learn to Enjoy Reading Shakespeare

At the conclusion of Technopoly, Neil Postman lays out his concept of the “loving resistance fighter,” someone who keeps an open heart and a strong hold on the symbols and narratives of liberty, honor, intelligence, etc., that made America (and, by extension, other modern democracies) great, while deliberately resisting the coarsening, dumbing, soul-killing influence of the modern-day totalitarian technocracy.

This essay by Joseph Smigelski, a community college English instructor in Northern California, strikes me as falling right in line with Postman’s vision. It also resonates with Ray Bradbury and Morris Berman: it’s a clear, doable, and direct way of enacting the monastic option amid our Fahrenheit 451-like circumstance.

The other day, I received a letter from a friend who wrote, “Unfortunately, I find him almost impossible to understand…. Is there a secret to comprehending Shakespeare? I’d really like to read him, and any hints would be appreciated.” My friend is not a philistine but a well-read woman who struggled through the major plays in school and has seen various theatrical productions and film versions of them. She obviously respects and values the immortal words of William Shakespeare and would like to join ranks with the many who enjoy reading him. So I was distressed by her candid admission of having such difficulty with his language. I am sure that many of you will sympathize with her and agree in a knee-jerk fashion that, yes, Shakespeare is indeed impossible to understand. But I think the problem is not with William Shakespeare but with you. Before you take offense, let me explain.

The first thing you have to do when confronting Shakespeare is break down the wall of resistance that has been constructed between you and him by a cultural atmosphere fraught with willful misunderstanding. For instance, how many times have you heard someone say that Shakespeare wrote in Old English or Middle English? That right there might be enough to put you off. But both of those claims are patently false … Shakespeare wrote in Modern English, the same language that we speak today … Your problem with understanding Shakespeare is due to his language being poetic. Most of your everyday discourse has become so pedestrian that your ears have become unable to tune in to language that aspires to greater heights. This may or may not be your fault. We all are aware that the state of education in this country is woefully bleak. But why submit to the prevailing philistine attitude without a fight?

… Whatever else you do, be sure to avoid such abominations as the “No Fear Shakespeare” and the “Shakespeare Made Easy” series, both of which should be more aptly titled “The Reader Made Stupid” series.

… [R]emember the old saying: Nothing worth having comes easily. The enjoyment kicks in when you really start to get it, when you finally meet William Shakespeare on his own turf and his language begins to open new doors in your consciousness.

— Joseph Smigelski, “How to Enjoy Reading Shakespeare,” The Huffington Post, April 7, 2010

Technology and schools: Update on a techno-utopian delusion

Throughout the 1990s the Clinton administration pushed hard for the universal integration of computers and information technology throughout America’s public education system, culminating in Bill Clinton’s official presidential call for “A computer in every classroom,” since, in his words, technology is “the great equalizer” for schools. No matter that it was an idea (and ideology) that was basically made up and lacking in any real support. No matter that, as Todd Oppenheimer incisively argued in a now-classic 1997 Atlantic article titled “The Computer Delusion” (and later in his 2003 book-length expansion, The Flickering Mind: The False Promise of Technology in the Classroom and How Learning Can Be Saved), “There is no good evidence that most uses of computers significantly improve teaching and learning, yet school districts are cutting programs — music, art, physical education — that enrich children’s lives to make room for this dubious nostrum, and the Clinton Administration has embraced the goal of ‘computers in every classroom’ with credulous and costly enthusiasm.”

The techno-utopian impulse for America’s schools proved unstoppable on a practical level, and schools en masse, from kindergarten to college, bought into it hook, line, and sinker. The idea prevalent at administrative levels was and — as I can vouch from having spent the last decade-plus working in high school and college settings — still is that technology in and of itself is a Great Thing that will Revolutionize Learning. Even though many individual administrators and teachers are quite savvy about and sensitive to the nuances of the techno-utopian gospel, the overall institutional-cultural pressure is overwhelmingly in the direction of uncritical adoption.

Read the rest of this entry

Beware the American craze for college credentialing

The editors of the always-valuable n+1 have published a penetrating and damning assessment of what’s wrong with the craze for credentials that marks the American economic and educational landscape right now. It’s all the more valuable for putting the whole thing in long-historical perspective.

For the contemporary bachelor or master or doctor of this or that, as for the Ming-era scholar-bureaucrat or the medieval European guildsman, income and social position are acquired through affiliation with a cartel. Those who want to join have to pay to play, and many never recover from the entry fee.

…Over the last thirty years, the university has replaced the labor union as the most important institution, after the corporation, in American political and economic life. As union jobs have disappeared, participation in the labor force, the political system, and cultural affairs is increasingly regulated by professional guilds that require their members to spend the best years of life paying exorbitant tolls and kissing patrician rings. Whatever modest benefits accreditation offers in signaling attainment of skills, as a ranking mechanism it’s zero-sum: the result is to enrich the accreditors and to discredit those who lack equivalent credentials.

Jean Baudrillard once suggested an important correction to classical Marxism: exchange value is not, as Marx had it, a distortion of a commodity’s underlying use value; use value, instead, is a fiction created by exchange value. In the same way, systems of accreditation do not assess merit; merit is a fiction created by systems of accreditation. Like the market for skin care products, the market for credentials is inexhaustible: as the bachelor’s degree becomes democratized, the master’s degree becomes mandatory for advancement. Our elaborate, expensive system of higher education is first and foremost a system of stratification, and only secondly — and very dimly — a system for imparting knowledge.

— “Death by Degrees,” n+1, June 19, 2012

Read the rest of this entry

‘The Twilight Zone’ for teachers: ‘Changing of the Guard’

In 1962 The Twilight Zone ran an episode titled “The Changing of the Guard.” It starred Donald Pleasence (in his first American television appearance) as an elderly literature professor who is forced into retirement and decides to kill himself on Christmas Eve, overcome by the sense that his entire life and career have been futile because, as he sees it, his teaching, which spans three generations of students, has never had a real, lasting impact on anyone. At the last minute, however, a collective supernatural visitation reveals that he’s wrong about this.

The episode feels — in a very good way — like a hybrid of A Christmas Carol, It’s a Wonderful Life and Dead Poets Society. The CBS Video Library describes it like this:

Read the rest of this entry

Education research exposes the theory of multiple intelligences as singularly stupid

Oh, the delicious irony. Or rather the sweet savor of vindication. When I went through the Missouri teacher certification program from 2000 to 2001, the famous “theory of multiple intelligences” was all the rage. It was one of the philosophical and practical touchstones for training new teachers in how to achieve maximum success in educating their students. Introduced by Harvard education professor Howard Gardner in 1983, it holds that there are different kinds of intelligence — as opposed to the previously reigning notion that the term “intelligence” refers to a single, consistent cognitive quality — that traditional education and schooling speak to only one or two of them, and that schools and teachers need to retarget not only their methods but their entire outlook, expectations, and overarching goals in order to account for this.

The trouble was and is that Gardner pretty much made the whole thing up. It’s backed by no evidence. This was whispered throughout the teacher training program at Missouri State University. So was the suspicion that the theory only became universally and enthusiastically embraced because it appeals to contemporary ideological prejudices. But that didn’t change the fact that we were all required to learn it and implement it in our practice lessons. Nor did it change the fact that I was required to attend more than one professional development training seminar about it when I worked for six years in a public high school. Personally, I thought it was kind of ridiculous. So did many of my colleagues.

As of today, the whisper of disagreement with this trendy education school orthodoxy has become a roar. When NPR covers something, you know it’s gone as mainstream as it can go (short of being mentioned on America’s Got Talent, that is):

We’ve all heard the theory that some students are visual learners, while others are auditory learners. And still other kids learn best when lessons involve movement. But should teachers target instruction based on perceptions of students’ strengths? … In fact, an entire industry has sprouted based on learning styles. There are workshops for teachers, products targeted at different learning styles and some schools that even evaluate students based on this theory. This prompted Doug Rohrer, a psychologist at the University of South Florida, to look more closely at the learning style theory. When he reviewed studies of learning styles, he found no scientific evidence backing up the idea. “We have not found evidence from a randomized control trial supporting any of these,” he says, “and until such evidence exists, we don’t recommend that they be used.” [Cognitive scientist Daniel] Willingham suggests it might be more useful to figure out similarities in how our brains learn, rather than differences.

Full story at NPR: “Think You’re an Auditory or Visual Learner? Scientists Say It’s Unlikely”

For context and further illumination, consider reading “Not Every Child Is Secretly a Genius,” published two years ago in The Chronicle of Higher Education:

The appealing elements of the theory [of multiple intelligences] are numerous. It’s “cool,” to start with: The list-like format has great attraction for introductory psychology and education classes. It also seems to jibe well with the common observation that individuals have particular talents. More important, especially for education, it implicitly (although perhaps unintentionally on Gardner’s part) promises that each child has strengths as well as weaknesses … Multiple intelligences put every child on an equal footing, granting the hope of identical value in an ostensible meritocracy. The theory fits well with a number of the assumptions that have dominated educational philosophy for years … The only problem, with all respect to Gardner: There probably is just a single intelligence or capacity to learn, not multiple ones devoted to independent tasks. To varying degrees, some individuals have this capacity, and others do not … [T]he eight intelligences are based more on philosophy than on data.

… A pedagogy designed to identify strong and weak areas of achievement is not a bad idea. But believing that such an approach rests on the existence of multiple intelligences has real risks … Students encouraged to explore their talents in dance or socializing may find themselves slammed against a wall of reality when expected to actually know how to do algebra or read a book in college … It’s time that we begin to work with the reality that we have, not the one we wish we had. To do otherwise would be just plain stupid.

Rhode Island School of Design now requiring incoming freshmen to read H.P. Lovecraft

Is it really possible that a modern-day American college has actively taken steps to transform the experience and education it offers its students into an overtly Lovecraftian affair? Why, yes, it is, much to my jaw-dropped astonishment and delight. Cue the sound of stars aligning.

First, the wide-scope background: As reported by The New York Times in 2007, “Nationwide, hundreds of colleges and universities, large and small, public and private, assign first-year students a book to read over the summer, hoping to create a sense of community and engage students intellectually.”

And now the eldritch case in point: Rhode Island School of Design (RISD), the fine arts and design college located in Providence and abutting Brown University, has joined the national trend by launching a Summer Reading Program this year that requires all incoming freshmen to read the same book. And they’ve picked a title by the Prince of Providence Letters himself:

[T]rue to RISD’s penchant for the idiosyncratic and intriguingly off-center, first-year students won’t be reading The Kite Runner, A Hope in the Unseen or other bestsellers that are among the top picks on college campuses. Instead, they’ll be diving into a 1927 pulp fiction novel written by an author who described his guiding literary principle as “cosmic horror,” and featuring a protagonist who is driven insane by a journey that leads him into a world of sorcerers and the occult.

The Case of Charles Dexter Ward was never published during the lifetime of its author, Rhode Island native H.P. Lovecraft. When it eventually did see the light of day, it was in a 1941 issue of the fantasy/horror magazine Weird Tales. Although it’s anything but a standard selection, RISD faculty say the work is the perfect choice to inaugurate RISD’s Common Reading Program: a work of visually rich fiction that is steeped in Rhode Island history and that tackles complex themes – from the notion of fate and the power of family bloodlines to the dangers of modernization and the limits of scientific inquiry.

Those excerpts come from “RISD Summer Reading? Horrors!”, RISD’s official description of the project. The full piece is well worth your time, especially since it features various faculty members offering their justifications and explanations of this idiosyncratic but utterly appropriate literary choice. It also points out that “an accompanying website will feature drawings and other work by students inspired by Lovecraft’s fiction. The book will be woven into various Orientation programs, with faculty-led discussion groups, a short film about the author and tours of the city focusing on sites and landmarks identified in the book.” That website, not incidentally, is RISD Common Reading, which in addition to describing the program contains an “About H.P. Lovecraft” page that consists solely of a link to the Lovecraft bio at Donovan Loucks’ definitive Lovecraft site, as well as a link to the online text of Charles Dexter Ward.

The Providence Journal ran a story on the program today, and quoted RISD’s Daniel Cavicchi, head of the Department of History, Philosophy, and the Social Sciences, about the college’s choice:

“Well, it’s not a typical choice,” RISD’s Daniel Cavicchi acknowledged with a chuckle. “Most reading programs assign books that deal with contemporary issues, and we certainly considered some of those. But this one resonated the most, I think, with everyone. It’s different in that it’s a horror novel. But we thought it would be a really good idea to have something that would engage the students in thinking about their new home of Providence.”

…“And we thought it had many different entry points, many themes,” said Cavicchi, who suggested the book — which he had read in high school long before setting foot in Providence — to the rest of the committee that picked it. Themes like the role of place in creative inspiration; the point of knowing one’s personal history; the ethics of manipulating nature; the limits of science and rationality… “To me, the book is very layered. There is the horror story, but then there are all these other elements in and around the horror story.”

(See full story at The Providence Journal.)

Beautiful. Amazing. Wonderful. When I was an undergraduate at the University of Missouri-Columbia, I had to make up my own Lovecraftian curriculum. By “had to” I mean I was driven to do so by a positively daimonic compulsion and fascination, and by “Lovecraftian curriculum” I mean Lovecraft’s complete fiction, as ordered in hardcover from Arkham House (to be followed by his selected letters after I graduated), as well as everything by and about him that was housed in the university’s library. This included Donald Burleson’s H.P. Lovecraft: A Critical Study and Lovecraft: Disturbing the Universe, Maurice Lévy’s Lovecraft: A Study in the Fantastic, Darrell Schweitzer’s The Dream-Quest of H.P. Lovecraft, the Joshi-edited H.P. Lovecraft: Four Decades of Criticism, and more. But this was all, as I said, entirely self-driven and self-conducted. I think I would have suspected that I had accidentally transitioned into an alternate universe à la the cosmic slips in Robert Anton Wilson’s Schrödinger’s Cat if I had actually been required to read Lovecraft at Mizzou.

Pardon me if I hear the hoofbeats of apocalyptic horsemen approaching. Or maybe that crackling sound is hell freezing over. In any case, life is cool, and RISD’s incoming freshmen are receiving an education indeed.

From Google’s “in-house philosopher,” a beautiful credo in defense of studying the humanities

Here at The Teeming Brain I’ve gone on at some length about the disastrous/dystopian trends in contemporary American education, including, especially, the rise of the techno-corporate consumer model that assigns a purely economic raison d’être to higher education. (See, for example, my “America’s Colleges at a Crossroads” series and additional articles.) Today I’m fascinated, and rather psyched, to discover a smart and forceful statement in favor of pursuing a humanities-oriented education, written by somebody who earned a grad degree at MIT and then launched into a lucrative career in computer programming, only to abandon it a few years later to earn a Ph.D. in philosophy because his technological interests organically led him to a passionate personal focus on philosophical matters.

Damon Horowitz’s bio says he “is currently in-house philosopher at Google” — an intriguing job title if ever I heard one — and his essay published yesterday (July 17) at The Chronicle of Higher Education is described as “an excerpt of a keynote address he gave in the spring at the BiblioTech conference at Stanford University.” A quick Google search reveals that the address itself was titled “Why You Should Quit Your Technology Job and Get a Humanities Ph.D.”

Here are some choice highlights from a highlight-filled essay that’s quotable almost in toto:

I wanted to better understand what it was about how we were defining intelligence that was leading us astray: What were we failing to understand about the nature of thought in our attempts to build thinking machines? And, slowly, I realized that the questions I was asking were philosophical questions — about the nature of thought, the structure of language, the grounds of meaning. So if I really hoped to make major progress in AI, the best place to do this wouldn’t be another AI lab. If I really wanted to build a better thinker, I should go study philosophy.


In learning the limits of my technologist worldview, I didn’t just get a few handy ideas about how to build better AI systems. My studies opened up a new outlook on the world. I would unapologetically characterize it as a personal intellectual transformation: a renewed appreciation for the elements of life that are not scientifically understood or technologically engineered.

In other words: I became a humanist.


Maybe you, too, are disposed toward critical thinking. Maybe, despite the comfort and security that your job offers, you, too, have noticed cracks in the technotopian bubble.

Maybe you are worn out by endless marketing platitudes about the endless benefits of your products; and you’re not entirely at ease with your contribution to the broader culture industry. Maybe you are unsatisfied by oversimplifications in the product itself. What exactly is the relationship created by “friending” someone online? How can your online profile capture the full glory of your performance of self? Maybe you are cautious about the impact of technology. You are startled that our social-entertainment Web sites are playing crucial roles in global revolutions. You wonder whether those new tools, like any weapons, can be used for evil as well as good, and you are reluctant to engage in the cultural imperialism that distribution of a technology arguably entails.


[D]o you really value your mortgage more than the life of the mind? What is the point of a comfortable living if you don’t know what the humanities have taught us about living well? If you already have a job in the technology industry, you are already significantly more wealthy than the vast majority of our planet’s population. You already have enough.

If you are worried about your career, I must tell you that getting a humanities Ph.D. is not only not a danger to your employability, it is quite the opposite. I believe there is no surer path to leaping dramatically forward in your career than to earn a Ph.D. in the humanities. Because the thought leaders in our industry are not the ones who plodded dully, step by step, up the career ladder. The leaders are the ones who took chances and developed unique perspectives.

Complete text at The Chronicle‘s site: “From Technologist to Philosopher”

What’s more, Horowitz’s speech is on YouTube.

On clarity of language, thought, consciousness, and being

As a professional writer and English teacher for the past decade, I’ve been prone to think frequently about the role of language in life. One of the recurring themes in my thoughts — occasioned at least in part by some of my grad school studies in philosophy, anthropology, and sociolinguistics, and also by my being confronted at my job every day by extremely rough and problematic uses of the English language that are damned difficult to address — is the question of “correct” language. Is the very idea of correctness in this area just a culturally imperialistic metanarrative? Is it just arbitrary in the grand scheme of things? Or does it really get at a crucial truth?

And beyond mere technical correctness — grammar etc. — what about matters of rhetoric, style, and syntactical choices? How important are they not just to academic matters but to life in general, and not just in a utilitarian sense but a deeply human one?

A recent essay in The New York Review of Books offers some real fodder for reflection on all of these things. In “Words” (July 15), British academic Tony Judt talks about the vast significance of language in both his own personal life and the life of human culture at large.  The essay is fascinating and poignant — fascinating because of the insight Judt brings to bear on the relationship between the clear and skillful deployment of language (in both print and speech) and the achievement of a general clarity of life and thought, and poignant because he caps the whole thing off by talking about a progressive neurological disorder from which he suffers, and which will inevitably rob him of speech. “Translating being into thought,” he says, “thought into words, and words into communication will soon be beyond me and I shall be confined to the rhetorical landscape of my interior reflections.”

He explains that he was brought up in a family where talking and debating were centrally important, and was processed through the British elementary school system of the 1950s, when “‘Good’ English was at its peak” and “We were instructed in the unacceptability of even the most minor syntactical transgression.”

The heart of the essay appears in his comments about the close connection between clarity of language and clarity of thought, and the way this connection has been devalued over the past half century of public life:

Sheer rhetorical facility, whatever its appeal, need not denote originality and depth of content.

All the same, inarticulacy surely suggests a shortcoming of thought. This idea will sound odd to a generation praised for what they are trying to say rather than the thing said. Articulacy itself became an object of suspicion in the 1970s: the retreat from “form” favored uncritical approbation of mere “self-expression,” above all in the classroom. But it is one thing to encourage students to express their opinions freely and to take care not to crush these under the weight of prematurely imposed authority. It is quite another for teachers to retreat from formal criticism in the hope that the freedom thereby accorded will favor independent thought: “Don’t worry how you say it, it’s the ideas that count.”

Forty years on from the 1960s, there are not many instructors left with the self-confidence (or the training) to pounce on infelicitous expression and explain clearly just why it inhibits intelligent reflection. The revolution of my generation played an important role in this unraveling: the priority accorded the autonomous individual in every sphere of life should not be underestimated — “doing your own thing” took protean form.

Today “natural” expression — in language as in art — is preferred to artifice. We unreflectively suppose that truth no less than beauty is conveyed more effectively thereby. Alexander Pope knew better. For many centuries in the Western tradition, how well you expressed a position corresponded closely to the credibility of your argument. Rhetorical styles might vary from the spartan to the baroque, but style itself was never a matter of indifference. And “style” was not just a well-turned sentence: poor expression belied poor thought. Confused words suggested confused ideas at best, dissimulation at worst.

He goes on from this to observe that in the modern social media milieu of Facebook, Twitter, MySpace, and texting, “pithy allusion substitutes for exposition,” and people who live under the reign of an overweening consumerism begin to talk like text messages.

The prognosis he offers is unequivocal:

This ought to worry us. When words lose their integrity so do the ideas they express. If we privilege personal expression over formal convention, then we are privatizing language no less than we have privatized so much else. “When I use a word,” Humpty Dumpty said, in rather a scornful tone, “it means just what I choose it to mean—neither more nor less.” “The question is,” said Alice, “whether you can make words mean so many different things.” Alice was right: the outcome is anarchy.

As I said, this all hits home because of my personal and professional positions as a writer and teacher. And also because of my philosophical and spiritual proclivities. I’m deeply influenced by a loose Zen-Christian nondual school of thinking, seeing, and knowing, and of course this involves the recognition that reality in itself is fundamentally unspeakable, fundamentally a matter of pure being-ness and first-person apprehension. “The menu isn’t the meal.” “The map isn’t the territory.” Don’t get so distracted by the finger pointing to the moon that you miss the moon itself, the “finger” being words and concepts and the “moon” being the living realities they symbolize. And so on.

For years I struggled with the question of whether this semi-existentialist recognition of the abstraction of language and thought from real being, while valid and crucial, might not entail the necessary conclusion that language is unimportant. That's one of the major reasons, among all the others, that Judt's insights are so gripping: because, with his neurological disorder, he is faced with the imminent loss of his ability to communicate in words. And this really and truly does strike him — and me — as a loss.

In point of fact, reality’s transcendence of language means that the real world and life in general should be infinitely expressible in words. No matter that the words and concepts are relative realities instead of absolute ones, and symbolic realities instead of existential ones. This very fact means a person should ideally be able to describe his or her thoughts and experiences in a literally endless variety of linguistic variations, all of them circling around and pointing toward the realities themselves, and recreating in the mind and affect of the equally linguistically astute listener or reader an approximation of those very realities, thus encouraging a “see for yourself” transition to direct looking. Not to be able to do this, to lack the skills and sensibility to state and restate our experience, is to be locked away in a prison of muteness.

I recall being exhilarated as an undergraduate when I read Robert Anton Wilson’s The Widow’s Son and came to the fantastic philosophical passage in which — as I recall (it’s been a few years) — Wilson presents a hypothetical scene of humanity’s first explosion of self-consciousness, wherein an early human spontaneously develops the first-ever capacity for self-conscious reflection, and is thus able to recognize the beauty of a flower or sunset for the first time, and exclaims to another human with gasping wonder and delight, “Oh, look! Look at this!” Writes Wilson, “And beauty was created in a world that had been flat and dead and meaningless until that moment.”

The entire history of language proceeds from that delightful leap in self-consciousness, from that titanically freeing and empowering ability to step back from life and really see it, and to symbolize it in some form that’s communicable to others, so that they, too, can see for themselves by using the symbol for its proper purpose: as the Taoist’s “finger pointing toward the moon,” which directs attention away from itself and toward reality, serving only as a bridge. (See my “The Evolution of Consciousness and the Alchemy of Language” for more along these lines.)

I finished reading Colin Wilson's The Philosopher's Stone recently, and the entire philosophical thrust of that ecstatic novel is the value of being able to step back from immediate experience and grasp wider meanings. Wilson writes, "So poets, philosophers, scientists are always having these moments in which they grasp enormous meanings." He even deliberately presents an instance in which a dull and prosaic-minded character suffers a head wound that accidentally endows him with the ability to induce "value experiences" (the novel's fictionalized version of Maslow's "peak experiences") at will, simply so that he (Wilson) can make this very point about the importance of linguistic expression: "We had found someone who could plunge into ecstasy at a moment's notice. Here was a Wordsworth without the power of self-expression, a Traherne who could only say 'Gor, ain't it pretty.'"

So all of this is just a longish and rambling rumination to get around to saying this: that Judt is right. The power to use language with self-conscious correctness, and not just that, but with rhetorical beauty, is a real power with real value because it really does allow “the translation of being into thought, thought into words, and words into communication” — which means your and my subjectivity becomes sharable. Our walled-off world of interiority becomes something we can communicate to someone else, and they can communicate theirs to us. There may be, in fact there truly are, wordless ways of doing the same thing — but words are one of the finest and most effective means we have of doing this. (See yesterday’s post about fictional entertainments and their power to cultivate empathy.)

Even more: Words, like self-consciousness, can actually enhance primary experience. The capacity for self-consciousness and the capacity for language being inextricably interlinked, it’s simply the case that the better your ability to reflect upon and express your experience consciously and linguistically, the more fully you know that experience. The very act of reflection creates the reflector. It’s bound up with the fact of individual subjecthood itself, as any student of the Western intellectual, philosophical, political, and social tradition, not to mention any student of Buddhism, can tell you. And the achievement and refinement of that ego self, despite the undeniable and enormous problems it has created — everything having to do with the “nightmare” of recorded/civilized history from which Joyce was struggling to awake — is one of the greatest quantum leaps in the history of the universe’s evolution. It’s the universe becoming awake to itself, and our purpose lies not in fleeing from the ego but in fulfilling the purpose for which it arose. See the pre/trans fallacy famously articulated by Ken Wilber. See the biblical Jesus: “I come not to destroy the law but to fulfill it.”

Our culture now presents us with an opportunity either to rise to, and even above, the opportunity embodied in words and language, or to sink below it. This is what I and every other writer and/or teacher are charged with addressing. We're not just trying to enhance students' communication skills in order to enhance their employment prospects. We're helping to focus their being, to focus Being itself, for the ultimate fulfillment of its purpose, by helping them to develop their linguistic capacities and conscious interior sensibilities to the greatest possible extent.