Traditional Amish Buggy. By Ad Meskens (Own work) [CC BY-SA 3.0 (http://creativecommons.org/licenses/by-sa/3.0) or GFDL (http://www.gnu.org/copyleft/fdl.html)], via Wikimedia Commons
When I was a kid, the first time I ever heard of the Amish was when I watched the movie Witness. Much later, in the first decade of the aughts, I lived for seven years right in the heart of Missouri Amish country, where horse-drawn buggies on the shoulder of the road were a frequent sight, and where I regularly rubbed shoulders with Amish people in stores, at garage sales, and elsewhere. Between those two points, I took an undergraduate sociology course at Mizzou titled simply “The Old Order Amish,” taught by a highly respected professor who had himself grown up in an Old Order Mennonite community (and who, as I just now discovered, died only two months ago). So all of that amplifies my personal interest in this brief and thoughtful reflection by University of Wyoming English professor Arielle Zibrak on the possible meaning and lessons of “Amish time” for a 21st-century technological society that has become obsessed with future visions and intimations of collapse and dystopia:
Wendell Berry wrote that American society’s inability to see the Amish for what they are is indicative of the most basic flaws of the American progress narrative. I think we’re beginning to see the frayed edges of that narrative’s unraveling. While the future used to appear to us as Marty McFly’s hoverboard, robo cops, and casual space travel, it now seems more frequently to come in the form of close-knit roving communities that communicate via flare and cook game over open fires, e.g. McCarthy’s The Road or Frank Darabont’s “The Walking Dead.”
We usually cast these fictional futures as dystopias. But if Margaret Atwood is right — and she should know — “within every dystopia there’s a little utopia.” And I can’t help but wonder if, as our vision of the future continues to shift, our view of the Amish will shift with it. For now, the best I can do is to try to learn from the Amish.
These days, after another move, I’m living out west, where my little sedan is itself a buggy parked alongside the giant pickups at the superstore. I can’t be around the Amish anymore in the sense of space. But I can try to be closer to where they are in the sense of time, which is neither the past nor precisely the future (even if there’s a zombie apocalypse or the hipsters keep defecting to dairy farms and haberdasheries) but is squarely inside of the mystery of the present.
MORE: “On Amish Time”
“Fire of Troy” by Kerstiaen de Keuninck (Coninck), 17th cent. [Public domain], via Wikimedia Commons
NOTE: This post was originally published in January 2007 in a different form. Based on various circumstances — including the publication just yesterday of a post titled “Collective Brainwashing & Modern Concentration Camps” over at Daily Grail, which calls out the below-transcribed portion of My Dinner with Andre — now seems like a good time to re-present this in a slightly revised and enhanced form.
* * *
One of the most nightmarish things about a dark age is the degradation it entails for life’s overall tone, not least in the dehumanization that occurs when a people’s intellectual, emotional, moral, spiritual, political, social, and cultural life is reduced to a ghastly level of brutishness and ignorance. As is now plainly evident all around us in the industrialized world of present-day info-technocracy, this coarsening of life can occur even in circumstances of relative material prosperity. Such an age doesn’t have to resemble the one that gripped Europe in the aftermath of Rome’s fall, when starvation and plague were rampant and most people barely scraped by at a miserable subsistence level. A dark age can unfold right in the middle of outward conditions that appear enlightened to those who don’t look too closely or deeply.
Sometimes it’s oddly comforting to dwell on the words of people who have seen today’s dark age of dehumanization unfolding. When it feels like the world is full of robots instead of people, or when it begins to feel like we really are living on the planet of the apes (as Robert Anton Wilson liked to put it), it can be a powerfully affirming experience to be reminded that other people have observed the same thing.
With this in mind, here are three of my own favorite articulations of these things, which, based on my own experience, I recommend you ingest, digest, memorize, and keep mentally handy for reciting to yourself on a rainy day. There are no solutions offered here. There’s just the satisfaction of confronting grim realities and looking them full in the face.

Read the rest of this entry
“The Madhouse” by Francisco Goya [Public domain], via Wikimedia Commons
So brilliant: an implicitly ironic but outwardly straight-faced reading of the DSM-5 as a dystopian horror novel, complete with a quasi-Ligottian assessment of the book’s narrative voice and view of humanity.
Great dystopia isn’t so much fantasy as a kind of estrangement or dislocation from the present; the ability to stand outside time and see the situation in its full hideousness. The dystopian novel doesn’t necessarily have to be a novel. . . . Something has gone terribly wrong in the world; we are living the wrong life, a life without any real fulfillment. The newly published DSM-5 is a classic dystopian novel in this mold.
Here, we have an entire book, something that purports to be a kind of encyclopedia of madness, a Library of Babel for the mind, containing everything that can possibly be wrong with a human being. . . . DSM-5 arranges its various strains of madness solely in terms of the behaviors exhibited. This is a recurring theme in the novel, while any consideration of the mind itself is entirely absent. . . . The idea emerges that every person’s illness is somehow their own fault, that it comes from nowhere but themselves: their genes, their addictions, and their inherent human insufficiency. We enter a strange shadow-world where for someone to engage in prostitution isn’t the result of intersecting environmental factors (gender relations, economic class, family and social relationships) but a symptom of “conduct disorder,” along with “lying, truancy, [and] running away.” A mad person is like a faulty machine. The pseudo-objective gaze only sees what they do, rather than what they think or how they feel. A person who shits on the kitchen floor because it gives them erotic pleasure and a person who shits on the kitchen floor to ward off the demons living in the cupboard are both shunted into the diagnostic category of encopresis. It’s not just that their thought-processes don’t matter, it’s as if they don’t exist. The human being is a web of flesh spun over a void.
. . . The word “disorder” occurs so many times that it almost detaches itself from any real signification, so that the implied existence of an ordered state against which a disorder can be measured is almost forgotten. Throughout the novel, this ordered normality never appears except as an inference; it is the object of a subdued, hopeless yearning. With normality as a negatively defined and nebulously perfect ideal, anything and everything can then be condemned as a deviation from it. . . . If there is a normality here, it’s a state of near-catatonia. DSM-5 seems to have no definition of happiness other than the absence of suffering. The normal individual in this book is tranquilized and bovine-eyed, mutely accepting everything in a sometimes painful world without ever feeling much in the way of anything about it. The vast absurd excesses of passion that form the raw matter of art, literature, love, and humanity are too distressing; it’s easier to stop being human altogether, to simply plod on as a heaped collection of diagnoses with a body vaguely attached.
. . . For all the subtlety of its characterization, the book doesn’t just provide a chilling psychological portrait, it conjures up an entire world. The clue is in the name: On some level we’re to imagine that the American Psychiatric Association is a body with real powers, that the “Diagnostic and Statistical Manual” is something that might actually be used, and that its caricature of our inner lives could have serious consequences. Sections like those on the personality disorders offer a terrifying glimpse of a futuristic system of repression, one in which deviance isn’t furiously stamped out like it is in Orwell’s unsubtle Oceania, but pathologized instead. Here there’s no need for any rats, and the diagnostician can honestly believe she’s doing the right thing; it’s all in the name of restoring the sick to health. DSM-5 describes a nightmare society in which human beings are individuated, sick, and alone. For much of the novel, what the narrator of this story is describing is its own solitude, its own inability to appreciate other people, and its own overpowering desire for death — but the real horror lies in the world that could produce such a voice.
MORE: “Book of Lamentations”
For more on the DSM-5 and the controversy it has elicited, see this.
Here’s a double dose of dystopian cheer to accompany a warm and sunny Monday afternoon (or at least that’s the weather here in Central Texas).
First, Adam Kirsch, writing for The New Republic, in a piece dated May 2:
Everyone who ever swore to cling to typewriters, record players, and letters now uses word processors, iPods, and e-mail. There is no room for Bartlebys in the twenty-first century, and if a few still exist they are scorned. (Bartleby himself was scorned, which was the whole point of his preferring not to.) Extend this logic from physical technology to intellectual technology, and it seems almost like common sense to say that if we are not all digital humanists now, we will be in a few years. As the authors of Digital_Humanities write, with perfect confidence in the inexorability — and the desirability — of their goals, “the 8-page essay and the 25-page research paper will have to make room for the game design, the multi-player narrative, the video mash-up, the online exhibit and other new forms and formats as pedagogical exercises.”
. . . The best thing that the humanities could do at this moment, then, is not to embrace the momentum of the digital, the tech tsunami, but to resist it and to critique it. This is not Luddism; it is intellectual responsibility. Is it actually true that reading online is an adequate substitute for reading on paper? If not, perhaps we should not be concentrating on digitizing our books but on preserving and circulating them more effectively. Are images able to do the work of a complex discourse? If not, and reasoning is irreducibly linguistic, then it would be a grave mistake to move writing away from the center of a humanities education.
. . . The posture of skepticism is a wearisome one for the humanities, now perhaps more than ever, when technology is so confident and culture is so self-suspicious. It is no wonder that some humanists are tempted to throw off the traditional burden and infuse the humanities with the material resources and the militant confidence of the digital. The danger is that they will wake up one morning to find that they have sold their birthright for a mess of apps.
Second, Will Self, writing for The Guardian, in a piece also dated May 2:
The literary novel as an art work and a narrative art form central to our culture is indeed dying before our eyes. Let me refine my terms: I do not mean narrative prose fiction tout court is dying — the kidult boywizardsroman and the soft sadomasochistic porn fantasy are clearly in rude good health. And nor do I mean that serious novels will either cease to be written or read. But what is already no longer the case is the situation that obtained when I was a young man. In the early 1980s, and I would argue throughout the second half of the last century, the literary novel was perceived to be the prince of art forms, the cultural capstone and the apogee of creative endeavour. The capability words have when arranged sequentially to both mimic the free flow of human thought and investigate the physical expressions and interactions of thinking subjects; the way they may be shaped into a believable simulacrum of either the commonsensical world, or any number of invented ones; and the capability of the extended prose form itself, which, unlike any other art form, is able to enact self-analysis, to describe other aesthetic modes and even mimic them. All this led to a general acknowledgment: the novel was the true Wagnerian Gesamtkunstwerk.
. . . [T]he advent of digital media is not simply destructive of the codex, but of the Gutenberg mind itself. There is one question alone that you must ask yourself in order to establish whether the serious novel will still retain cultural primacy and centrality in another 20 years. This is the question: if you accept that by then the vast majority of text will be read in digital form on devices linked to the web, do you also believe that those readers will voluntarily choose to disable that connectivity? If your answer to this is no, then the death of the novel is sealed out of your own mouth.
. . . I believe the serious novel will continue to be written and read, but it will be an art form on a par with easel painting or classical music: confined to a defined social and demographic group, requiring a degree of subsidy, a subject for historical scholarship rather than public discourse. . . . I’ve no intention of writing fictions in the form of tweets or text messages — nor do I see my future in computer-games design. My apprenticeship as a novelist has lasted a long time now, and I still cherish hopes of eventually qualifying. Besides, as the possessor of a Gutenberg mind, it is quite impossible for me to foretell what the new dominant narrative art form will be — if, that is, there is to be one at all.
Image: painting by John White Alexander (1856–1915); photograph by Andreas Praefcke (own work) [Public domain], via Wikimedia Commons
In my recent post about Jeff Kripal’s article “Visions of the Impossible,” I mentioned that biologist and hardcore skeptical materialist Jerry Coyne published a scathing response to Jeff’s argument soon after it appeared. For those who would like to keep up with the conversation, here’s the heart of Coyne’s response (which, in its full version, offers direct replies to the several long passages that he quotes from Jeff’s piece):
For some reason the Chronicle of Higher Education, a weekly publication that details doings (and available jobs) in American academia, has shown a penchant for bashing science and promoting anti-materialist views. . . . I’m not sure why that is, but I suspect it has something to do with supporting the humanities against the dreaded incursion of science — the bogus disease of “scientism.”
That’s certainly the case with a big new article in the Chronicle, “Visions of the impossible: how ‘fantastic’ stories unlock the nature of consciousness,” by Jeffrey J. Kripal, a professor of religious studies at Rice University in Texas. Given his position, it’s not surprising that Kripal’s piece is an argument about Why There is Something Out There Beyond Science. And although the piece is long, I can summarize its thesis in two sentences (these are my words, not Kripal’s):
“People have had weird experiences, like dreaming in great detail about something happening before it actually does; and because these events can’t be explained by science, the most likely explanation is that they are messages from some non-material realm beyond our ken. If you combine that with science’s complete failure to understand consciousness, we must conclude that naturalism is not sufficient to understand the universe, and that our brains are receiving some sort of ‘transhuman signals.'”
That sounds bizarre, especially for a distinguished periodical, but anti-naturalism seems to be replacing postmodernism as the latest way to bash science in academia.
. . . But our brain is not anything like a radio. The information processed in that organ comes not from a transhuman ether replete with other people’s thoughts, but from signals sent from one neuron to another, ultimately deriving from the effect of our physical environment on our senses. If you cut your optic nerves, you go blind; if you cut the auditory nerves, you become deaf. Without such sensory inputs, whose mechanisms we understand well, we simply don’t get information from the spooky channels promoted by Kripal.
When science manages to find reliable evidence for that kind of clairvoyance, I’ll begin to pay attention. Until then, the idea of our brain as a supernatural radio seems like a kind of twentieth-century alchemy—the resort of those whose will to believe outstrips their respect for the facts.
Full article: “Science Is Being Bashed by Academic Who Should Know Better”
(An aside: Is it just me, or does Coyne, in his second paragraph above, effectively insult and dismiss the entire field of religious studies and everyone who works in it?)
Jeff responded five days later in a second piece for the Chronicle, where he met Coyne’s criticisms head-on with words like these:

Read the rest of this entry