Category Archives: Education
Interesting: Last month The Chronicle of Higher Education published an article by Tom Bartlett, its senior science editor, titled “Spoiled Science.” It’s about the way Cornell University’s renowned Food and Brand Lab has taken a credibility hit in the wake of revelations of multiple statistical anomalies in papers co-authored by its director, Brian Wansink, who has also become a celebrity scholar thanks to appearances on the likes of 60 Minutes and Rachael Ray. The heart of the article is laid out in this paragraph:
The slow-motion credibility crisis in social science has taken the shine off a slew of once-brilliant reputations and thrown years of research into doubt. It’s also led to an undercurrent of anxiety among scientists who fear that their labs and their publication records might come under attack from a feisty cadre of freelance critics. The specifics of these skirmishes can seem technical at times, with talk of p-values and sample sizes, but they go straight to the heart of how new knowledge is created and disseminated, and whether some of what we call science really deserves that label.
In the middle of the piece, Bartlett suddenly mentions Daryl Bem’s famous precognition research from a few years ago and subjects it to a brief but withering moment of scorn:
This isn’t the first time Cornell has had to cope with a blow to its research reputation. In 2011, Daryl Bem, an emeritus professor of psychology, published a paper in which he showed, or seemed to show, that subjects could anticipate pornographic images before they appeared on a computer screen. If true, Bem’s finding would upend what we understand about the nature of time and causation. It would be a big deal. That paper, “Feeling the Future,” was widely ridiculed and failed to replicate, though Bem himself has stood by his results.
Yesterday Bem responded with a letter to the Chronicle titled “In Defense of Research on Precognition,” in which he sets the record straight. He begins by pointing out that his paper was published in the Journal of Personality and Social Psychology, which has a rejection rate of 80 percent, and that the paper was approved by four referees and two editors before publication.
He then points out that Bartlett’s claim about the experiment’s failure to replicate is patently false: “In 2015, three colleagues and I published a follow-up meta-analysis of 90 such experiments conducted by 33 laboratories in 14 countries. The results strongly support my original findings. In particular, the independent replications are robust and highly significant statistically.”
Finally, he shares this salient fact:
Bartlett further asserts that this research was widely ridiculed and constituted a blow to Cornell’s research reputation. But it was Cornell’s own public-affairs office that was proactively instrumental in setting up interviews with the press and other media following the publication of the original article. New Scientist, Discover, Wired, New York Magazine, and Cornell’s own in-house publications all described the research findings seriously and without ridicule.
Me, I’m just fascinated to see mentions of such matters cropping up repeatedly in a place like The Chronicle of Higher Education, whose publication of essays by Jeffrey Kripal on the paranormal I discussed at some length a few years ago. (And of course I’d be lying if I denied that I simply enjoyed reading Bem’s refutation of Bartlett’s belittling.)
Joan W. Scott in The Nation:
“Civility” has become a watch word for academic administrators. Earlier this year, Inside Higher Ed released a survey of college and university chief academic officers, which found that “a majority of provosts are concerned about declining faculty civility in American higher education.” Most of these provosts also “believe that civility is a legitimate criterion in hiring and evaluating faculty members,” and most think that faculty incivility is directed primarily at administrators. The survey brought into the open what has perhaps long been an unarticulated requirement for promotion and tenure: a certain kind of deference to those in power.
But what exactly is civility — and is it a prerequisite for a vibrant intellectual climate? As it turns out, the definitions on offer are porous and vague. University of Illinois professor Cary Nelson, who supported the decision not to hire Salaita, sees it as a “reluctance to indulge in mutual hatred,” thereby placing a limit on violence and campus warfare. Others stress courteous and respectful behavior and its concomitants: comfort, safety, and security. The University of Missouri’s “Show Me Respect” project includes a “toolbox” that offers 20 ways to achieve civility (including the reminder to “do unto others as you would have them do unto you”). At the University of Wisconsin, Oshkosh, a 2011 conference offered these words of wisdom: “Academic freedom and free speech require open, safe, civil and collegial campus environments.” And a statement from a University of Maryland discussion paper on civility in 2013 defines it “simply as ‘niceness to others.’… Additionally, the definition may be used broadly to spur discussions on how ‘nice guys and gals finish first’ and how cordiality and kindness can be tracked across campus to ensure faculty, staff, and students are indeed playing nice.”
The attempts to secure the comfort and safety of students — now recognized for their economic value as paying clients who need to be satisfied — are subjugating language and thinking to their own ends. These dictates seem to know no limits and are evident in other policies, such as the call for “trigger warnings” in college classrooms. Professors are being asked by the representatives of some students or groups — and by the anxious deans who rush to satisfy their complaints — to avoid assigning material that might provoke flashbacks or even attention to discomforting violence. The demand for trigger warnings has the same intent as the emphasis on comfort and civility in the Salaita affair and the statement to the UC Berkeley community by Dirks: to stifle thought on the part of both teachers and students who might otherwise express opinions that could make others “uncomfortable.”
All of these efforts presume a certain benign self-evidence for the use of the term “civility.” As the University of Maryland statement puts it, “niceness” is “easily understood by all parties”: We know civility when we see it. Left aside in these invocations are not only interpretive differences among individuals and groups (one man’s or woman’s presumed civility may strike another as uncivil), but also the history of the term. Although, as with any word, the meanings of “civility” have changed, the concept still carries traces of its earlier use. I’d argue further that although the contexts and specific applications have varied over time, the notion of civility consistently establishes relations of power whenever it is invoked. Moreover, it is always the powerful who determine its meaning — one that, whatever its specific content, demeans and delegitimizes those who do not meet its test.
MORE: “The New Thought Police”
From biblical theologian Wesley Hill in First Things:
Irrelevant reading is the sort of reading you do when you pick up a book that, you fear, has nothing whatever to say to your present concern, the thing that’s driving you to want to read in the first place. Say you’re a teacher and you want to learn more about your craft. You may pick up Ken Bain’s marvelous book What the Best College Teachers Do and read it dutifully, annotating the margins and writing pieces of advice to yourself about next year’s lesson plans. But then, on your nightstand, say, you plop Chaim Potok’s novel The Promise down, since you’ve told yourself you’d read it ever since finishing its prequel The Chosen a couple of years ago. Late one night, you stay up and finish it. And you read that gripping scene in the yeshiva where the protagonist Reuven is quizzed mercilessly about arcana from the Talmud, and suddenly, you see not only the kind of teacher you need to be (Socratic, inspiring, relishing the mysterious complexity of your subject) but also find the inspiration you need to finish that next lecture. Your supposedly irrelevant fiction reading becomes more, or at least as, important to you as your allegedly more relevant textbook. And you grasp intuitively what my friend Luke Neff once put into a pithy saying: “Cultural omnivores make the best teachers.”
. . . Not all reading should be “irrelevant.” Some should be assiduous study of the key texts in one’s field. Other reading, the especially pleasurable kind, should be purely recreational. But when one is reading widely, there’s a special kind of delight that emerges when an evidently immaterial book suddenly intersects with what you most need to know in that moment. There’s no telling when such a moment may arrive, so it’s best to keep up a habit of irrelevant reading.
I sometimes tell my students the most important reading they’ll do for one of my classes at the seminary where I teach may well be the reading I never thought to assign.
MORE: “In Praise of Irrelevant Reading”
Here’s a double dose of dystopian cheer to accompany a warm and sunny Monday afternoon (or at least that’s the weather here in Central Texas).
First, Adam Kirsch, writing for The New Republic, in a piece dated May 2:
Everyone who ever swore to cling to typewriters, record players, and letters now uses word processors, iPods, and e-mail. There is no room for Bartlebys in the twenty-first century, and if a few still exist they are scorned. (Bartleby himself was scorned, which was the whole point of his preferring not to.) Extend this logic from physical technology to intellectual technology, and it seems almost like common sense to say that if we are not all digital humanists now, we will be in a few years. As the authors of Digital_Humanities write, with perfect confidence in the inexorability — and the desirability — of their goals, “the 8-page essay and the 25-page research paper will have to make room for the game design, the multi-player narrative, the video mash-up, the online exhibit and other new forms and formats as pedagogical exercises.”
. . . The best thing that the humanities could do at this moment, then, is not to embrace the momentum of the digital, the tech tsunami, but to resist it and to critique it. This is not Luddism; it is intellectual responsibility. Is it actually true that reading online is an adequate substitute for reading on paper? If not, perhaps we should not be concentrating on digitizing our books but on preserving and circulating them more effectively. Are images able to do the work of a complex discourse? If not, and reasoning is irreducibly linguistic, then it would be a grave mistake to move writing away from the center of a humanities education.
. . . The posture of skepticism is a wearisome one for the humanities, now perhaps more than ever, when technology is so confident and culture is so self-suspicious. It is no wonder that some humanists are tempted to throw off the traditional burden and infuse the humanities with the material resources and the militant confidence of the digital. The danger is that they will wake up one morning to find that they have sold their birthright for a mess of apps.
Second, Will Self, writing for The Guardian, in a piece also dated May 2:
The literary novel as an art work and a narrative art form central to our culture is indeed dying before our eyes. Let me refine my terms: I do not mean narrative prose fiction tout court is dying — the kidult boywizardsroman and the soft sadomasochistic porn fantasy are clearly in rude good health. And nor do I mean that serious novels will either cease to be written or read. But what is already no longer the case is the situation that obtained when I was a young man. In the early 1980s, and I would argue throughout the second half of the last century, the literary novel was perceived to be the prince of art forms, the cultural capstone and the apogee of creative endeavour. The capability words have when arranged sequentially to both mimic the free flow of human thought and investigate the physical expressions and interactions of thinking subjects; the way they may be shaped into a believable simulacrum of either the commonsensical world, or any number of invented ones; and the capability of the extended prose form itself, which, unlike any other art form, is able to enact self-analysis, to describe other aesthetic modes and even mimic them. All this led to a general acknowledgment: the novel was the true Wagnerian Gesamtkunstwerk.
. . . [T]he advent of digital media is not simply destructive of the codex, but of the Gutenberg mind itself. There is one question alone that you must ask yourself in order to establish whether the serious novel will still retain cultural primacy and centrality in another 20 years. This is the question: if you accept that by then the vast majority of text will be read in digital form on devices linked to the web, do you also believe that those readers will voluntarily choose to disable that connectivity? If your answer to this is no, then the death of the novel is sealed out of your own mouth.
. . . I believe the serious novel will continue to be written and read, but it will be an art form on a par with easel painting or classical music: confined to a defined social and demographic group, requiring a degree of subsidy, a subject for historical scholarship rather than public discourse. . . . I’ve no intention of writing fictions in the form of tweets or text messages — nor do I see my future in computer-games design. My apprenticeship as a novelist has lasted a long time now, and I still cherish hopes of eventually qualifying. Besides, as the possessor of a Gutenberg mind, it is quite impossible for me to foretell what the new dominant narrative art form will be — if, that is, there is to be one at all.
Image: Painting by John White Alexander (1856–1915); photo by Andreas Praefcke (own work) [Public domain], via Wikimedia Commons
Here’s media studies scholar Siva Vaidhyanathan making the case for recognizing the reality of an academic/scholarly calling — in the authentic religious vocational sense — in the midst of a neoliberal age obsessed with the economic and political concerns of the so-called “real world”:
In the United States, and increasingly in the world at large, we tend to reduce the conversation about the value, role, and scope of the scholarly life to how it serves short-term and personal interests like career preparation or job training. Sometimes we discuss higher education as an economic boon, attracting industry to a particular location or employing thousands in a remote town. Or we probe it as an engine of research and innovation. And sometimes we use academia as a tableau for satire or social criticism when we expose the excesses of the lazy and self-indulgent professoriat or giggle at the paper titles at the annual meeting of the Modern Language Association.
But none of these appraisals of the life of the mind gets at the real heart of the matter: the now quaint-sounding matter of the university’s “mission” — the bigger-picture question of what our institutions of higher learning do for and with the world.
. . . Within every great American university, even MIT, there is a monastery. It’s at its core. Sometimes the campus walls and spires make that ancestry undeniable. More often, the stadiums, sweatshirt stores, laboratories, fraternity houses, and career-placement offices mask the monastery. But it’s still there. European universities emerged from the network of monasteries that had accumulated, preserved, copied, and catalogued texts and scrolls over centuries. The transformation from cloistered monastery to slightly less cloistered university occurred in fits and starts over three centuries. But by the eighteenth century, universities throughout Europe were able to converse about this new thing called science and reflect on the meaning and utility of ancient texts that bore new meaning at the dawn of an industrial age.
Early American colleges and universities were likewise religious institutions built to train clergy to serve a sinful people. Soon they took on an additional role: exposing idle sons of the landed gentry such as James Madison and Thomas Jefferson to dangerous books coming over from Europe.
. . . [But today] When we scholars explain our passions — the deep satisfaction we feel when we help a nineteen-year-old make a connection between the Mahabharata and The Iliad, or when our research challenges the surprising results of some medical experiment that the year before generated unwarranted headlines — many of our listeners roll their eyes like my fellow students did back in that classroom in 1995. How embarrassing that people find deep value in such uncountable things.
It’s been a couple of decades since any American faculty member could engage in the deep pursuit of knowledge untethered from the clock or calendar. But many of us still write for the guild and the guild only, satisfied that someday someone might find the work a valuable part of a body of knowledge. But if that never happens, so be it — it’s all part of the calling’s steep price of admission.
Image: “Medieval writing desk” [Public domain], via Wikimedia Commons
Last week I was led to quote one of Bradbury’s famous bits of life advice — of which there are many — to one of my students. It was his line about leaping off cliffs and then building your wings on the way down. Afterward, I got curious about the provenance of this quote, which sent me searching online for its source. Eventually I found an excellent 23-year-old interview with Bradbury in South Carolina’s Spartanburg Herald-Journal, which the paper picked up from the New York Times news service and which remains readable today thanks to Google’s news archive.
The title is “Learning is solitary pursuit for Bradbury.” The journalist is Luaine Lee. The date is October 17, 1990. And the interview shows Bradbury offering some lovely articulations of ideas, insights, and anecdotes (many of them familiar but all of them endlessly fascinating) from his personal mythic journey.