Beware the coming fusion of humans — you, me, all of us — with our smartphones and their array of apps for everything from finding directions to buying groceries to making ethical decisions. And make no mistake: this fusion is indeed coming. Or rather, it’s already here in nascent form. Just look around yourself and notice the people sitting and standing in any public place — stores, restaurants, movie theaters, sidewalks, streets, roads — and consulting their handheld digi-daemons about where to go and how to get there, what to do and when and how to do it, whom to call and what to say. In literally every aspect of life, we increasingly get by with more than just a little help from our handheld friends.
In his new book The Big Screen: The Story of the Movies, film historian David Thomson seriously poses the question of whether our collective and alienating addiction to the multitude of screens (televisions, phones, tablet computers, etc.) that increasingly keep us buffered from the existential reality of the world and people around us may not be directly traceable to the birth and epochal influence of the first and biggest screen of them all. He makes the point concisely in a recent, brief essay for The Independent, excerpted below.
This is powerful, thought-provoking, disturbing stuff. And note that the title of Thomson’s book in its American edition, as given above, has been altered to tone down that disturbing quality; in its British release, the book’s subtitle is much more descriptive of its ominous message: The Big Screen: The Story of the Movies and What They Did to Us.
Teaser: In his new book, David Thomson reveals how cinema has changed us all, and asks if being in thrall to the screen has detached us from reality
At first, the magic was overwhelming: in 1895, the first audiences for the Lumière brothers’ films feared that an approaching steam engine was going to come out of the screen and hit them. That gullibility passed off like morning mist, though observing the shower in Psycho (1960) we still seem to feel the impact of the knife. That scene is very frightening, but we know we’re not supposed to get up and rescue Janet Leigh. In a similar way, we can watch the surreal imagery of the devastation at Fukushima, or wherever, and whisper to ourselves that it’s terrible and tragic, but not happening to us. How large a step is it from that denial of our full selfhood to the wry passivity with which we observe global warming, economic collapse and a new freelance nuclear age as portents of an end to a world that is beyond us? Pioneers of film, such as D W Griffith, Chaplin and Abel Gance, hoped that the movie would make a single population in the world angry or moved enough to share liberty and opportunity and end war and intolerance. But perhaps it has made for a society of voyeurs who associate their own hiding in the dark with the safe futility of dealing with the screen’s frenzy.
… For decades, we told ourselves we were watching film and its illusion of reality. And so we treated movies as if they were theatre or novels given this extra investment and the kicker of sensation — of being there … They are all frenzies on the wall. What is most important is the fact of the screen as something that separates us from reality. All along, I think, we have been watching screens, and it is only recently, with the profusion of electronic screens, some so small that people aged over 25 can’t quite see them, that this has been appreciated … I fear film studies, film in academia and good criticism of the medium are all McGuffins compared with the dislocating stealth of the screen. People in the street nowadays bump into one another because they are intent on screens, which means they hardly notice the architecture, the acts of mayhem and indifference going on around them, or the weather. The medium that was alleged to bring all realities to our laps may have reduced us to laptops … I think, now, anything goes if it serves the screen and keeps us in alleged entertainment and information, as our true state moves ever further from being entertaining.
— David Thomson, “Cinema has changed us all: The birth of alienation,” The Independent, September 30, 2012
Morris Berman may not have been the first person to offer simultaneous commentary on American culture and Fahrenheit 451 by observing that the former has basically transformed itself into the dystopian society depicted by the latter. Many people have noted in the decades since Fahrenheit was first published in 1953 that things have been moving eerily and strikingly in the direction Bradbury foresaw (or rather, the direction he tried to forestall: “I wasn’t trying to predict the future,” he famously said in a 2003 interview; “I was trying to prevent it”). But it was Berman who most forcefully affected me with this line of thought when he laid it out in The Twilight of American Culture:
In 1953, Ray Bradbury published Fahrenheit 451 — later made into a movie by Francois Truffaut — which depicts a future society in which intelligence has largely collapsed and the reading of books is forbidden by law. People sit around interacting with screens (referred to as “the family”) and taking tranquilizers. Today, nearly five decades later, isn’t this largely the point at which we have arrived? Do not the data [on the collapse of American intelligence] suggest that most of our neighbors are, in fact, the mindless automatons depicted in Truffaut’s film? True, the story does contain a class of “book people” who hide in the forest and memorize the classics, to pass on to future generations — and this vignette does, in fact, provide a clue as to what just might enable our civilization to eventually recover — but the majority of citizens on the eve of the twenty-first century watch an average of four hours of TV a day, pop Prozac and its derivatives like candy, and perhaps read a Danielle Steel novel once a year
. . . [T]he society depicted in Fahrenheit 451 has banned books and immerses itself instead in video entertainment, a kind of “electronic Zen,” in which history has been forgotten and only the present moment counts . . . [The novel] is extraordinarily prescient. Leaving aside the issue of direct censorship of books — rendered unnecessary by McWorld, as it turns out, because most people don’t read anymore — most of the features of this futuristic society are virtually upon us, or perhaps no more than twenty years away. 
The Internet’s corrosive mental effects: A growing problem requiring a deliberate defensive response
For those of you who, like me, have been interested to hear the background drumbeat of warnings about the mental and neurological effects of the Internet revolution over the past several years — think Nicholas Carr’s “Is Google Making Us Stupid?” and The Shallows, just for starters — a recent, in-depth article about this very subject from Newsweek will make for compelling reading. It’s not exactly a pleasant read, though, because the conclusion it draws from mountains of evidence is deeply disturbing.
Here’s the gist:
Teaser: Tweets, texts, emails, posts. New research says the Internet can make us lonely and depressed — and may even create more extreme forms of mental illness, Tony Dokoupil reports.
Questions about the Internet’s deleterious effects on the mind are at least as old as hyperlinks. But even among Web skeptics, the idea that a new technology might influence how we think and feel — let alone contribute to a great American crack-up — was considered silly and naive, like waving a cane at electric light or blaming the television for kids these days. Instead, the Internet was seen as just another medium, a delivery system, not a diabolical machine. It made people happier and more productive. And where was the proof otherwise?
Throughout the 1990s the Clinton administration pushed hard for the universal integration of computers and information technology throughout America’s public education system, culminating in Bill Clinton’s official presidential call for “a computer in every classroom,” since, in his words, technology is “the great equalizer” for schools. No matter that this idea (and ideology) was basically made up, lacking any real supporting evidence. No matter that, as Todd Oppenheimer incisively argued in a now-classic 1997 Atlantic article titled “The Computer Delusion” (and later in his 2003 book-length expansion, The Flickering Mind: The False Promise of Technology in the Classroom and How Learning Can Be Saved), “There is no good evidence that most uses of computers significantly improve teaching and learning, yet school districts are cutting programs — music, art, physical education — that enrich children’s lives to make room for this dubious nostrum, and the Clinton Administration has embraced the goal of ‘computers in every classroom’ with credulous and costly enthusiasm.” The techno-utopian impulse for America’s schools proved to be unstoppable on a practical level, and schools en masse, from kindergarten to college, swallowed it hook, line, and sinker. The idea prevalent at administrative levels was and — as I can vouch from having spent the last decade-plus working in high school and college settings — still is that technology in and of itself is a Great Thing that will Revolutionize Learning. Even though many individual administrators and teachers are quite savvy about the nuances of the techno-utopian gospel, the overall institutional-cultural pressure is overwhelmingly in the direction of uncritical adoption.
The editors of the always-valuable n+1 have published a penetrating and damning assessment of what’s wrong with the craze for credentials that marks the American economic and educational landscape right now. It’s all the more valuable for putting the whole thing in long-historical perspective.
For the contemporary bachelor or master or doctor of this or that, as for the Ming-era scholar-bureaucrat or the medieval European guildsman, income and social position are acquired through affiliation with a cartel. Those who want to join have to pay to play, and many never recover from the entry fee.
…Over the last thirty years, the university has replaced the labor union as the most important institution, after the corporation, in American political and economic life. As union jobs have disappeared, participation in the labor force, the political system, and cultural affairs is increasingly regulated by professional guilds that require their members to spend the best years of life paying exorbitant tolls and kissing patrician rings. Whatever modest benefits accreditation offers in signaling attainment of skills, as a ranking mechanism it’s zero-sum: the result is to enrich the accreditors and to discredit those who lack equivalent credentials.
Jean Baudrillard once suggested an important correction to classical Marxism: exchange value is not, as Marx had it, a distortion of a commodity’s underlying use value; use value, instead, is a fiction created by exchange value. In the same way, systems of accreditation do not assess merit; merit is a fiction created by systems of accreditation. Like the market for skin care products, the market for credentials is inexhaustible: as the bachelor’s degree becomes democratized, the master’s degree becomes mandatory for advancement. Our elaborate, expensive system of higher education is first and foremost a system of stratification, and only secondly — and very dimly — a system for imparting knowledge.
— “Death by Degrees,” n+1, June 19, 2012