Category Archives: Internet & Media
At the beginning of each semester, when I deliver a mini-sermon about my complete ban on phones — and also, for almost all purposes, laptops — in my classroom, I tell my students the very thing that journalist Zat Rana gets at in a recent article for Quartz. A smartphone, or almost any cell phone, in your hand, on your desk, or even in your pocket as you’re trying to concentrate on other important things is a vampire demon powered by dystopian corporate overlords whose sole purpose is to suck your soul by siphoning away your attention and immersing you in a portable customized Matrix.
Or as Rana says, in less metaphorical language:
One of the biggest problems of our generation is that while the ability to manage our attention is becoming increasingly valuable, the world around us is being designed to steal away as much of it as possible. . . . Companies like Google and Facebook aren’t just creating products anymore. They’re building ecosystems. And the most effective way to monetize an ecosystem is to begin with engagement. It’s by designing their features to ensure that we give up as much of our attention as possible.
Rana offers three pieces of sound advice for helping to reclaim your attention (the asset referred to in his article’s title): mindfulness meditation, “ruthless single-tasking,” and regular periods of deliberate detachment from the digital world.
Interestingly, it looks like there’s a mini-wave of this type of awareness building in the mediasphere. Rana’s article for Quartz was published on October 2. Four days later The Guardian published a provocative and alarming piece with this teaser: “Google, Twitter and Facebook workers who helped make technology so addictive are disconnecting themselves from the internet. Paul Lewis reports on the Silicon Valley refuseniks alarmed by a race for human attention.” It’s a very long and in-depth article. Here’s a taste:
There is growing concern that as well as addicting users, technology is contributing toward so-called “continuous partial attention”, severely limiting people’s ability to focus, and possibly lowering IQ. One recent study showed that the mere presence of smartphones damages cognitive capacity — even when the device is turned off. “Everyone is distracted,” Rosenstein says. “All of the time.”
But those concerns are trivial compared with the devastating impact upon the political system that some of Rosenstein’s peers believe can be attributed to the rise of social media and the attention-based market that drives it. . . .
Tech companies can exploit such vulnerabilities to keep people hooked; manipulating, for example, when people receive “likes” for their posts, ensuring they arrive when an individual is likely to feel vulnerable, or in need of approval, or maybe just bored. And the very same techniques can be sold to the highest bidder. . . .
“The dynamics of the attention economy are structurally set up to undermine the human will,” [ex-Google strategist James Williams] says. “If politics is an expression of our human will, on individual and collective levels, then the attention economy is directly undermining the assumptions that democracy rests on.” If Apple, Facebook, Google, Twitter, Instagram and Snapchat are gradually chipping away at our ability to control our own minds, could there come a point, I ask, at which democracy no longer functions?
“Will we be able to recognise it, if and when it happens?” Williams replies. “And if we can’t, then how do we know it hasn’t happened already?”
In the same vein, Nicholas Carr (no stranger to The Teeming Brain’s pages) published a similarly aimed — and even a similarly titled — essay in the Weekend Review section of The Wall Street Journal on the very day the Guardian article appeared (October 6). “Research suggests that as the brain grows dependent on phone technology, the intellect weakens,” says the teaser. Here’s a representative passage from the essay itself:
Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object in the environment that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.” Media and communication devices, from telephones to TV sets, have always tapped into this instinct. Whether turned on or switched off, they promise an unending supply of information and experiences. By design, they grab and hold our attention in ways natural objects never could.
But even in the history of captivating media, the smartphone stands out. It’s an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what [Adrian] Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it’s part of the surroundings — which it always is. Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library, and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That’s what a smartphone represents to us. No wonder we can’t take our minds off it.
Full Text: “How Smartphones Hijack Our Minds”
At his blog Carr noted that his essay and the Guardian article had appeared on the same day. He also remarked on the similarity between their titles, calling it a “telling coincidence” and commenting:
It’s been clear for some time that smartphones and social-media apps are powerful distraction machines. They routinely divide our attention. But the “hijack” metaphor — I took it from Adrian Ward’s article “Supernormal” — implies a phenomenon greater and more insidious than distraction. To hijack something is to seize control of it from its rightful owner. What’s up for grabs is your mind.
Perhaps the most astonishing thing about all of this is that John Carpenter warned us about it three decades ago, and not vaguely, but quite specifically and pointedly. The only difference is that the technology in his (quasi-)fictional presentation was television. Well, that, plus the fact that his evil overlords really were ill-intentioned, whereas ours may be in some cases as much victims of their own devices as we are. In any event:
There is a signal broadcast every second of every day through our television sets. Even when the set is turned off, the brain receives the input. . . . Our impulses are being redirected. We are living in an artificially induced state of consciousness that resembles sleep. . . . We have been lulled into a trance.
From an essay by Ed Finn, founding director of the Center for Science and the Imagination at Arizona State University:
We are all centaurs now, our aesthetics continuously enhanced by computation. Every photograph I take on my smartphone is silently improved by algorithms the second after I take it. Every document autocorrected, every digital file optimised. Musicians complain about the death of competence in the wake of Auto-Tune, just as they did in the wake of the synthesiser in the 1970s. It is difficult to think of a medium where creative practice has not been thoroughly transformed by computation and an attendant series of optimisations. . . .
Today, we experience art in collaboration with these algorithms. How can we disentangle the book critic, say, from the highly personalised algorithms managing her notes, communications, browsing history and filtered feeds on Facebook and Instagram? . . . .
The immediate creative consequence of this sea change is that we are building more technical competence into our tools. It is getting harder to take a really terrible digital photograph, and in correlation the average quality of photographs is rising. From automated essay critiques to algorithms that advise people on fashion errors and coordinating outfits, computation is changing aesthetics. When every art has its Auto-Tune, how will we distinguish great beauty from an increasingly perfect average? . . .
We are starting to perceive the world through computational filters, perhaps even organising our lives around the perfect selfie or defining our aesthetic worth around the endorsements of computationally mediated ‘friends’. . . .
Human creativity has always been a response to the immense strangeness of reality, and now its subject has evolved, as reality becomes increasingly codeterminate, and intermingled, with computation. If that statement seems extreme, consider the extent to which our fundamental perceptions of reality — from research in the physical sciences to finance to the little screens we constantly interject between ourselves and the world — have changed what it means to live, to feel, to know. As creators and appreciators of the arts, we would do well to remember all the things that Google does not know.
FULL TEXT: “Art by Algorithm”
This remarkable animation comes from the hand (or computer) of illustrator and animator Steve Cutts, famed for such things as 2012’s Man, which packs an unbelievable punch. So does the one I’ve chosen to post here. Cutts created it for last year’s hit song “Are You Lost in the World Like Me?” by Moby and The Void Pacific Choir. But I personally like this slight repurposing much better, where the musical accompaniment is changed to French composer Yann Tiersen’s “Comptine d’un autre été, l’après-midi” (best known for being featured in the soundtrack for the 2001 French film Amélie).
The story told by the visuals, and also by the piercingly beautiful and sad musical accompaniment, can stand without comment here, as Teeming Brain readers are well aware of my deep disturbance and unhappiness at the digital dystopia that has emerged in the age of the smartphone. I consider Cutts something of a genius, both for his choice of animation style and for his devastating accuracy in calling out the dark and despairing heart of this cultural dead end in fairly visionary fashion. And no, the fact that his creation of this animation, and my sharing of it here, and your viewing of it are all facilitated by the existence of networked computers doesn’t invalidate the message with a fatal irony. We could probably do better, culturally and humanly speaking, in our uses of these technologies. But instead we’re apparently inclined to give way, en masse, to our lowest impulses, resulting in a kind of digital Dante’s Inferno whose factual reality isn’t really all that far from the only slightly exaggerated version presented by Cutts.
A grateful acknowledgment goes out to Jesús Olmo, who introduced me to Cutts by sending me a link to Man last month.
Here’s the ever-reliable Nick Ripatrazone discussing the inspirational influence of Marshall McLuhan on David Cronenberg as the latter was conceiving and making 1983’s Videodrome, which Ripatrazone characterizes — correctly, I think — as “perfect viewing for 2017 — the year a man baptized by television becomes president.” The article also provides an able introduction to McLuhan’s legacy, reputation, and influence.
In his audio commentary for the film, Cronenberg admits that the professor [Brian O’Blivion, who runs the sinister Videodrome broadcast and its attendant “mission” for homeless people, the Cathode Ray Mission] was inspired by the “communications guru” Marshall McLuhan. McLuhan taught at the University of Toronto while Cronenberg attended, but to his “everlasting regret,” he never took a course with the media icon. Cronenberg said that McLuhan’s “influence was felt everywhere at the university” — a mystical-tinged description that McLuhan would have appreciated. . . .
McLuhan was a scholar of James Joyce, a purveyor of print. He documented the advent of the electric eye, but he didn’t desire it. Although he had “nothing but distaste for the process of change,” he said you had to “keep cool during our descent into the maelstrom.” Max can’t keep cool. He is infected by Videodrome; the show’s reality subverts its unreal medium. Max discovers that Professor O’Blivion helped create Videodrome because “he saw it as the next phase in the evolution of man as a technological animal.” Sustained viewing of Videodrome creates tumors and hallucinations. Max is being played by the remaining originators of Videodrome, whose philosophy sounds downright familiar: “North America’s getting soft, and the rest of the world is getting tough. We’re entering savage new times, and we’re going to have to be pure and direct and strong if we’re going to survive them.” Videodrome is a way to identify the derelicts by giving them what they most crave — real violence — and then incapacitating them into submission.
McLuhan’s idea that “mental breakdown is the very common result of uprooting and inundation with new information,” and his simultaneous interest in, and skepticism of, the “electric eye” finds a gory literalism in Cronenberg’s film. Videodrome is what happens when a self-described existentialist atheist channels McLuhan — but makes McLuhan’s Catholic-infused media analysis more secular and raw. Cronenberg was able to foretell our electronic evolution, the quasi-Eucharistic way we “taste and see” the Internet. The film’s gore and gush might now strike us as campy, but Videodrome shows what happens when mind and device become one. “Death is not the end,” one character says, but “the beginning of the new flesh.” We’re already there.
Here are some choice passages from an insight-rich essay by historian James McWilliams at The American Scholar, in which he discusses two major and complementary options for dealing with digital technology’s epochal assault on the stable self: first, take serious and substantial steps to humanize the digital world; second, retain (or return to) a serious relationship with the physical book.
The underlying concern with the Internet is not whether it will fragment our attention spans or mold our minds to the bit-work of modernity. In the end, it will likely do both. The deeper question is what can be done when we realize that we want some control over the exchange between our brains and the Web, that we want to protect our deeper sense of self from digital media’s dominance over modern life. . . .
The essence of our dilemma, one that weighs especially heavily on Generation Xers and millennials, is that the digital world disarms our ability to oppose it while luring us with assurances of convenience. It’s critical not only that we identify this process but also that we fully understand how digital media co-opt our sense of self while inhibiting our ability to reclaim it. . . .
This is not to suggest that we should aim to abolish digital media or disconnect completely — not at all. Instead, we must learn to humanize digital life as actively as we’ve digitized human life.
No one solution can restore equity to the human-digital relationship. Still, whatever means we pursue must be readily available (and cheap) and offer the convenience of information, entertainment, and social engagement while promoting identity-building experiences that anchor the self in society. Plato might not have approved, but the tool that’s best suited to achieve these goals today is an object so simple that I can almost feel the eye-rolls coming in response to such a nostalgic fix for a modern dilemma: the book. Saving the self in the age of the selfie may require nothing more or less complicated than recovering the lost art of serious reading. . . .
[A]s the fog of digital life descends, making us increasingly stressed out and unempathetic, solipsistic yet globally connected, and seeking solutions in the crucible of our own angst, it’s worth reiterating what reading does for the searching self. A physical book, which liberates us from pop-up ads and the temptation to click into oblivion when the prose gets dull, represents everything that an identity requires to discover Heidegger’s nearness amid digital tyranny. It offers immersion into inner experience, engagement in impassioned discussion, humility within a larger community, and the affirmation of an ineluctable quest to experience the consciousness of fellow humans. In this way, books can save us.
Full text: “Saving the Self in the Age of the Selfie”
In the past I have both 1) praised Jeff Bezos for displaying what looks like a true love of books and reading, and 2) highlighted Amazon’s bullying and heavy-handedness in the publishing industry by linking to Steve Wasserman’s damning 2012 article “The Amazon Effect,” in which Wasserman, the former editor of the Los Angeles Times Book Review, explains how his early positive view of Bezos and Amazon soured over time as it became evident that the company is intent on “bulldozing any real or perceived obstacles to its single-minded pursuit of maximum clout” by imposing “increasingly harsh terms on both its competitors and its clients.”
Recently it’s looking like the scale has tipped definitively in favor of the negative judgment on both Bezos and his company. Or at least that’s my take, which is based on the fact, revealed just last week, that Amazon is now flat-out blackmailing publishers and authors into complying with its draconian demands by charging higher prices and delaying shipments for products from companies that resist. Other tactics are involved as well, such as removing entire promotion pages for some books. What’s more, Amazon isn’t afraid to play this kind of hardball with books by big-name authors. Titles by J. K. Rowling, Anne Rivers Siddons, and James Patterson are among those that have been affected.
Say The New York Times’ David Streitfeld and Melissa Eddy:
Amazon’s power over the publishing and bookselling industries is unrivaled in the modern era. Now it has started wielding its might in a more brazen way than ever before. Seeking ever-higher payments from publishers to bolster its anemic bottom line, Amazon is holding books and authors hostage on two continents by delaying shipments and raising prices. The literary community is fearful and outraged — and practically begging for government intervention. . . . No firm in American history has exerted the control over the American book market — physical, digital and secondhand — that Amazon does.
Now, I don’t know about you, but I’m personally fed up with this kind of crap, and this feeling applies to more than just the Amazon situation. Amazon is emblematic of a major cultural shift that has taken place in the Internet era as megacorporations representing various sectors of the business world and cultural life at large have attempted to hold us all hostage by playing an egregiously monopolistic game. And it all seems doubly sinister in a way that’s distinct from the monopolies of a past age, since this time the imperialistic and totalitarian business practices are hitched to, and also — or so the corporate titans hope — enabled and sweetened by, the digital-populist tone of “personal freedom and empowerment” that still attends the Internet like a lingering morning mist at midday.
This kind of thing makes me remember all over again why I ditched Facebook and Twitter. Among other reasons, I just got sick of being a willing pawn in the war of the Digital Overlords, where my personal data and decisions are used as leverage and ammunition. I’ve been thinking for many months that it may be time to ditch Amazon as well, and this recent revelation adds some serious weight to that consideration. This would of course mean going back and removing all of the Amazon affiliate links here at The Teeming Brain. I also own a Kindle and subscribe to Prime, so, you know, I’m pretty deeply entangled. And don’t think for a minute that I’m not aware of the tarry syrup of irony that automatically coats every word I type here, on a blog, using a computer that’s running a Windows operating system, thus reinforcing the basic thrust of the entire digital economy and cultural technopoly that I’m ostensibly criticizing.
I would be interested to hear anybody’s thoughts on this issue. Is Amazon really a tyrant? Would a personal boycott be advisable? Would it even be meaningful? More broadly, is the future just a giant playing field for megacorporations where the role of us peasants is simply to be trampled underfoot while saying thank you for it?
Image courtesy of mack2happy / FreeDigitalPhotos.net
This absorbing video condenses the message presented by philosopher Alain de Botton in his new book The News: A User’s Manual, whose basic thesis and purpose is described by the publisher as follows:
We are never really taught how to make sense of the torrent of news we face every day . . . but this has a huge impact on our sense of what matters and of how we should lead our lives. In his dazzling new book, de Botton takes twenty-five archetypal news stories — including an airplane crash, a murder, a celebrity interview and a political scandal — and submits them to unusually intense analysis with a view to helping us navigate our news-soaked age.
Here are the points made in the above video (and thus in de Botton’s book), as distilled by me:
From the late 1980s to the early 2000s, I kept a longhand journal. It was where I learned the sound of my own inner voice and the rhythm of my own thoughts, and where I gained a more conscious awareness and understanding of the ideas, subjects, emotions, and themes that are, through sheer force of gravitational passion, my given subject matter as a writer and human being.
This writing discipline, which was powered by a combination of conscious will and involuntary compulsion (so deeply intermixed that I could never fully figure out where the one left off and the other began), began to alter itself spontaneously with my plunge into Internet culture circa 1995. To condense a very long story to a single sentence, almost from the very minute I entered the Internet fray, my desire to write by hand began to dwindle until it almost disappeared — but it remains something that I deliberately return to from time to time for inner recalibration and recentering, and I invariably find it so full of beneficial, soul-healing effects that I wonder every time why I ever abandoned it to begin with.
Now comes digital culture commentator Tom Chatfield, writing in City Journal about information age anxiety and the danger that we will be utterly swallowed by the vortex of digital noise and distraction that we have created. And he talks cogently about this very issue: the relationship between, and in fact the conflict between, the clear-souled act of writing by hand and the swirl of digital noise and distraction that otherwise cocoons us:
I have noticed, for example, that I think and feel differently depending on whether my cell phone is switched on or off. The knowledge that I am potentially contactable subtly alters the texture of my time. According to a Pew Research Center survey, 67 percent of American adults have experienced “phantom” rings, thinking that their phones are vibrating or ringing when they aren’t. I now try to build some uncontactable time into each of my days — not because I fear technology but because feeling able to say no as well as yes helps me take ownership of my decisions. Without boundaries, without friction, value slips away.
I sometimes write in longhand simply to re-create some of this friction. When I write with a pen on paper, words flow with the sense that they exist just half a sentence ahead of the nib. The mechanical slowness of writing helps me feel words as objects as well as ideas, with a synesthetic pleasure in their arrival. Composing into a physical notebook helps writing and reverie mix, often unexpectedly: sentences and phrases arrive out of the blue. Pens and paper are themselves simply the technologies of another era. There’s no magic in them, no fetish to worship. It is the experiences they enable — not what they are in themselves — that I value, alongside the gifts of more recent innovations.
Yet I struggle to live up to my own plan. I check my e-mail too often. I ache for the tiny endorsement of a retweet. I panic at an hour’s loss of cell-phone reception. I entrust ever more of my life and library to third parties, from Amazon to Apple, whose “ecosystems” seem to absorb me.
Where is the still point of the turning world where I might stand, understand, and take back control?
— Tom Chatfield, “Anxious in the Information Age,” City Journal 23.3 (Summer 2013)
I can tell you that my own experience parallels that of Mr. Chatfield with uncanny precision. Perhaps yours does as well.
Relatedly, I encourage you to go and read Mitch Horowitz’s recent article about taking a “massive leap forward in your writing through one simple exercise.” And what is that exercise? It’s very simple, and also simply revolutionary, says Mitch:
First, identify a piece of critical writing that you admire — perhaps an essay, article or review — but above all, something that captures the vitality and discretion that you would like to bring to the page. Then, recopy it by hand.
In the action of copying the piece by hand — not typing on a computer or tablet — you will discover the innards and guts of what the writer is doing. Writing by hand, with pen and paper, compels you to become mentally and even physically involved in picking apart the work. You will gain a new perspective on how the writer says things, how he deploys evidence and examples, and how his sentences are designed to introduce details or withhold them for later.
— Mitch Horowitz, “How to Take a Massive Leap Forward in Your Writing through One Simple Exercise,” The Huffington Post, September 19, 2013
Mitch goes on to describe how his hand-copying of an article by Jack Curry in The New York Times “reinvigorated my own passion for writing — and led me to focus on metaphysical history, which resulted in my two recent books: Occult America (Bantam, 2009) and One Simple Idea: How Positive Thinking Reshaped Modern Life (Crown, Jan 2014).”
Again, my own experience parallels what’s described here, because I myself have gotten enormous authorial mileage from copying down by hand the work of other writers.
And now you’ll have to excuse me, because I’ve got to log off, pick up a pen, and spend some time blackening a few pages in the notebook (as in, a bound stack of real paper pages, not a petite laptop computer) that awaits my real-world attention. But before I do, if any of this speaks to you, then I suppose the upshot is obvious: go thou and do likewise.