At the beginning of each semester, when I deliver a mini-sermon about my complete ban on phones — and also, for almost all purposes, laptops — in my classroom, I tell my students the very thing that journalist Zat Rana gets at in a recent article for Quartz. A smartphone — or almost any cell phone — in your hand, on your desk, or even in your pocket as you're trying to concentrate on other important things is a vampire demon powered by dystopian corporate overlords whose sole purpose is to suck your soul by siphoning away your attention and immersing you in a portable, customized Matrix.
Or as Rana says, in less metaphorical language:
One of the biggest problems of our generation is that while the ability to manage our attention is becoming increasingly valuable, the world around us is being designed to steal away as much of it as possible. . . . Companies like Google and Facebook aren’t just creating products anymore. They’re building ecosystems. And the most effective way to monetize an ecosystem is to begin with engagement. It’s by designing their features to ensure that we give up as much of our attention as possible.
Rana offers three pieces of sound advice for helping to reclaim your attention (which is the asset referred to in the title): mindfulness meditation, “ruthless single-tasking,” and regular periods of deliberate detachment from the digital world.
Interestingly, it looks like there’s a mini-wave of this type of awareness building in the mediasphere. Rana’s article for Quartz was published on October 2. Four days later The Guardian published a provocative and alarming piece with this teaser: “Google, Twitter and Facebook workers who helped make technology so addictive are disconnecting themselves from the internet. Paul Lewis reports on the Silicon Valley refuseniks alarmed by a race for human attention.” It’s a very long and in-depth article. Here’s a taste:
There is growing concern that as well as addicting users, technology is contributing toward so-called “continuous partial attention”, severely limiting people’s ability to focus, and possibly lowering IQ. One recent study showed that the mere presence of smartphones damages cognitive capacity — even when the device is turned off. “Everyone is distracted,” Rosenstein says. “All of the time.”
But those concerns are trivial compared with the devastating impact upon the political system that some of Rosenstein’s peers believe can be attributed to the rise of social media and the attention-based market that drives it. . . .
Tech companies can exploit such vulnerabilities to keep people hooked; manipulating, for example, when people receive “likes” for their posts, ensuring they arrive when an individual is likely to feel vulnerable, or in need of approval, or maybe just bored. And the very same techniques can be sold to the highest bidder. . . .
“The dynamics of the attention economy are structurally set up to undermine the human will,” [ex-Google strategist James Williams] says. “If politics is an expression of our human will, on individual and collective levels, then the attention economy is directly undermining the assumptions that democracy rests on.” If Apple, Facebook, Google, Twitter, Instagram and Snapchat are gradually chipping away at our ability to control our own minds, could there come a point, I ask, at which democracy no longer functions?
“Will we be able to recognise it, if and when it happens?” Williams replies. “And if we can’t, then how do we know it hasn’t happened already?”
In the same vein, Nicholas Carr (no stranger to The Teeming Brain’s pages) published a similarly aimed — and even a similarly titled — essay in the Weekend Review section of The Wall Street Journal on the very day the Guardian article appeared (October 6). “Research suggests that as the brain grows dependent on phone technology, the intellect weakens,” says the teaser. Here’s a representative passage from the essay itself:
Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object in the environment that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.” Media and communication devices, from telephones to TV sets, have always tapped into this instinct. Whether turned on or switched off, they promise an unending supply of information and experiences. By design, they grab and hold our attention in ways natural objects never could.
But even in the history of captivating media, the smartphone stands out. It’s an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what [Adrian] Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it’s part of the surroundings — which it always is. Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library, and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That’s what a smartphone represents to us. No wonder we can’t take our minds off it.
Full Text: “How Smartphones Hijack Our Minds”
At his blog Carr noted the appearance of his essay and the Guardian article on the same day. He also noted the striking similarity between their titles, calling it a “telling coincidence” and commenting:
It’s been clear for some time that smartphones and social-media apps are powerful distraction machines. They routinely divide our attention. But the “hijack” metaphor — I took it from Adrian Ward’s article “Supernormal” — implies a phenomenon greater and more insidious than distraction. To hijack something is to seize control of it from its rightful owner. What’s up for grabs is your mind.
Perhaps the most astonishing thing about all of this is that John Carpenter warned us about it three decades ago, and not vaguely, but quite specifically and pointedly. The only difference is that the technology in his (quasi-)fictional presentation was television. Well, that, plus the fact that his evil overlords really were ill-intentioned, whereas ours may be in some cases as much victims of their own devices as we are. In any event:
There is a signal broadcast every second of every day through our television sets. Even when the set is turned off, the brain receives the input. . . . Our impulses are being redirected. We are living in an artificially induced state of consciousness that resembles sleep. . . . We have been lulled into a trance.
One of the most subtle and subversive pieces of social criticism in Fahrenheit 451 comes early in the book when Montag, a fireman (i.e., book burner) who eventually wakes up to a recognition of his society’s essential character as a fascist-totalitarian dark age, chats with a teenaged girl named Clarisse. Or rather, it’s she who chats with him. The dumbed-down denizens of Bradbury’s keenly envisioned future dystopia of ignorance, repression, distraction, and dissipation are more fond of television, music, games, sports, sedatives, and other amusements than they are of real human contact. So when Clarisse suddenly shows up, introduces herself, and begins talking to Montag on a succession of evenings as he walks home from work, he’s considerably discomfited. But he finds her intriguing, and eventually he comes to look forward to their talks, so that when she unexpectedly disappears — presumably having been taken away by the repressive central government (a suspicion that’s confirmed later in the novel) — he’s deeply disturbed by it.
At one point in their conversations, he asks her why she isn’t in school. Her response reflects a profound inversion and perversion of what it means to be “antisocial” as judged by the surrounding society:
“Oh, they don’t miss me,” she said. “I’m antisocial, they say. I don’t mix. It’s so strange. I’m very social indeed. It all depends on what you mean by social, doesn’t it? Social to me means talking to you about things like this.” She rattled some chestnuts that had fallen off the tree in the front yard. “Or talking about how strange the world is. Being with people is nice. But I don’t think it’s social to get a bunch of people together and then not let them talk, do you? An hour of TV class, an hour of basketball or baseball or running, another hour of transcription history or painting pictures, and more sports, but do you know, we never ask questions, or at least most don’t; they just run the answers at you, bing, bing, bing, and us sitting there for four more hours of film-teacher. That’s not social to me at all. It’s a lot of funnels and a lot of water poured down the spout and out the bottom, and them telling us it’s wine when it’s not. They run us so ragged by the end of the day we can’t do anything but go to bed or head for a Fun Park to bully people around … I guess I’m everything they say I am, all right. I haven’t any friends. That’s supposed to prove I’m abnormal.”
Although Bradbury’s critique in this passage is aimed largely at the public school system, his description of Clarisse’s ironic plight, in which her authentic human sociability earns her the label “antisocial” — a label that, as the book later shows, is tantamount to a criminal charge in this particular (semi-)fictional dystopia — has wider resonances in today’s world of cultural dominance by social media. In fact, we may be seeing a similar inversion and perversion of language and values play out right before our eyes at this very cultural moment.