Blog Archives

Your smartphone is built to hijack and harvest your mind

At the beginning of each semester, when I deliver a mini-sermon about my complete ban on phones — and also, for almost all purposes, laptops — in my classroom, I tell my students the very thing that journalist Zat Rana gets at in a recent article for Quartz. A smartphone, or almost any cell phone, in your hand, on your desk, or even in your pocket as you’re trying to concentrate on other important things is a vampire demon powered by dystopian corporate overlords whose sole purpose is to suck your soul by siphoning away your attention and immersing you in a portable, customized Matrix.

Or as Rana says, in less metaphorical language:

One of the biggest problems of our generation is that while the ability to manage our attention is becoming increasingly valuable, the world around us is being designed to steal away as much of it as possible. . . . Companies like Google and Facebook aren’t just creating products anymore. They’re building ecosystems. And the most effective way to monetize an ecosystem is to begin with engagement. It’s by designing their features to ensure that we give up as much of our attention as possible.

Full Text: “Technology is destroying the most important asset in your life”

Rana offers three pieces of sound advice for helping to reclaim your attention (which is the asset referred to in the title): mindfulness meditation, “ruthless single-tasking,” and regular periods of deliberate detachment from the digital world.

Interestingly, it looks like there’s a mini-wave of this type of awareness building in the mediasphere. Rana’s article for Quartz was published on October 2. Four days later The Guardian published a provocative and alarming piece with this teaser: “Google, Twitter and Facebook workers who helped make technology so addictive are disconnecting themselves from the internet. Paul Lewis reports on the Silicon Valley refuseniks alarmed by a race for human attention.” It’s a very long and in-depth article. Here’s a taste:

There is growing concern that as well as addicting users, technology is contributing toward so-called “continuous partial attention”, severely limiting people’s ability to focus, and possibly lowering IQ. One recent study showed that the mere presence of smartphones damages cognitive capacity — even when the device is turned off. “Everyone is distracted,” [Justin] Rosenstein says. “All of the time.”

But those concerns are trivial compared with the devastating impact upon the political system that some of Rosenstein’s peers believe can be attributed to the rise of social media and the attention-based market that drives it. . . .

Tech companies can exploit such vulnerabilities to keep people hooked; manipulating, for example, when people receive “likes” for their posts, ensuring they arrive when an individual is likely to feel vulnerable, or in need of approval, or maybe just bored. And the very same techniques can be sold to the highest bidder. . . .

“The dynamics of the attention economy are structurally set up to undermine the human will,” [ex-Google strategist James Williams] says. “If politics is an expression of our human will, on individual and collective levels, then the attention economy is directly undermining the assumptions that democracy rests on.” If Apple, Facebook, Google, Twitter, Instagram and Snapchat are gradually chipping away at our ability to control our own minds, could there come a point, I ask, at which democracy no longer functions?

“Will we be able to recognise it, if and when it happens?” Williams replies. “And if we can’t, then how do we know it hasn’t happened already?”

Full Text: “‘Our minds can be hijacked’: The tech insiders who fear a smartphone dystopia”

In the same vein, Nicholas Carr (no stranger to The Teeming Brain’s pages) published a similarly aimed — and even a similarly titled — essay in the Weekend Review section of The Wall Street Journal on the very day the Guardian article appeared (October 6). “Research suggests that as the brain grows dependent on phone technology, the intellect weakens,” says the teaser. Here’s a representative passage from the essay itself:

Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object in the environment that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.” Media and communication devices, from telephones to TV sets, have always tapped into this instinct. Whether turned on or switched off, they promise an unending supply of information and experiences. By design, they grab and hold our attention in ways natural objects never could.

But even in the history of captivating media, the smartphone stands out. It’s an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what [Adrian] Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it’s part of the surroundings — which it always is. Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library, and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That’s what a smartphone represents to us. No wonder we can’t take our minds off it.

Full Text: “How Smartphones Hijack Our Minds”

At his blog, Carr noted that his essay and the Guardian article appeared on the same day with strikingly similar titles, calling it a “telling coincidence” and commenting:

It’s been clear for some time that smartphones and social-media apps are powerful distraction machines. They routinely divide our attention. But the “hijack” metaphor — I took it from Adrian Ward’s article “Supernormal” — implies a phenomenon greater and more insidious than distraction. To hijack something is to seize control of it from its rightful owner. What’s up for grabs is your mind.

Perhaps the most astonishing thing about all of this is that John Carpenter warned us about it three decades ago in They Live, and not vaguely, but quite specifically and pointedly. The only difference is that the technology in his (quasi-)fictional presentation was television. Well, that, plus the fact that his evil overlords really were ill-intentioned, whereas ours may be in some cases as much victims of their own devices as we are. In any event:

There is a signal broadcast every second of every day through our television sets. Even when the set is turned off, the brain receives the input. . . . Our impulses are being redirected. We are living in an artificially induced state of consciousness that resembles sleep. . . . We have been lulled into a trance.

Teeming Links – July 11, 2014


Apologies for the dearth of posts during the past week. I have reached crunch time on both the mummy encyclopedia and the paranormal encyclopedia, and that, combined with the fact that just this week I started a new day job at a new (to me) college, means my time will be limited in the near future. That said, weekly Teeming Links will continue appearing every Friday. I also have a number of great features lined up for publication, including a very long interview with psychedelic research pioneer James Fadiman (finished and currently in the editing and formatting stage) and the third installment of Dominik Irtenkauf’s “Sounds of Apocalypse” series.

 


 

Niall Ferguson wonders whether the powers that be will transform the supposed “libertarian utopia” of the Internet into a totalitarian dystopia worthy of Fritz Lang’s Metropolis: “[T]he suspicion cannot be dismissed that, despite all the hype of the Information Age and all the brouhaha about Messrs. Snowden and Assange, the old hierarchies and new networks are in the process of reaching a quiet accommodation with one another, much as thrones and telephones did a century ago.”

Writer and former Omni editor-in-chief Keith Ferrell describes what he has learned from an experiment in living like an 11th-century farmer, or rather, like a post-apocalyptic survivor: “Our modern era’s dependence upon technology and, especially, chemical and motorised technology, has divorced most of us from soil and seeds and fundamental skills. . . . Planning and long-practised rhythms were at the core of the 11th-century farmer’s life; improvisation, much of it desperate, would be the heart of the post-apocalyptic farmer’s existence.”

In a world where the dominating goals of tech development are mobility and sociality, Nicholas Carr wonders what kinds of alternative technologies and devices we might have if the guiding values were instead the stationary and the solitary. (Personally, I can think of one such technology, though not an electronic one: the paper book.)

Speaking of which, Andrew Erdmann uses the vehicle of Hal Ashby’s classic 1979 film Being There to reflect on our collective descent into aliteracy and electronically induced infantile idiocy: “I consider myself fortunate that I experienced reading and thinking before the Internet, and the written word before PowerPoint. I like to think that these experiences afford me some self-defense despite my own use of the Blackberry and other technologies.”

Roberto Bolaño says books are the only homeland for the true writer.

Javier Marías says the only real reason to write a novel is because this “allows the novelist to spend much of his time in a fictional world, which is really the only or at least the most bearable place to be.”

The Vatican has formally recognized the International Association of Exorcists and approved their statutes.

In response to the above, Chris French, the prominent skeptic and specialist in the psychology of paranormal beliefs and experiences, argues in The Guardian that possession is better understood in psychological rather than supernatural terms. (Chris, btw, is writing the entry on anomalistic psychology for my paranormal encyclopedia.)

BBC journalist David Robson offers a firsthand, participatory account of how scientists are using hypnosis to simulate possession and understand why some people believe they’re inhabited by paranormal beings.

Over at Boing Boing, Don Jolly profiles Shannon Taggart, photographer of séances, spirits, and ectoplasm: “Taggart is not a ‘believer,’ in the traditional sense, nor does she seem to debunk her subject. Rather, she presents a world where belief and unbelief are radically mediated by technology — and raises the possibility that in the age of omnipresent electronic image what is ‘true’ may be a much harder debate than the skeptics suppose.” (Shannon, btw, is writing the entries on thoughtography and Kirlian photography for my paranormal encyclopedia.)

Philosopher Bernardo Kastrup absolutely nails, in his typically lucid fashion, the reason why scientific materialism is baloney:

It’s a philosophical and not a logical interpretation of science. Science itself is just a study of the patterns and the regularities that we can observe in reality. It doesn’t carry with it an interpretation. . . . Scientific materialism is when you load the scientific observations of the regularities of nature with an ontological interpretation and you say, “What you’re observing here is matter outside of mind that has an existence that would still go on even if nobody were looking at it.” That is already an interpretation. It’s not really pure science anymore, and the essence of scientific materialism is [the idea] that the real world is outside of mind, it’s independent of mind, and particular arrangements of elements in that real world, namely, subatomic particles, generate mind, generate subjective experience. Now of course the only carrier of reality anyone can know is subjective experience. So materialism is a kind of projection, an abstraction and then a projection onto the world of something that is fundamentally beyond knowledge.

Awesomeness alert: Guillermo del Toro hints — nay, states — that there is still life in his At the Mountains of Madness dream project.

Journalist and novelist Joseph L. Flatley offers an engaging exploration of the real-life occult influence of Lovecraft’s fictional Necronomicon (with much info about, e.g., the origin of the Simonomicon and the theories of Donald Tyson).

 

“Fire Head” image courtesy of Salvatore Vuono / FreeDigitalPhotos.net

Recommended Reading 36

This week: How entire U.S. towns now rely on food stamps. The regrets of the Iraqi “sledgehammer man,” whose image became famous in Western media when Saddam’s statue fell. The Obama administration’s epic (and hypocritical) focus on secrecy. The demise of Google Reader and what it portends for Net-i-fied life and culture. The sinister rise of an all-pervasive — and unblinkingly embraced — Orwellian Big Brotherism in the age of Big Data, with a focus on Facebook’s “Like” button, Google Glass, and Google’s vision of “a future of frictionless, continuous shopping.” A surge of ghost sightings and spiritual troubles among survivors of Japan’s earthquake and tsunami. The rise of the “Little Free Libraries” movement in America and abroad.

Is the “brain as computer” metaphor dying?


Yesterday, Edge.org published a long, substantive conversation with Daniel C. Dennett — he of Consciousness Explained and Darwin’s Dangerous Idea fame — that shows the renowned philosopher of mind and consciousness saying some things about the now-ubiquitous model and metaphor of the brain as a kind of computing machine that cast a whole new and, as it happens, doubtful light on the matter:

The vision of the brain as a computer, which I still champion, is changing so fast. The brain’s a computer, but it’s so different from any computer that you’re used to. It’s not like your desktop or your laptop at all, and it’s not like your iPhone except in some ways. It’s a much more interesting phenomenon … We’re getting away from the rigidity of that model, which was worth trying for all it was worth. You go for the low-hanging fruit first. First, you try to make minds as simple as possible. You make them as much like digital computers, as much like von Neumann machines, as possible. It doesn’t work.

— “The Normal Well-Tempered Mind: A Conversation with Daniel C. Dennett,” Edge, January 8, 2013

This is embedded in a much longer series of reflections and analyses on the current state of research into mind, brain, and consciousness, but Nicholas Carr — he of “Is Google Making Us Stupid?” and The Shallows: What the Internet Is Doing to Our Brains fame — culls out the above-quoted portions and holds them up for closer inspection at his blog, and what he finds is that the unspoken subtext shows the entire edifice of the brain-as-computer metaphor crumbling (or, as he puts it, melting):

As someone who has a deep distrust of the popular metaphor that portrays the brain as a computer, I was struck by [Dennett’s words] … Normally, the explanatory power of a metaphor comes from describing a thing we don’t understand in terms of a thing we do understand. But this brain-as-computer metaphor now seems to be diverging from that model. The computer in the metaphor seems to be something very different from what we mean when we talk about a “computer.” The part of the metaphor that is supposed to be concrete has turned into a mystery fluid.

— Nicholas Carr, “Do I smell a metaphor melting?” Rough Type, January 8, 2013

Carr envisions a brief and semi-satirical dialogue that brings out the point:

The brain is like a computer!

Cool. What kind of computer is the brain like?

It’s not actually like any computer that’s ever been invented.

So what kind of computer is it like?

It’s like the unique form of a computer that we call a brain.

So the brain is like a brain?

Yes, exactly.

It sounds like it’s time for a new metaphor.

He closes by pointing out, evocatively, that “Our understanding of complex, mysterious things always proceeds from metaphor to metaphor. The moment a metaphor changes is an exciting moment because it opens new perspectives that the old metaphor foreclosed” (emphasis added).

The takeaway would seem to be a combined message of “stay tuned” and “brace yourself,” since the death or substantial mutation or revision of the metaphor in question would constitute an epochal shift in the way we’ve all been conditioned to think about our minds and selves on a very deep, very unconscious, very reflexive level for a couple of generations. And if a culture-wide opening of those “new perspectives that the old metaphor foreclosed” should happen to link up with the resurgent consciousness revolution currently taking place in the realms of religion, spirituality, parapsychology, art, music, literature, and psychedelics research, then watch out.

Image courtesy of Victor Habbick / FreeDigitalPhotos.net

Silence, solitude, and self-discovery in an age of mass distraction

“[T]he internet seizes our attention only to scatter it. We are immersed because there’s a constant barrage of stimuli coming at us and we seem to be very much seduced by that kind of constantly changing patterns of visual and auditorial stimuli. When we become immersed in our gadgets, we are immersed in a series of distractions rather than a sustained, focused type of thinking … There are messages coming at us through email, instant messenger, SMS, tweets etc. We are distracted by everything on the page, the various windows, the many applications running. You have to see the entire picture of how we are being stimulated. If you compare that to the placidity of a printed page, it doesn’t take long to notice that the experience of taking information from a printed page is not only different but almost the opposite from taking in information from a network-connected screen. With a page, you are shielded from distraction. We underestimate how the page encourages focused thinking — which I don’t think is normal for human beings — whereas the screen indulges our desire to be constantly distracted.”

— “Information and Contemplative Thought: We Turn Ourselves into Media Creations,” Interview with Nicholas Carr, The European, January 31, 2012

“Has it really come to this? In barely one generation we’ve moved from exulting in the time-saving devices that have so expanded our lives to trying to get away from them — often in order to make more time. The more ways we have to connect, the more many of us seem desperate to unplug. Like teenagers, we appear to have gone from knowing nothing about the world to knowing too much all but overnight. Internet rescue camps in South Korea and China try to save kids addicted to the screen. Writer friends of mine pay good money to get the Freedom software that enables them to disable (for up to eight hours) the very Internet connections that seemed so emancipating not long ago. Even Intel (of all companies) experimented in 2007 with conferring four uninterrupted hours of quiet time every Tuesday morning on 300 engineers and managers … [T]he average American spends at least eight and a half hours a day in front of a screen … The average American teenager sends or receives 75 text messages a day … We have more and more ways to communicate, as Thoreau noted, but less and less to say … The central paradox of the machines that have made our lives so much brighter, quicker, longer and healthier is that they cannot teach us how to make the best use of them; the information revolution came without an instruction manual.”

— Pico Iyer, “The Joy of Quiet,” The New York Times, December 29, 2011

“I am encouraged by services such as Instapaper, Readability or Freedom — applications that are designed to make us more attentive when using the internet. It is a good sign because it shows that some people are concerned about this and sense that they are no longer in control of their attention. Of course there’s an irony in looking for solutions in the same technology that keeps us distracted.”

— Carr, “Information and Contemplative Thought”

The Internet’s corrosive mental effects: A growing problem requiring a deliberate defensive response

For those of you who, like me, have been interested to hear the background drumbeat of warnings about the mental and neurological effects of the Internet revolution over the past several years — think Nicholas Carr’s “Is Google Making Us Stupid?” and The Shallows, just for starters — a recent, in-depth article about this very subject from Newsweek will make for compelling reading. It’s not exactly a pleasant read, though, because the conclusion it draws from mountains of evidence is deeply disturbing.

Here’s the gist:

Teaser: Tweets, texts, emails, posts. New research says the Internet can make us lonely and depressed — and may even create more extreme forms of mental illness, Tony Dokoupil reports.

Questions about the Internet’s deleterious effects on the mind are at least as old as hyperlinks. But even among Web skeptics, the idea that a new technology might influence how we think and feel — let alone contribute to a great American crack-up — was considered silly and naive, like waving a cane at electric light or blaming the television for kids these days. Instead, the Internet was seen as just another medium, a delivery system, not a diabolical machine. It made people happier and more productive. And where was the proof otherwise?


The Internet is melting our brains

The current issue of the Atlantic Monthly (July/August) has an interesting cover story by Nicholas Carr — “Is Google Making Us Stupid? What the Internet Is Doing to Our Brains” — about the effects of the Internet revolution on human cognition. I bought the issue at the airport last weekend while waiting for my flight to Mo*Con III and found it to be quite a worthy read, especially since the author’s description of some of the changes he has noticed in his own mental life under the spell of perpetual Internet usage parallels certain effects that I’ve been noticing in myself for the past several years.

He writes:

Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn’t going — so far as I can tell — but it’s changing. I’m not thinking the way I used to think. I can feel it most strongly when I’m reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.

I think I know what’s going on. For more than a decade now, I’ve been spending a lot of time online, searching and surfing and sometimes adding to the great databases of the Internet. The Web has been a godsend to me as a writer. Research that once required days in the stacks or periodical rooms of libraries can now be done in minutes. A few Google searches, some quick clicks on hyperlinks, and I’ve got the telltale fact or pithy quote I was after. Even when I’m not working, I’m as likely as not to be foraging in the Web’s info-thickets — reading and writing e-mails, scanning headlines and blog posts, watching videos and listening to podcasts, or just tripping from link to link to link. (Unlike footnotes, to which they’re sometimes likened, hyperlinks don’t merely point to related works; they propel you toward them.)

For me, as for others, the Net is becoming a universal medium, the conduit for most of the information that flows through my eyes and ears and into my mind. The advantages of having immediate access to such an incredibly rich store of information are many, and they’ve been widely described and duly applauded. “The perfect recall of silicon memory,” Wired’s Clive Thompson has written, “can be an enormous boon to thinking.” But that boon comes at a price. As the media theorist Marshall McLuhan pointed out in the 1960s, media are not just passive channels of information. They supply the stuff of thought, but they also shape the process of thought. And what the Net seems to be doing is chipping away my capacity for concentration and contemplation. My mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.

Carr goes on to offer a concise and fascinating history of the effects of new communications technologies on human cultures and societies, going all the way back to Plato’s low view of the invention of writing itself (Plato feared that dependence on the written word would siphon away people’s mental abilities), then forward to the invention of the printing press and to Nietzsche’s admission that acquiring a typewriter had changed the character of his writing. Carr finishes by advising that we should be skeptical of his very skepticism about the Internet, since all revolutions in communication technologies have been met with similar Luddite-esque condemnations. That said, he still holds out the possibility that he’s right, and that something valuable, namely, our ability and even our desire to think and reflect deeply and to have our selves and societies formed and informed by this mental and moral depth, is currently under assault and in danger of being lost:

Perhaps those who dismiss critics of the Internet as Luddites or nostalgists will be proved correct, and from our hyperactive, data-stoked minds will spring a golden age of intellectual discovery and universal wisdom. Then again, the Net isn’t the alphabet, and although it may replace the printing press, it produces something altogether different. The kind of deep reading that a sequence of printed pages promotes is valuable not just for the knowledge we acquire from the author’s words but for the intellectual vibrations those words set off within our own minds. In the quiet spaces opened up by the sustained, undistracted reading of a book, or by any other act of contemplation, for that matter, we make our own associations, draw our own inferences and analogies, foster our own ideas. Deep reading, as Maryanne Wolf argues, is indistinguishable from deep thinking. If we lose those quiet spaces, or fill them up with “content,” we will sacrifice something important not only in our selves but in our culture.

Any longtime reader of The Teeming Brain will know that I exult in finding such thoughts and feelings expressed so well, and also in expressing them myself. From Neil Postman’s Amusing Ourselves to Death and Technopoly: The Surrender of Culture to Technology, to Theodore Roszak’s Where the Wasteland Ends, to Daniel Boorstin’s The Image: A Guide to Pseudo-Events in America, to dystopian fictional visions like Ray Bradbury’s Fahrenheit 451, Aldous Huxley’s Brave New World, and, most recently, Paolo Bacigalupi’s Pump Six and Other Stories (which I reviewed glowingly in the new issue of Dead Reckonings), I am fascinated by the exploration of the changes that modern mass and digital communications technologies are wreaking upon our civilizations and cultures.

Like Carr, my fascination has a personal aspect. Ever since I was an undergraduate student majoring in communication and minoring in philosophy at the University of Missouri, where I was introduced to culture and media criticism and the high intellectual tradition of the West (and also the East), I have been obsessed with understanding the personal effects of the technology and mass media cocoon into which I was born, and which has grown astonishingly more comprehensive and complex in just my lifetime, which hasn’t yet reached its 40th year. Most recently I have noticed that my entry into the Internet world, which occurred definitively in 1996, has produced a progressive change in my attention span and concentrative abilities exactly like the one Carr describes.

Lately I have been taking steps to remedy that. I have reduced my online time (although not my total computer usage time), and I have deliberately sought out a few long works of fiction to read. Interestingly, my ability to read long-form works has been affected almost exclusively in the realm of fiction; I can still read nonfiction just fine. But I have noticed a growing impatience with long fictional works over the years that is attributable, when I reflect on it and trace it, to the very phenomenon Carr describes. I’m pleased to report that I am now in the process of successfully rehabilitating myself.

Lest anybody think these fears are new, I’ll give the last word to Bradbury himself. About a year ago (May 30, 2007), L.A. Weekly published a fine article about him titled “Ray Bradbury: Fahrenheit 451 Misinterpreted,” featuring a present-day Bradbury arguing that his most famous novel is not really about censorship, as the general public and the literary establishment have long assumed, but about the insidious and pernicious effects of television on society. Bradbury is convinced — and so am I — that present-day trends in television and American society confirm the book’s warning.

The author of the article wisely delved into Bradbury’s history and discovered a letter that Bradbury wrote in 1951 to Richard Matheson covering the same territory. The words of the then-thirty-something Bradbury about the effects of radio on people’s ability to think, concentrate, and read serve as a fascinating touchstone for Carr’s Atlantic article, written 57 years later, about the Internet’s effects on those same activities:

As early as 1951, Bradbury presaged his fears about TV, in a letter about the dangers of radio, written to fantasy and science-fiction writer Richard Matheson. Bradbury wrote that “Radio has contributed to our ‘growing lack of attention.’… This sort of hopscotching existence makes it almost impossible for people, myself included, to sit down and get into a novel again. We have become a short story reading people, or, worse than that, a QUICK reading people.”

If Bradbury was right then, and if Carr is right now, then we have been living through the intellectual fall of our civilization for more than half a century, and have been dressing it up and passing it off to ourselves en masse as a wonderful, liberating cultural advance. This bears much reflection and meditation — if, that is, we’re still able to do it.