Category Archives: Science & Technology
The following paragraphs are from a talk delivered by Pinboard founder Maciej Cegłowski at the recent Emerging Technologies for the Enterprise conference in Philadelphia. Citing as Exhibit A the colossal train wreck that was the 2016 American presidential election, Cegłowski explains how, in the version of the Internet that has emerged over the past decade-plus, we have collectively created a technology perfectly calibrated to undermine Western democratic societies and ideals.
But as incisive as his analysis is, I seriously doubt that his (equally incisive) proposed solutions, described later in the piece, will ever be implemented to any meaningful extent. I mean, if we’re going to employ the explicitly Frankensteinian metaphor of “building a monster,” then it’s important to bear in mind that Victor Frankenstein and his wretched creation did not find their way to anything resembling a happy ending. (And note that Cegłowski himself acknowledges as much at the end of his piece when he closes his discussion of proposed solutions by asserting that “even though we’re likely to fail, all we can do is try.”)
This year especially there’s an uncomfortable feeling in the tech industry that we did something wrong, that in following our credo of “move fast and break things,” some of what we knocked down were the load-bearing walls of our democracy. . . .
A question few are asking is whether the tools of mass surveillance and social control we spent the last decade building could have had anything to do with the debacle of the 2017 [sic] election, or whether destroying local journalism and making national journalism so dependent on our platforms was, in retrospect, a good idea. . . .
We built the commercial internet by mastering techniques of persuasion and surveillance that we’ve extended to billions of people, including essentially the entire population of the Western democracies. But admitting that this tool of social control might be conducive to authoritarianism is not something we’re ready to face. After all, we’re good people. We like freedom. How could we have built tools that subvert it? . . .
The economic basis of the Internet is surveillance. Every interaction with a computing device leaves a data trail, and whole industries exist to consume this data. Unlike dystopian visions from the past, this surveillance is not just being conducted by governments or faceless corporations. Instead, it’s the work of a small number of sympathetic tech companies with likable founders, whose real dream is to build robots and Mars rockets and do cool things that make the world better. Surveillance just pays the bills. . . .
Orwell imagined a world in which the state could shamelessly rewrite the past. The Internet has taught us that people are happy to do this work themselves, provided they have their peer group with them, and a common enemy to unite against. They will happily construct alternative realities for themselves, and adjust them as necessary to fit the changing facts . . . .
A lot of what we call “disruption” in the tech industry has just been killing flawed but established institutions, and mining them for parts. When we do this, we make a dangerous assumption about our ability to undo our own bad decisions, or the time span required to build institutions that match the needs of new realities.
Right now, a small caste of programmers is in charge of the surveillance economy, and has broad latitude to change it. But this situation will not last for long. The kinds of black-box machine learning that have been so successful in the age of mass surveillance are going to become commoditized and will no longer require skilled artisans to deploy. . . .
Unless something happens to mobilize the tech workforce, or unless the advertising bubble finally bursts, we can expect the weird, topsy-turvy status quo of 2017 to solidify into the new reality.
Greetings, Teeming Brainers. I’m just peeking in from the digital wings, amid much ongoing blog silence, to observe that many of the issues and developments — sociocultural, technological, and more — that I began furiously tracking here way back in 2006 are continuing to head in pretty much the same direction. A case in point is provided by the alarming information, presented in a frankly alarmed tone, that appears in this new piece from Scientific American (originally published in SA’s German-language sister publication, Spektrum der Wissenschaft):
Everything started quite harmlessly. Search engines and recommendation platforms began to offer us personalised suggestions for products and services. This information is based on personal and meta-data that has been gathered from previous searches, purchases and mobility behaviour, as well as social interactions. While officially, the identity of the user is protected, it can, in practice, be inferred quite easily. Today, algorithms know pretty well what we do, what we think and how we feel — possibly even better than our friends and family or even ourselves. Often the recommendations we are offered fit so well that the resulting decisions feel as if they were our own, even though they are actually not our decisions. In fact, we are being remotely controlled ever more successfully in this manner. The more is known about us, the less likely our choices are to be free and not predetermined by others.
But it won’t stop there. Some software platforms are moving towards “persuasive computing.” In the future, using sophisticated manipulation technologies, these platforms will be able to steer us through entire courses of action, be it for the execution of complex work processes or to generate free content for Internet platforms, from which corporations earn billions. The trend goes from programming computers to programming people. . . .
[I]t can be said that we are now at a crossroads. Big data, artificial intelligence, cybernetics and behavioral economics are shaping our society — for better or worse. If such widespread technologies are not compatible with our society’s core values, sooner or later they will cause extensive damage. They could lead to an automated society with totalitarian features. In the worst case, a centralized artificial intelligence would control what we know, what we think and how we act. We are at the historic moment, where we have to decide on the right path — a path that allows us all to benefit from the digital revolution.
Oh, and for a concrete illustration of all the above, check this out:
How would behavioural and social control impact our lives? The concept of a Citizen Score, which is now being implemented in China, gives an idea. There, all citizens are rated on a one-dimensional ranking scale. Everything they do gives plus or minus points. This is not only aimed at mass surveillance. The score depends on an individual’s clicks on the Internet and their politically-correct conduct or not, and it determines their credit terms, their access to certain jobs, and travel visas. Therefore, the Citizen Score is about behavioural and social control. Even the behaviour of friends and acquaintances affects this score, i.e. the principle of clan liability is also applied: everyone becomes both a guardian of virtue and a kind of snooping informant, at the same time; unorthodox thinkers are isolated. Were similar principles to spread in democratic countries, it would be ultimately irrelevant whether it was the state or influential companies that set the rules. In both cases, the pillars of democracy would be directly threatened.
FULL ARTICLE: “Will Democracy Survive Big Data and Artificial Intelligence?”
Of course, none of this is real news to anybody who has been paying attention. It’s just something that people like me, and maybe like you, find troubling enough to highlight and comment on. And maybe, in the end, Cypher from The Matrix will turn out to have been right: Maybe ignorance really is bliss. Because from where I’m sitting, there doesn’t appear to be anything one can do to stop this steamrollering, metastasizing, runaway train-like dystopian trend. Talking about it is just that: talk. Which is one reason why I’ve lost a portion of the will that originally kept me blogging here for so many years. You can only play the role of Cassandra for so long before the intrinsic attraction begins to dissipate. Read the rest of this entry
Here’s a generous chunk of a really interesting and incisive blog post by author and Presbyterian pastor C. R. Wiley, who has been articulating interesting and incisive thoughts on religion, science, culture, Lovecraft, C. S. Lewis, and an associated network of ideas and writers for some time now:
For [my scientist friends] the imagination is just a tool for problem solving. It’s not a window to view the real world through; it’s more a technique for envisioning ways out of conceptual impasses.
They’re unable to get past the factness of things. Meaning eludes them. . . .
When I ask my scientific friends, “what does it say?” (referring to any work of art) they look at me blankly. They seem to be unable to move from facts to meanings. Worse, they reduce meaning to facts in some sense. There’s a savanna theory for instance, which asserts with darwinian certitude that the reason some landscapes seem beautiful to us is because our prehistoric ancestors found savannas conducive to survival. (Darwinians have the same answer for everything, what C. S. Lewis is said to have called, “nothing-butterism”, meaning, whatever you think is the case can be reduced to “nothing but” survival.)
Seeing that the scientific method is a fairly recent phenomenon and we’ve had interest in meaning of things from the very beginning of recorded history, what is going on here?
I can’t help but believe something has gone wrong, that in the interest of understanding the world we’ve lost the world. The world is reduced to cause and effect, but its meaning is something we can no longer see.
FULL TEXT: “Is the Scientific Method a Form of Mental Illness?”
You may recall Wiley as the impetus behind one of the more popular posts here at The Teeming Brain in the past few years, “C. S. Lewis and H. P. Lovecraft on loathing and longing for alien worlds.” He’s well worth following. (For a relevant case in point, see his March blog post “H. P. Lovecraft, Evangelist of the Sublime.”)
For more on the mental illness that is scientism and the threat it poses to authentic imagination, see the following:
- Scientism, the fantastic, and the nature of consciousness
- The bias of scientism and the reality of paranormal experience
- Reclaiming Art in the Age of Artifice: An Interview with J. F. Martel
During a week when mummies are on everybody’s mind because of that widely circulated news story about the mummified Buddhist monk found in a Buddha statue, it’s nice to see that Library Journal has posted a review of my recently published Mummies around the World, which contains a long entry titled “Buddhist self-mummification” that’s written by Ron Beckett and Jerry Conlogue, the scientists and mummy experts who used to host National Geographic’s Mummy Road Show.
LJ’s verdict, I’m pleased to say, is positive:
This truly rollicking blend of scientific and pop culture offers facts ranging from actual methods of mummification to an entry on the 1955 movie Abbott and Costello Meet the Mummy and includes trivia-fact boxes. VERDICT: An educational and entertaining compendium that is recommended for all “mummymaniacs” everywhere.
MORE: Review of Mummies around the World (scroll to the bottom of the page)
A Google Books preview of my mummy encyclopedia is now available. At least from my end — and I know these previews tend to shift and alter sometimes — it shows the full table of contents (two of them, actually, one alphabetical and the other topically organized), the full preface and introduction, portions of the master timeline of mummies throughout history, and a few snippets of the book’s A-Z entries. For those of you who have been following my updates about this project over the past couple of years, here’s a glimpse of the final result.
The book is scheduled for publication on November 30. You can order it from the publisher or from all of the usual retail suspects (Amazon, Barnes & Noble, etc.). You’ll probably also find it in a library near you. And remember, you can view a full list of the book’s contributors with brief bios here.
I had considered titling this post “Philosophy slams Neil deGrasse Tyson,” but then I reconsidered. In case you haven’t heard, Tyson recently outed himself as a philistine. Or at least that’s how author and journalist Damon Linker characterizes it in an article titled, appropriately enough, “Why Neil deGrasse Tyson Is a Philistine.” In the words of the article’s teaser, “The popular television host says he has no time for deep, philosophical questions. That’s a horrible message to send to young scientists.”
What Linker is referring to is Tyson’s recent appearance as a guest on the popular Nerdist podcast. Beginning at about 20 minutes into the hour-long program, the conversation between Tyson and his multiple interviewers turns to the subject of philosophy, and Tyson speaks up to talk down the entire field. In fact, he takes pains to specify and clarify that he personally has absolutely no use for philosophy, which he views as a worthless distraction from other activities with real value.
Yes, it all sounds like it must be overstated in the retelling — but in point of fact, it’s not. Have a listen for yourself by clicking the link above, or else read his words here in this transcript of the program’s relevant portion. The comments from Tyson and his interviewers come right after they have been discussing the standardization of weights and measures. Note especially how Tyson not only dismisses philosophy but pointedly refuses to allow that there might be even a shred of validity or value in it. Read the rest of this entry
The above image is a photo of a Strandbeest. What, you may ask, is that? Here’s how its creator, the Dutch artist Theo Jansen (who can be seen in the photo as well), explains the matter:
Since 1990 I have been occupied creating new forms of life. Not pollen or seeds but plastic yellow tubes are used as the basic material of this new nature. I make skeletons that are able to walk on the wind, so they don’t have to eat. Over time, these skeletons have become increasingly better at surviving the elements such as storms and water, and eventually I want to put these animals out in herds on the beaches, so they will live their own lives.
If you wonder what this actually entails and looks like in action, see the video below. Be advised that it will probably stand as the coolest and most mind-blowing thing you’ll see all week, month, or maybe year:
Last summer Jansen visited the Peabody Essex Museum in Salem, Massachusetts, in preparation for the first major American exhibition of his work, which will be presented at the PEM in 2015 and titled “The Dream of the Strandbeest.” My sister Dinah is a writer for PEM, and here’s how she described his visit:
Prior to meeting the man behind the Strandbeest, my introduction was the same as most — gazing at online videos of the enormous beach-combing beasts, while trying to teleport myself to that peaceful beach in the Netherlands. From the first moment I saw the lifelike creatures walking their four-legged dog pace, I wondered whether the God-like figure behind these post-apocalyptic-looking critters could likely change the world.
. . . In a roomful of PEM staff, Jansen shared how a Strandbeest works with pistons that act like muscles. Constructed of plastic tubes and recycled water bottles, the creature has a purpose beyond its more obvious one of being beautiful and mysterious. They are built to harness wind power and save eroding beaches. They detect atmospheric pressure and are designed to “pin themselves to the ground” to survive storms. Jansen spends his mornings coming up with difficult algorithms in the workshop, before biking 50 kilometers to the beach to try them out. By the end of the day, he said, the design works or it doesn’t. “The tubes point you in a certain way,” he says. “I’m surprised by how beautiful they are.”
. . . Jansen recently shared the genetic code of the Strandbeest on the web and is proud of the resulting designs in wood, out of Legos, in materials imagined by children and adults, so that the average person can be “infected” with the compulsion to create a Strandbeest. This is how they masterfully reproduce, he points out, adding that he eventually wants to put them out on the beach in herds, so that they can live on their own.
“Maybe it’s only a fairytale in my head . . . a surviving animal on the beach,” he said. “These are all designed for that. Maybe before I die, these animals will be there. This is my horizon, you could say.”
MORE: “Stunning Strandbeests”
Image by Roel via Flickr under Creative Commons
The mummified body of a Pre-dynastic Egyptian man known as Gebelein Man (formerly called Ginger) in the British Museum
Editing the mummy encyclopedia over the past year and a half has left me with a still-active internal radar that scans the media incessantly for mummy-related news, and a recent (May 20) piece in The Independent about a new exhibition at the British Museum titled “Ancient Lives, New Discoveries: Eight Mummies, Eight Stories” lit up the screen last week like an approaching aircraft carrier.
The teaser conveys the gist:
A blockbuster exhibition at the British Museum unwraps the mysteries of 5,000-year-old Egyptian mummies. Zoe Pilger is fascinated — but not sure they should be on show at all.
The article itself takes the form of an absorbing report in which Ms. Pilger, in addition to describing the exhibition’s content and execution in vivid detail, briefly summarizes the history of scientific mummy studies and the cultural phenomenon of “mummymania” that was ignited by the opening of Tutankhamun’s tomb in 1922.
She talks about mummy unrollings (the popular practice of unwrapping mummies as a public spectacle in the nineteenth and early twentieth centuries), the cultural myth of a “mummy curse,” the incorporation of the curse motif into Universal Studios’ classic 1932 film The Mummy, the use of imaging technology (specifically, CT scans) to conduct non-invasive examinations of mummies, the ancient naturally mummified Egyptian body known as Gebelein Man, Margaret Murray’s famous unwrapping of an Egyptian mummy in front of a crowd of 500 onlookers at the Manchester Museum, and the troubling ethical questions that surround the act of examining and displaying human remains like this at all.
Each of these issues is also talked about at length and in detail in the mummy encyclopedia, the last of them (the ethical conundrum) in two separate articles, one titled “Displaying Mummies,” by scientist and mummy researcher Heather Gill-Frerking, and the other titled “Collecting Mummies,” by literary scholar Richard Sugg (author of 2011’s Mummies, Cannibals and Vampires: The History of Corpse Medicine from the Renaissance to the Victorians).
I couldn’t help but be impressed at the way Ms. Pilger’s article, in the space of just 1300 words, offers an excellent primer on many crucial issues related to mummies and their scientific and cultural uses. I highly recommend reading the whole thing. Here’s a taste:
The first mummy on display is extraordinary. It seems charged with a supernatural energy and I half expected it to wake up. This young man was in his twenties or early thirties when he curled up into a foetal position and died. He was buried in a cemetery in Gebelein, Upper Egypt, and naturally mummified by the dry, hot sand. His remains are 5,000 years old but his presence is vivid. Most of his skin has been preserved: it covers his delicate bones. His feet are drawn up to his chest and his hands are cupped under his chin, as though pleading. He seems vulnerable, lit in a glass case in a dark room like a relic. He has been transformed into an object and put on display. To look at him provokes a primal feeling of horror. This is death made real.
. . . These bodies were not designed to be seen. There is a tyrannical tendency in Western culture to try to know everything — to decode, demystify, and disenchant even the most sacrosanct of secrets. A fascination with the “magic” of other cultures is coupled with a rationalist incredulity. We don’t believe, and yet we can’t stop investigating — historically, through violent methods. The curators of this exhibition seem aware of this danger. Rather than crude unwrappings as a form of public entertainment, these mummies are explored with scientific rigour and respect. Instead of revulsion, we are encouraged to feel a sense of shared humanity; they are dignified through the small detail of daily life — from the wigs they wore to the beer they drank. However, there is a feeling that they do not belong to us and should not be here.
Image: Photo of Gebelein Man by Jack1956 at the English language Wikipedia [GFDL (http://www.gnu.org/copyleft/fdl.html) or CC-BY-SA-3.0 (http://creativecommons.org/licenses/by-sa/3.0/)], via Wikimedia Commons
I’m pleased to announce that my mummy encyclopedia is now available for preorder from the publisher, and also from Amazon, Barnes & Noble, and elsewhere. The scheduled publication date is November 30.
From the official publisher’s description:
Perfect for school and public libraries, this is the only reference book to combine pop culture with science to uncover the mystery behind mummies and the mummification phenomena.
Mortality and death have always fascinated humankind. Civilizations from all over the world have practiced mummification as a means of preserving life after death — a ritual which captures the imagination of scientists, artists, and laypeople alike. This comprehensive encyclopedia focuses on all aspects of mummies: their ancient and modern history; their scientific study; their occurrence around the world; the religious and cultural beliefs surrounding them; and their roles in literary and cinematic entertainment.
Author and horror guru Matt Cardin brings together 130 original articles written by an international roster of leading scientists and scholars to examine the art, science, and religious rituals of mummification throughout history. Through a combination of factual articles and topical essays, this book reviews cultural beliefs about death; the afterlife; and the interment, entombment, and cremation of human corpses in places like Egypt, Europe, Asia, and Central and South America. Additionally, the book covers the phenomenon of natural mummification, where environmental conditions result in the spontaneous preservation of human and animal remains.
Here’s an excerpt (slightly condensed) from my introduction to the book: Read the rest of this entry
Here’s a double dose of dystopian cheer to accompany a warm and sunny Monday afternoon (or at least that’s the weather here in Central Texas).
First, Adam Kirsch, writing for The New Republic, in a piece dated May 2:
Everyone who ever swore to cling to typewriters, record players, and letters now uses word processors, iPods, and e-mail. There is no room for Bartlebys in the twenty-first century, and if a few still exist they are scorned. (Bartleby himself was scorned, which was the whole point of his preferring not to.) Extend this logic from physical technology to intellectual technology, and it seems almost like common sense to say that if we are not all digital humanists now, we will be in a few years. As the authors of Digital_Humanities write, with perfect confidence in the inexorability — and the desirability — of their goals, “the 8-page essay and the 25-page research paper will have to make room for the game design, the multi-player narrative, the video mash-up, the online exhibit and other new forms and formats as pedagogical exercises.”
. . . The best thing that the humanities could do at this moment, then, is not to embrace the momentum of the digital, the tech tsunami, but to resist it and to critique it. This is not Luddism; it is intellectual responsibility. Is it actually true that reading online is an adequate substitute for reading on paper? If not, perhaps we should not be concentrating on digitizing our books but on preserving and circulating them more effectively. Are images able to do the work of a complex discourse? If not, and reasoning is irreducibly linguistic, then it would be a grave mistake to move writing away from the center of a humanities education.
. . . The posture of skepticism is a wearisome one for the humanities, now perhaps more than ever, when technology is so confident and culture is so self-suspicious. It is no wonder that some humanists are tempted to throw off the traditional burden and infuse the humanities with the material resources and the militant confidence of the digital. The danger is that they will wake up one morning to find that they have sold their birthright for a mess of apps.
Second, Will Self, writing for The Guardian, in a piece also dated May 2:
The literary novel as an art work and a narrative art form central to our culture is indeed dying before our eyes. Let me refine my terms: I do not mean narrative prose fiction tout court is dying — the kidult boywizardsroman and the soft sadomasochistic porn fantasy are clearly in rude good health. And nor do I mean that serious novels will either cease to be written or read. But what is already no longer the case is the situation that obtained when I was a young man. In the early 1980s, and I would argue throughout the second half of the last century, the literary novel was perceived to be the prince of art forms, the cultural capstone and the apogee of creative endeavour. The capability words have when arranged sequentially to both mimic the free flow of human thought and investigate the physical expressions and interactions of thinking subjects; the way they may be shaped into a believable simulacrum of either the commonsensical world, or any number of invented ones; and the capability of the extended prose form itself, which, unlike any other art form, is able to enact self-analysis, to describe other aesthetic modes and even mimic them. All this led to a general acknowledgment: the novel was the true Wagnerian Gesamtkunstwerk.
. . . [T]he advent of digital media is not simply destructive of the codex, but of the Gutenberg mind itself. There is one question alone that you must ask yourself in order to establish whether the serious novel will still retain cultural primacy and centrality in another 20 years. This is the question: if you accept that by then the vast majority of text will be read in digital form on devices linked to the web, do you also believe that those readers will voluntarily choose to disable that connectivity? If your answer to this is no, then the death of the novel is sealed out of your own mouth.
. . . I believe the serious novel will continue to be written and read, but it will be an art form on a par with easel painting or classical music: confined to a defined social and demographic group, requiring a degree of subsidy, a subject for historical scholarship rather than public discourse. . . . I’ve no intention of writing fictions in the form of tweets or text messages — nor do I see my future in computer-games design. My apprenticeship as a novelist has lasted a long time now, and I still cherish hopes of eventually qualifying. Besides, as the possessor of a Gutenberg mind, it is quite impossible for me to foretell what the new dominant narrative art form will be — if, that is, there is to be one at all.