would doom the quality writers to “the danger of general oblivion” and produce “a return to barbarism.”26 Thankfully, he was wrong. Scholars quickly set about organizing the new mental environment by clipping their favorite passages from books and assembling them into huge tomes—florilegia, bouquets of text—so that readers could sample the best parts. They were basically blogging, going through some of the same arguments modern bloggers go through. (Is it enough to clip a passage, or do you also have to verify that what the author wrote was true? It was debated back then, as it is today.) The past turns out to be oddly reassuring, because a pattern emerges. Each time we’re faced with bewildering new thinking tools, we panic—then quickly set about deducing how they can be used to help us work, meditate, and create.
History also shows that we generally improve and refine our tools to make them better. Books, for example, weren’t always as well designed as they are now. In fact, the earliest ones were, by modern standards, practically unusable—often devoid of the navigational aids we now take for granted, such as indexes, paragraph breaks, or page numbers. It took decades—centuries, even—for the book to be redesigned into a more flexible cognitive tool, as suitable for quick reference as it is for deep reading. This is the same path we’ll need to tread with our digital tools. It’s why we need to understand not just the new abilities our tools give us today, but where they’re still deficient and how they ought to improve.
I have one caveat to offer. If you were hoping to read about the neuroscience of our brains and how technology is “rewiring” them, this volume will disappoint you.
This goes against the grain of modern discourse, I realize. In recent years, people interested in how we think have become obsessed with our brain chemistry. We’ve marveled at the ability of brain scanning—picturing our brain’s electrical activity or blood flow—to provide new clues as to what parts of the brain are linked to our behaviors. Some people panic that our brains are being deformed on a physiological level by today’s technology: spend too much time flipping between windows and skimming text instead of reading a book, or interrupting your conversations to read text messages, and pretty soon you won’t be able to concentrate on anything—and if you can’t concentrate on it, you can’t understand it either. In his book The Shallows, Nicholas Carr eloquently raised this alarm, arguing that the quality of our thought, as a species, rose in tandem with the ascendance of slow-moving, linear print and began declining with the arrival of the zingy, flighty Internet. “I’m not thinking the way I used to think,”28 he worried.
I’m certain that many of these fears are warranted. It has always been difficult for us to maintain mental habits of concentration and deep thought; that’s precisely why societies have engineered massive social institutions29 (everything from universities to book clubs and temples of worship) to encourage us to keep it up. It’s part of why only a relatively small subset of people become regular, immersive readers, and part of why an even smaller subset go on to higher education. Today’s multitasking tools really do make it harder than before to stay focused during long acts of reading and contemplation. They require a high level of “mindfulness”—paying attention to your own attention. While I don’t dwell on the perils of distraction in this book, the importance of being mindful resonates throughout these pages. One of the great challenges of today’s digital thinking tools is knowing when not to use them, when to rely on the powers of older and slower technologies, like paper and books.
That said, today’s confident talk by pundits and journalists about our “rewired” brains has one big problem: it is very premature. Serious neuroscientists agree that we don’t really know how our brains are wired to begin with. Brain chemistry is particularly mysterious when it comes to complex thought, like memory, creativity, and insight. “There will eventually be neuroscientific explanations30 for much of what we do; but those explanations will turn out to be incredibly complicated,” as the neuroscientist Gary Marcus pointed out when critiquing the popular fascination with brain scanning. “For now, our ability to understand how all those parts relate is quite limited, sort of like trying to understand the political dynamics of Ohio from an airplane window above Cleveland.” I’m not dismissing brain scanning; indeed, I’m confident it’ll be crucial in unlocking these mysteries in the decades to come. But right now the field is so new that it is rash to draw conclusions, either apocalyptic or utopian, about how the Internet is changing our brains. Even Carr, the most diligent explorer in this area, cited only a single brain-scanning study that specifically probed how people’s brains respond to using the Web,31 and those results were ambiguous.
The truth is that many healthy daily activities, if you scanned the brains of people participating in them, might appear outright dangerous to cognition. In recent years, the psychiatry professor James Swain and teams of scientists at Yale and the University of Michigan32 scanned the brains of new mothers and fathers as they listened to recordings of their babies’ cries. They found brain circuit activity similar to that in people suffering from obsessive-compulsive disorder. Now, these parents did not actually have OCD. They were just being temporarily vigilant about their newborns. But since the experiments appeared to show the brains of new parents being altered at a neural level, you could write a pretty scary headline if you wanted: BECOMING A PARENT ERODES YOUR BRAIN FUNCTION! In reality, as Swain tells me, it’s much more benign. Being extra fretful and cautious around a newborn is a good thing for most parents: Babies are fragile. It’s worth the trade-off. Similarly, living in cities—with their cramped dwellings and pounding noise—stresses us out on a straightforwardly physiological level33 and floods our system with cortisol, as I discovered while researching stress in New York City several years ago. But the very urban density that frazzles us mentally also makes us 50 percent more productive,34 and more creative, too, as Edward Glaeser argues in Triumph of the City, because of all those connections between people. This is “the city’s edge in producing ideas.”35 The upside of creativity is tied to the downside of living in a sardine tin, or, as Glaeser puts it, “Density has costs as well as benefits.”36 Our digital environments likely offer a similar push and pull. We tolerate their cognitive hassles and distractions for the enormous upside of being connected, in new ways, to other people.
I want to examine how technology changes our mental habits, but for now, we’ll be on firmer ground if we stick to what’s observably happening in the world around us: our cognitive behavior, the quality of our cultural production, and the social science that tries to measure what we do in everyday life. In any case, I won’t be talking about how your brain is being “rewired.” Almost everything rewires it, including this book.
The brain you had before you read this paragraph? You don’t get that brain back. I’m hoping the trade-off is worth it.
The rise of advanced chess didn’t end the debate about man versus machine, of course. In fact, the centaur phenomenon only complicated things further for the chess world—raising questions about how reliant players were on computers and how their presence affected the game itself. Some worried that if humans got too used to consulting machines, they wouldn’t be able to play without them. Indeed, in June 2011, chess master Christoph Natsidis was caught37 illicitly using a mobile phone during a regular human-to-human match. During tense moments, he kept vanishing for long bathroom visits; the referee, suspicious, discovered Natsidis entering moves into a piece of chess software on his smartphone. Chess had entered a phase similar to the doping scandals that have plagued baseball and cycling, except in this case the drug was software and its effect cognitive.
This is a nice metaphor for a fear that can nag at us in our everyday lives, too, as we use machines for thinking more and more. Are we losing some of our humanity? What happens if the Internet goes down: Do our brains collapse, too? Or is the question naive and irrelevant—as quaint as worrying about whether we’re “dumb” because we can’t do long division without a piece of paper and a pencil?
Certainly, if we’re intellectually lazy or prone to cheating and shortcuts, or if we simply don’t pay much attention to how our tools affect the way we work, then yes—we can become, like Natsidis, overreliant. But the story of computers and chess offers a much more optimistic ending, too. Because it turns out that when chess players were