it backward, all while flying through; it’s like a TiVo for reality. He zooms into the scene of himself watching his son, freezes it, then flies down the hallway into the kitchen, where his mother is looking up, startled, reacting to his yells of delight. It seems wildly futuristic, but Roy claims that eventually it won’t be out of reach in your own home: cameras and hard drives are getting cheaper and cheaper, and the software isn’t far off either.
Still, as Roy acknowledges, the whole project is unsettling to some observers. “A lot of people have asked me, ‘Are you insane?’” He chuckles. They regard the cameras as Orwellian, though this isn’t really accurate; it’s Roy who’s recording himself, not a government or evil corporation, after all. But still, wouldn’t living with incessant recording corrode daily life, making you afraid that your weakest moments—bickering mean-spiritedly with your spouse about the dishes, losing your temper over something stupid, or, frankly, even having sex—would be recorded forever? Roy and his wife say this didn’t happen, because they were in control of the system. In each room there was a control panel that let you turn off the camera or audio; in general, they turned things off at 10 p.m. (after the baby was in bed) and back on at 8 a.m. They also had an “oops” button in every room: hit it, and you could erase as much as you wanted from recent recordings—a few minutes, an hour, even a day. It was a neat compromise, because of course one often doesn’t know when something embarrassing is going to happen until it’s already happening.
“This came up from, you know, my wife breast-feeding,” Roy says. “Or I’d stumble out of the shower, dripping and naked, wander out in the hallway—then realize what I was doing and hit the ‘oops’ button. I didn’t think my grad students needed to see that.” He also experienced the effect that documentarians and reality TV producers have long noticed: after a while, the cameras vanish.
The upsides, in other words, were worth the downsides—both scientific and personal. In 2007, Roy’s father came over to see his grandson when Roy was away at work. A few months later, his father had a stroke and died suddenly. Roy was devastated; he’d known his father’s health was in bad shape but hadn’t expected the end to come so soon.
Months later, Roy realized that he’d missed the chance to see his father play with his grandson for the last time. But the house had autorecorded it. Roy went to the TotalRecall system and found the video stream. He pulled it up: his father stood in the living room, lifting his grandson, tickling him, cooing over how much he’d grown.
Roy froze the moment and slowly panned out, looking at the scene, rewinding it and watching again, drifting around to relive it from several angles.
“I was floating around like a ghost watching him,” he says.
What would it be like to never forget anything? To start off your life with that sort of record, then keep it going until you die?
Memory is one of the most crucial and mysterious parts of our identities; take it away, and identity goes away, too, as families wrestling with Alzheimer’s quickly discover. Marcel Proust regarded the recollection of your life as a defining task of humanity; meditating on what you’ve done is an act of recovering, literally hunting around for “lost time.” Vladimir Nabokov saw it a bit differently: in Speak, Memory, he sees his past actions as being so deeply intertwined with his present ones that he declares, “I confess I do not believe in time.”5 (As Faulkner put it, “The past is never dead. It’s not even past.”)6
In recent years, I’ve noticed modern culture—in the United States, anyway—becoming increasingly, almost frenetically obsessed with lapses of memory. This may be because the aging baby-boomer population is skidding into its sixties, when forgetting the location of your keys becomes a daily embarrassment. Newspaper health sections deliver panicked articles about memory loss and proffer remedies, ranging from advice that is scientifically solid (get more sleep and exercise) to sketchy (take herbal supplements like ginkgo) to corporate snake oil (play pleasant but probably useless “brain fitness” video games). We’re pretty hard on ourselves. Frailties in memory are seen as frailties in intelligence itself. In the run-up to the American presidential election of 2012, the candidacy of a prominent hopeful, Rick Perry, began unraveling with a single, searing memory lapse: in a televised debate, when he was asked about the three government bureaus he’d repeatedly vowed to eliminate, Perry named the first two—but was suddenly unable to recall the third. He stood there onstage, hemming and hawing for fifty-three agonizing seconds before the astonished audience, while his horrified political advisers watched his candidacy implode. (“It’s over, isn’t it?” one of Perry’s donors asked.)7
Yet the truth is, the politician’s mishap wasn’t all that unusual. On the contrary, it was extremely normal. Our brains are remarkably bad at remembering details. They’re great at getting the gist of something, but they consistently muff the specifics. Whenever we read a book or watch a TV show or wander down the street, we extract the meaning of what we see—the parts of it that make sense to us and fit into our overall picture of the world—but we lose everything else, in particular discarding the details that don’t fit our predetermined biases. This sounds like a recipe for disaster, but scientists point out that there’s an upside to this faulty recall. If we remembered every single detail of everything, we wouldn’t be able to make sense of anything. Forgetting is a gift and a curse: by chipping away at what we experience in everyday life, we leave behind a sculpture that’s meaningful to us, even if sometimes it happens to be wrong.
Our first glimpse into the way we forget came in the 1880s, when German psychologist Hermann Ebbinghaus ran a long, fascinating experiment on himself.8 He created twenty-three hundred “nonsense” three-letter combinations and memorized them. Then he’d test himself at regular intervals to see how many he could remember. He discovered that memory decays quickly after you’ve learned something: Within twenty minutes, he could remember only about 60 percent of what he’d tried to memorize, and within an hour he could recall just under half. A day later it had dwindled to about one third. But then the pace of forgetting slowed down. Six days later the total had slipped just a bit more—to 25.4 percent of the material—and a month later it was only a little worse, at 21.1 percent. Essentially, he had lost the great majority of the three-letter combinations, but the few that remained had passed into long-term memory. This is now known as the Ebbinghaus curve of forgetting, and it’s a good-news-bad-news story: Not much gets into long-term memory, but what gets there sticks around.
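The retention figures above trace a recognizable shape, and one rough way to see it is to fit them to a simple decay function. The sketch below is purely illustrative, not Ebbinghaus’s own analysis or anything from this book: it reads “just under a half” as roughly 0.45, assumes a power-law form for the curve, and fits it to the five quoted data points just to make the steep-then-flat shape visible.

```python
# Illustrative sketch only: fit a simple power-law decay R(t) = a * t**(-b)
# to the (rounded) retention figures quoted in the text. The functional form
# is an assumption for illustration, not how Ebbinghaus reported his results.
import numpy as np

# Hours since memorization, and the fraction recalled at each point
# ("about 60 percent" at 20 minutes; "just under a half" taken here as ~0.45).
hours = np.array([20 / 60, 1, 24, 6 * 24, 30 * 24])
recalled = np.array([0.60, 0.45, 0.33, 0.254, 0.211])

# Least-squares fit of log(recalled) against log(hours).
slope, intercept = np.polyfit(np.log(hours), np.log(recalled), 1)
a = np.exp(intercept)

for t, r in zip(hours, recalled):
    print(f"after {t:7.2f} h: observed {r:.3f}, fitted {a * t ** slope:.3f}")
# The fitted curve drops steeply within the first day, then flattens out:
# the "good-news-bad-news" shape described above.
```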
Ebbinghaus had set himself an incredibly hard memory task. Meaningless gibberish is by nature hard to remember. In the 1970s and ’80s, psychologist Willem Wagenaar tried something a bit more true to life.9 Once a day for six years, he recorded on notecards a few of the things that had happened to him, including details like where each event happened and who he was with. (On September 10, 1983, for example, he went to see Leonardo da Vinci’s Last Supper in Milan with his friend Elizabeth Loftus, the noted psychologist.) This is what psychologists call “episodic” or “autobiographical” memory—things that happen to us personally. Toward the end of the experiment, Wagenaar tested himself by pulling out a card to see if he remembered the event. He discovered that these episodic memories don’t degrade anywhere near as quickly as random information: In fact, he was able to recall about 70 percent of the events that had happened half a year earlier, and his recall gradually dropped to 29 percent for events five years old. Why did he do better than Ebbinghaus? Because the cards contained “cues” that helped jog his memory—like knowing that his friend Liz Loftus was with him—and because some of the events were inherently more memorable. Your ability to recall something is highly dependent on the context in which you’re trying to do so; if you have the right cues around, it gets easier. More important, Wagenaar also showed that committing something to memory in the first place is much simpler if you’re paying close attention. If you’re engrossed in an emotionally vivid visit to a da Vinci painting, you’re far more likely to recall it; your everyday humdrum Monday meeting, not so much. (And if you’re frantically multitasking on a computer, paying only partial attention to a dozen tasks, you might only dimly remember any of what you’re doing, a problem that I’ll talk about many times in this book.) But even so, as Wagenaar found, there are surprising limits. For fully 20 percent of the events he recorded,