Eyes Wide Open: How to Make Smart Decisions in a Confusing World. Noreena Hertz
university campus square.
But researchers at the University of Washington discovered otherwise. When Dustin rode by people who were crossing the square while using their mobile phones, the vast majority completely failed to notice him.11 It was as if the unicycling clown simply wasn’t there.
This is an example of a phenomenon known as ‘inattentional blindness’,12 and it’s what happens when we’re very focused on one thing in particular – in this case a telephone conversation or an important text message. When we are focused like this, we typically don’t register new data points, new things that may come into our sensory orbit. We can even miss highly visible objects we are looking directly at, because our attention is elsewhere.13 One professor rather brilliantly describes it as being ‘as if while the eyes “see” the object, the brain does not’.14
This is something we are all likely to have experienced. Have you ever bumped into a lamp-post while walking and texting? Or somehow missed seeing an important email in your inbox during a particularly busy week? If so, you can blame inattentional blindness.
There are times, of course, when tunnel vision clearly pays off – think of your hunter-gatherer ancestors, seeking out food and doing everything they could to avoid mortal danger. Once they heard the lion’s roar, there would only be two things to think about: working out where that roar was coming from, and running in the other direction. They would not have wanted to be distracted by anything else: the pain in their muscles as they ran, the sound of birds singing above, the sight of their favourite snack hanging from a tree would all have been irrelevant to their immediate survival.
But fast-forward a few millennia, and if you don’t want to get knocked over by a unicycling clown, miss a critical email or fail to spot a wrong charge on your credit-card statement, you’ll need to take your blinkers off and, with eyes wide open, improve your powers of attention and perception.
Think about what it is that is consuming your focus. At work, this may be your latest sales figures, or your company’s current stock price. At home, it may be the football scores. How many times a day do you check these? What might this mean you’re not paying attention to? Could you spend a day (or even a few hours) without looking at them? What would you notice that you hadn’t previously? And who around you could bring the unexpected to your attention? More on this point to come.
From PowerPoint to Hypnotised Chickens
Sometimes it is not our fault that we’ve only got partial vision.
Edward Tufte (pronounced TUF-tee), an American statistician and Professor Emeritus of Political Science, Statistics and Computer Science at Yale University, knows this all too well.
A man of many talents, Tufte was hired by President Obama in 2010 to keep an eye on how his $787 billion stimulus package was being spent.15 He also has a sideline gig as an exhibiting sculptor. But in his academic career he has undertaken substantial research into how informational graphics affect decision-making.16
One case he has looked at in real depth is the Columbia Space Shuttle disaster of February 2003. Seven astronauts lost their lives when their NASA spacecraft disintegrated shortly before the conclusion of a successful sixteen-day mission.
It is now well known that a briefcase-sized piece of insulation foam from the Shuttle’s external fuel tank collided with its left wing during take-off, damaging it so that the Shuttle was unable to shield itself from the intense heat experienced during re-entry to the earth’s atmosphere.
But the official investigation also revealed a story that is both fascinating and curiously everyday in its nature.
A key underlying factor that led to the disaster was the way in which NASA’s engineers shared information.
In particular, the Columbia Accident Investigation Board singled out the ‘endemic’ use of a computer program. A program we more usually associate with corporate seminars or high-school classrooms: Microsoft PowerPoint.
The investigators believed that by using PowerPoint to present the risks associated with the suspected wing damage, the potential for disaster had been significantly understated.
As the Columbia circled the earth, Boeing Corporation engineers scrambled to work out the likely consequences of the foam striking the thermal tiles on the Shuttle’s left wing. Tragically, when they presented their findings, their methods of presentation proved to be deeply flawed. Information was ‘lost’, priorities ‘misrepresented’, key explanations and supporting information ‘filtered out’. The ‘choice of headings, arrangement of information and size of bullets … served to highlight what management already believed’, while ‘uncertainties and assumptions that signalled danger dropped out of the information chain’.17
In other words, the very design tools that underpin the clarity of PowerPoint had served to eclipse the real story. They had distracted its readers, and told a partial, and highly dangerous, story.
Tufte, applying his expertise in information design, investigated these claims further, and found even more to be concerned about.
He analysed all twenty-eight PowerPoint slides that had been used by the engineers to brief NASA officials on the wing damage and its implications during Columbia’s two-week orbit of the earth, and discovered that some were highly misleading.
The title of a slide supposedly assessing the destructive potential of loose debris, ‘Review of Test Data Indicates Conservatism for Tile Penetration’, was, in Tufte’s words, ‘an exercise in misdirection’. What the title did not make clear was that the pre-flight simulation tests had used a piece of foam 640 times smaller than that which slammed against the Shuttle. This crucial information was buried towards the bottom of the PowerPoint slide. Nobody seemed to take any notice of it – they were too focused on the headline at the top, and did not take in the full picture.18
Tufte found that the limited space for text on PowerPoint slides led to the use of compressed phrases, with crucial caveats squeezed into ever smaller font sizes. This created a reliance on ‘executive summaries’ or slide titles that lost the nuance of uncertainties and qualifications.
In cases such as this, in other words, oversimplification leads to the loss of vital detail.
Complexity, a reality of executive decision-making, is something the medium of PowerPoint dangerously disregards.
It’s not just NASA or professors who are concerned about the potential of PowerPoint to blinker our vision.
General James N. Mattis, who served as Commander of the United States Central Command after taking over from the subsequently disgraced General David Petraeus in 2010, has always had a way with words. He once advised his marines in Iraq to ‘Be polite. Be professional. But have a plan to kill everybody you meet.’19 Mattis’s assessment of PowerPoint was equally merciless: ‘PowerPoint makes us stupid,’ he said in 2010, at a military conference in North Carolina.
This is a feeling corroborated by his military colleagues. Brigadier General H.R. McMaster, who banned PowerPoint when leading the successful assault on the Iraqi city of Tal Afar in 2005, said, ‘It’s dangerous, because it can create the illusion of understanding and the illusion of control.’ Indeed, so aware is the US Army of the medium’s powers of evasion that it intentionally deploys PowerPoint when briefing the media, a tactic known as ‘hypnotising chickens’.20
Of course, this is not to say that we don’t ever need information to be summarised