The college presidents advising Turner are cases in point. The tacit nature of the human sensemaking process can blind academic leaders to available alternatives and to gaps and biases in their framing (Argyris, 1982). It also leaves them seeing little reason to question their interpretations or retrace any of their steps from data selection through action.
2. Sensemaking is interpretive. When thrown into life's ongoing stream of experiences, people create explanations of what things mean – and often assume that others either see things the same way or are wrong if they don't. Each of the presidents advising Turner offered different advice, and each felt confident that his or her perspective was right.
3. Sensemaking is action‐oriented. People's personal interpretations contain implicit prescriptions for what they and others should do. If you conclude, for example, that your unit's budget problems result from overspending, then you'll cut expenses. If you see the problem as inadequate allocations from central administration, then you might lobby for more. If you bemoan inattention to revenue generation, you'll turn to new program development. If it's embezzlement, a call to the police is in order. Think about Nancy Turner. If she accepts that strong support from faculty is key to her success, then she will start building those relationships. If she concludes that the campus expects her to lead off with a compelling vision, she'll get to work on the big picture. You can see the ease and the potential complications in all this. Academic leaders anchor around their take on a situation, and they're off and running before they're sure what's important, what they don't yet know, and where they should be heading.
Sensemaking is a personal search for meaning, governed by the tacit criterion of plausibility rather than accuracy. “We carve out order by leaving the disorderly parts out,” concludes eminent psychologist William James (Richardson, 2006, p. 5). Finding a “good enough” explanation of the situation will stop our search for other alternatives, even early in the hunt. We need not find the truth or the best of all possible solutions. We just want something that's good enough by our tacit standards to let us move forward and get things done. And we're rarely aware that this is what we are doing.
What's at stake is illustrated in a story from the work of Jerome Groopman on how doctors think (Groopman, 2000, 2007). Groopman tells about a patient he calls Ann Dodge. At age 20, Ann developed a serious eating problem – every meal produced pain, nausea, vomiting, and diarrhea. Over time, she saw some 30 doctors in a variety of specialties, and each confirmed the initial diagnosis. Ann had a psychiatric condition, anorexia nervosa with bulimia. The problem was in her mind, the doctors concluded, but still very dangerous and potentially deadly. Doctors prescribed a series of treatments, including diet, drugs, and talk therapy. Her doctor told her to consume 3,000 calories a day, mostly in easily digested carbohydrates like pasta. Over 15 years, she kept getting worse. In 2004, Ann was hospitalized four times in a mental health facility in hopes that close supervision of her food intake might enable her to gain weight. Nothing worked.
Finally, at her boyfriend's insistence, Ann traveled to Boston to see a highly recommended gastroenterologist, Dr. Myron Falchuk. Ann was reluctant, and her primary care doctor advised that the trip was unnecessary since her problem was so well understood. But Ann went anyway. Falchuk had reviewed Ann's records and knew what all the doctors had concluded. But he put the information aside – literally pushing the tall stack of folders and reports to the far side of his desk – and asked Ann to tell him her whole story again. As she did, Falchuk listened with a fresh mind and felt the story didn't quite add up. In particular, he wondered why Ann wasn't gaining weight if, as she insisted, she really was consuming as much as 3,000 calories a day. Well, he wondered, what if she couldn't digest what she was eating? He did more tests, and eventually concluded that Ann suffered from celiac disease – an intolerance of the gluten commonly found in grains like wheat, rye, and barley. Ann Dodge was being poisoned by the pasta diet her physicians had prescribed to save her. As soon as she shifted to a gluten‐free diet, she began to gain weight. In Ann's view, Dr. Falchuk was a miracle worker. From our perspective, Falchuk illustrates the power and importance of reframing in helping transcend the limits of – and our overconfidence in – our own sensemaking.
Here's the point. When a doctor encounters a new patient, he or she tries to frame the patient by matching symptoms and selected pieces of information to patterns that the doctor has learned through experience and training. The process is quick and automatic: it begins with the first look at the patient when the physician enters the examining room. Doctors frame patients all the time.
Expert clinicians can often determine what's going on with a patient in 20 seconds. It's simple pattern recognition, honed by training and experience. But sometimes they get it wrong. One source of error is anchoring: doctors can lock onto the first answer that seems right – or what trusted others are tacitly encouraging them to see. “Your mind plays tricks on you,” says Groopman, “because you see only the landmarks you expect to see and neglect those that should tell you that in fact you're still at sea” (2007, p. 65). Another source of distortion is a doctor's own needs and feelings. Operating under time pressure and wanting to be helpful, physicians want to arrive at a diagnosis and prescription as quickly as possible. They interpret any new data in the light of their current conclusion, and often cling to their diagnosis in the face of disconfirming evidence. Kahneman calls this the illusion of validity: the common and unjustified sense of confidence that people have in their own judgments (Kahneman & Klein, 2009). What is true for physicians is also true for academic administrators. Notice how readily Nancy Turner's colleagues offered her advice. They wanted to help. She expected nothing less.
Daily life for academic leaders presents them with a continuous stream of challenges and opportunities that are even more complex and ambiguous than those facing physicians. They are also more vulnerable to errors because they operate in environments that are poorly designed for learning about the quality of their judgment. Successful leaders develop a kind of skilled intuition that allows them to act quickly and wisely. Kahneman and Klein (2009) argue that this works best in “high‐validity” environments where cause and effect are consistently and reliably connected, which is often not the case in the ambiguous world of higher education. The same comment by a dean at one faculty meeting, for example, may elicit a completely different response when said at another for a host of reasons, including something as simple as which faculty members happen to be in the room at the time.
Whether academic leaders realize it or not, they are continually making choices about how to see and interpret their world – and their choices are fateful. If, for example, Nancy Turner focuses her energies on recruiting a new chief academic officer while faculty morale continues to plummet – and news of the growing dissatisfaction bombards sympathetic board members – she may find herself in a deep hole before she can benefit from a stronger top leadership team.
A central mistake for leaders in any context is to lock into limited or flawed views of their world. If what you're doing is not producing the results you want, it is time to reflect on your sensemaking. Reframing – the conceptual core of the book – can serve as a powerful antidote. Reframing is the deliberate process of looking at a situation carefully and from multiple perspectives, choosing to be more mindful by considering alternative views and explanations. Turner's colleagues each framed her situation differently, and each identified a piece of a larger puzzle. Each bit of advice expressed the personal frame, the mental map, of its maker – and that is the beauty and utility of strategies that seek feedback from diverse others. Each colleague stretched Turner's original views of her campus and of her leadership options. Together they offered Turner a larger understanding of her challenges than any one alone might have. In the language of this book, they helped Turner to reframe.
Research shows that leaders often miss significant data or elements in decoding the situations and opportunities that they face (e.g., Bolman & Deal, 2008b; Weick, 1995). They will nonetheless press forward. The risk is that they'll do what Ann Dodge's early doctors did – focus on selected cues and fit what they see into a familiar pattern, even if it isn't quite right for the situation. Like Ann's doctors, they may insist that their answer is correct and that there's no need for further input or investigation – even if the diagnosis leads to options that don't work. In those cases, they will often conclude that someone else is at fault,