Exhibit 2.1 lists some of the most important sources of organizational uncertainty.
ORGANIZATIONAL LEARNING
How can valid lessons be extracted from surroundings that are complex, surprising, deceptive, and ambiguous? It isn't easy, as many who have tried can attest. Decades ago, scholars debated whether the idea of organizational learning made sense: Could organizations actually learn, or was learning inherently individual? That debate faded as experience confirmed instances in which individuals learned and organizations didn't, or vice versa. Complex firms such as Amazon, Apple, and Southwest Airlines have “learned” capabilities far beyond individual knowledge. Lessons are enshrined in protocols, policies, technologies, and shared cultural codes and traditions. At the same time, individuals often learn even when systems cannot.
Perspectives on organizational learning are exemplified in the work of Argote and Miron‐Spektor (2011), Peter Senge (1990), Barry Oshry (1995), and Chris Argyris and Donald Schön (1978, 1996). Argote and Miron‐Spektor review the literature on organizational learning and offer a rational perspective: as organizational members use organizational tools to perform tasks, they acquire experience that yields knowledge, which is then embedded in the organizational context, including its culture. Changes in the context feed back to influence subsequent experience, completing the causal circle. Argote and Miron‐Spektor acknowledge that knowledge can be ambiguous and difficult to verify but devote little attention to barriers to learning. Senge, on the other hand, sees a core learning dilemma: “We learn best from our experience, but we never directly experience the consequences of many of our decisions” (p. 23). Learning is relatively easy when the link between cause and effect is clear. But complex systems often sever that connection: causes are remote from effects, solutions are detached from problems, and feedback is absent, delayed, or misleading (Cyert and March, 1963; Senge, 1990).
Exhibit 2.1. Sources of Ambiguity.
We are not sure what the problem is.
We are not sure what is really happening.
We are not sure what we want.
We do not have the resources we need.
We are not sure who is supposed to do what.
We are not sure how to get what we want.
We are not sure how to determine if we have succeeded.
Source: Adapted from McCaskey (1982).
Senge emphasizes the value of “system maps” that clarify how a system works. Consider the system dynamics of Covid‐19. In February 2020, while America's attention was focused on the risk of the coronavirus invading from China, it arrived in New York among some two million travelers from Europe. The virus then spread quietly at a time when testing capacity was severely limited. Residents in a city of eight million continued to do all the things they usually did – including riding crowded subways, eating at restaurants, attending large conferences, and going to concerts and the theater. Without realizing it, they were engaging in very risky behavior. But in the short term they got no feedback and saw no visible signs saying: “Warning! You have just been exposed to a deadly virus!” The lag between infection and symptoms was compounded by asymptomatic carriers and delays in testing. By the time very sick patients began to show up in emergency rooms, the virus was out of control.
Covid‐19 is one of many examples of actions or strategies that look good until long‐term costs become apparent. A corresponding systems model might look like Exhibit 2.2. The strategy might be cutting training to improve short‐term profitability, drinking martinis to relieve stress, offering rebates to entice customers, borrowing from a loan shark to cover gambling debts, or carelessly attending an unmasked “super‐spreader” event during a viral pandemic. In each case, the initial results seem fine, and the costs only emerge further down the road.
Oshry (1995) agrees that system blindness is widespread but highlights causes rooted in troubled relationships between groups that have little grasp of what's going on outside their own locality. Top managers feel overwhelmed by complexity, responsibility, and overwork. They are chronically dissatisfied with subordinates' lack of initiative and creativity. Middle managers, meanwhile, feel trapped between contradictory signals and pressures. The top tells them to take initiative but then punishes mistakes. Their subordinates expect them to intervene with the boss and improve working conditions. Top and bottom tug in opposite directions, causing those in the middle to feel pulled apart, confused, and weak. At the bottom, workers feel powerless, unacknowledged, and demoralized. “They give us bad jobs, lousy pay, and lots of orders but never tell us what's really going on. Then they wonder why we don't love our work.” Unless you can step back and see how system dynamics create these patterns, you muddle along blindly, unaware of better options.
Exhibit 2.2. Systems Model with Delay.
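The pattern Exhibit 2.2 depicts can also be expressed numerically. The sketch below is a hypothetical illustration rather than anything from the text: it simulates a strategy whose immediate payoff is positive every period while a deferred cost quietly accumulates and arrives only after a delay. The parameter names and values (DELAY, BENEFIT, DEFERRED_COST) are invented for illustration.

```python
# Hypothetical sketch of the "looks good now, hurts later" loop in Exhibit 2.2.
# Each period the strategy yields an immediate benefit, but it also creates a
# cost that is felt only after a delay -- so early feedback is uniformly positive.

DELAY = 5           # periods before a deferred cost arrives (assumed)
BENEFIT = 10        # immediate payoff per period (assumed)
DEFERRED_COST = 15  # cost that each period's action eventually imposes (assumed)

pending_costs = []  # (due_period, cost) pairs created now but felt later
cumulative = 0

for period in range(1, 13):
    pending_costs.append((period + DELAY, DEFERRED_COST))
    arriving = sum(cost for due, cost in pending_costs if due == period)
    cumulative += BENEFIT - arriving
    print(f"period {period:2d}: benefit {BENEFIT}, delayed cost {arriving:3d}, "
          f"cumulative {cumulative:4d}")
```

Run as written, the cumulative total climbs steadily for the first few periods, which is exactly the reassuring feedback the actors in the examples above receive; only once the delay elapses do the deferred costs start landing and the running total turn downward.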
Both Oshry and Senge argue that our failure to read system dynamics traps us in cycles of blaming and self‐defense. Problems are always someone else's fault. Unlike Senge, who sees gaps between cause and effect as primary barriers to learning, Argyris and Schön (1978, 1996) emphasize managers' fears and defenses. As a result, “the actions we take to promote productive organizational learning actually inhibit deeper learning” (Argyris and Schön, 1996, p. 281).
According to Argyris and Schön, our behavior obstructs learning because we avoid undiscussable issues and carefully tiptoe around organizational taboos. That helps us avoid conflict and discomfort in the moment, but it creates a double bind: we can't solve problems without dealing with issues we have tried to hide, yet discussing them would expose our cover‐up. Facing that double bind, Volkswagen engineers and Wuhan officials sustained their cover‐ups until outsiders caught on. Desperate maneuvers to hide the truth and delay the inevitable made the day of reckoning more catastrophic.
MAKING SENSE OF AMBIGUITY AND COMPLEXITY
Organizations try to cope with complexity and uncertainty by getting smarter or making their worlds simpler. One approach to getting smarter is developing better systems and technology to collect and process data. Another is to hire or develop professionals with sophisticated expertise in handling thorny problems. To simplify their environment, organizations often break complex issues into smaller chunks and assign slices to specialized individuals or units. These and other methods are often helpful but not always sufficient. Despite the best efforts, as we have seen or experienced, surprising—and sometimes appalling—events still happen. We need better ways to anticipate problems and wrestle with them once they arrive.
In trying to make sense of complicated and ambiguous situations, humans are often in over their heads, their brains too taxed to decode all the complexity around them. At best, managers can hope to achieve “bounded rationality,” which Foss and Weber (2016) describe in terms of three dimensions:
1 Processing capacity: Limits of time, memory, attention, and computing speed mean that the brain can only process a fraction of the information that might be relevant in each situation.
2 Cognitive economizing: Cognitive limits force human decision makers to use short‐cuts—rules of thumb, mental models, or frames—in order to trim complexity and messiness down to manageable size.
3 Cognitive biases: Humans tend to interpret incoming information to confirm their existing beliefs, expectations, and values. They often welcome confirming information while ignoring or rejecting disconfirming signals.
Benson (2016) frames cognitive biases in terms of four broad tendencies that create a self‐reinforcing cycle (see Exhibit 2.3). To cope with information