Title: Reframing Organizations
Author: Lee G. Bolman
Publisher: John Wiley & Sons Limited
Genre: Management, recruitment
ISBN: 9781119756842
Exhibit 2.1. Sources of Ambiguity.
We are not sure what the problem is.
We are not sure what is really happening.
We are not sure what we want.
We do not have the resources we need.
We are not sure who is supposed to do what.
We are not sure how to get what we want.
We are not sure how to determine if we have succeeded.
Source: Adapted from McCaskey (1982).
Senge emphasizes the value of “system maps” that clarify how a system works. Consider the system dynamics of Covid‐19. In February 2020, while America's attention was focused on the risk of the coronavirus invading from China, it arrived in New York among some two million travelers from Europe. The virus then spread quietly at a time when testing capacity was severely limited. Residents in a city of eight million continued to do all the things they usually did – including riding crowded subways, eating at restaurants, attending large conferences, and going to concerts and the theater. Without realizing it, they were engaging in very risky behavior. But in the short term they got no feedback and saw no visible signs saying: “Warning! You have just been exposed to a deadly virus!” The lag between infection and symptoms was compounded by asymptomatic carriers and delays in testing. By the time very sick patients began to show up in emergency rooms, the virus was out of control.
Covid‐19 is one of many examples of actions or strategies that look good until long‐term costs become apparent. A corresponding systems model might look like Exhibit 2.2. The strategy might be cutting training to improve short‐term profitability, drinking martinis to relieve stress, offering rebates to entice customers, borrowing from a loan shark to cover gambling debts, or carelessly attending an unmasked “super‐spreader” event during a viral pandemic. In each case, the initial results seem fine, and the costs only emerge further down the road.
Oshry (1995) agrees that system blindness is widespread but highlights causes rooted in troubled relationships between groups that have little grasp of what's going on outside their own locality. Top managers feel overwhelmed by complexity, responsibility, and overwork. They are chronically dissatisfied with subordinates' lack of initiative and creativity. Middle managers, meanwhile, feel trapped between contradictory signals and pressures. The top tells them to take initiative but then punishes mistakes. Their subordinates expect them to intervene with the boss and improve working conditions. Top and bottom tug in opposite directions, causing those in the middle to feel pulled apart, confused, and weak. At the bottom, workers feel powerless, unacknowledged, and demoralized. “They give us bad jobs, lousy pay, and lots of orders but never tell us what's really going on. Then they wonder why we don't love our work.” Unless you can step back and see how system dynamics create these patterns, you muddle along blindly, unaware of better options.
Exhibit 2.2. Systems Model with Delay.
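The core of Exhibit 2.2 is a feedback loop in which an action's benefits arrive immediately while its costs arrive only after a delay. As a rough sketch of that dynamic (my illustration, not the book's model; the delay length, payoff, and cost figures below are invented), a few lines of Python:

```python
# A minimal sketch, with invented numbers, of the Exhibit 2.2 dynamic:
# each action pays off immediately, but its cost arrives DELAY periods
# later, so early feedback looks uniformly positive.

DELAY = 5        # periods between an action and its delayed cost (assumed)
BENEFIT = 10.0   # immediate payoff per action (assumed)
COST = 15.0      # delayed cost per action (assumed)

actions = [1] * 12   # take the tempting action every period
net_results = []
for t in range(len(actions)):
    gain = actions[t] * BENEFIT
    # the bill for the action taken DELAY periods ago comes due now
    lagged_cost = actions[t - DELAY] * COST if t >= DELAY else 0.0
    net_results.append(gain - lagged_cost)

print(net_results)
# [10.0, 10.0, 10.0, 10.0, 10.0, -5.0, -5.0, ...]
# The first five periods look like a winning strategy; once the lag
# catches up, every period is a net loss, yet nothing visible changed.
```

The point of the sketch is simply that an actor who judges the strategy by its first few periods of feedback will conclude it is working, exactly the trap the Covid‐19 example describes.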
Both Oshry and Senge argue that our failure to read system dynamics traps us in cycles of blaming and self‐defense. Problems are always someone else's fault. Unlike Senge, who sees gaps between cause and effect as primary barriers to learning, Argyris and Schön (1978, 1996) emphasize managers' fears and defenses. As a result, “the actions we take to promote productive organizational learning actually inhibit deeper learning” (Argyris and Schön, 1996, p. 281).
According to Argyris and Schön, our behavior obstructs learning because we avoid undiscussable issues and carefully tiptoe around organizational taboos. That helps us sidestep conflict and discomfort in the moment, but it creates a double bind. We can't solve problems without dealing with issues we have tried to hide. Yet discussing them would expose our cover‐up. Facing that double bind, Volkswagen engineers and Wuhan officials maintained their cover‐ups until outsiders caught on. Desperate maneuvers to hide the truth and delay the inevitable only made the day of reckoning more catastrophic.
MAKING SENSE OF AMBIGUITY AND COMPLEXITY
Organizations try to cope with complexity and uncertainty by getting smarter or by making their worlds simpler. One approach to getting smarter is developing better systems and technology to collect and process data. Another is to hire or develop professionals with sophisticated expertise in handling thorny problems. To simplify their environment, organizations often break complex issues into smaller chunks and assign the slices to specialized individuals or units. These and other methods are often helpful but not always sufficient. Despite our best efforts, as we have seen, surprising—and sometimes appalling—events still happen. We need better ways to anticipate problems and to wrestle with them once they arrive.
In trying to make sense of complicated and ambiguous situations, humans are often in over their heads, their brains too taxed to decode all the complexity around them. At best, managers can hope to achieve “bounded rationality,” which Foss and Weber (2016) describe in terms of three dimensions:
1 Processing capacity: Limits of time, memory, attention, and computing speed mean that the brain can only process a fraction of the information that might be relevant in each situation.
2 Cognitive economizing: Cognitive limits force human decision makers to use short‐cuts—rules of thumb, mental models, or frames—in order to trim complexity and messiness down to manageable size.
3 Cognitive biases: Humans tend to interpret incoming information to confirm their existing beliefs, expectations, and values. They often welcome confirming information while ignoring or rejecting disconfirming signals.
Benson (2016) frames cognitive biases in terms of four broad tendencies that create a self‐reinforcing cycle (see Exhibit 2.3). To cope with information overload, we filter out most data and take in only what seems important and consistent with our current mind‐set. That gives us an incomplete picture, but we fill in the gaps to make everything fit with our current beliefs. Then, in order to act quickly instead of getting lost in thought, we favor the easy and obvious over the complex or difficult. We then code our experience into memory by discarding specifics and retaining generalities or by using a few specifics to represent a larger whole. This reinforces our current mental models, which then shape how we process experience in the future.
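A toy sketch (my illustration, not Benson's; the signals and numbers are invented) of how the filtering step feeds the cycle: an agent keeps only the signals that fit its current belief, and the censored evidence then strengthens that belief.

```python
# Hypothetical illustration of the self-reinforcing filter: keep only
# belief-consistent signals, then update confidence on what survives.

belief = "our strategy is working"
confidence = 0.6   # starting confidence in the belief (assumed)

# each signal is (description, supports_belief)
signals = [
    ("quarterly sales up", True),
    ("customer churn rising", False),
    ("favorable press coverage", True),
    ("key engineers resigning", False),
]

# Step 1: filter out everything inconsistent with the current mind-set
retained = [desc for desc, supports in signals if supports]

# Step 2: confidence grows because the surviving evidence all "confirms"
confidence = min(1.0, confidence + 0.1 * len(retained))

print(retained)     # ['quarterly sales up', 'favorable press coverage']
print(confidence)   # 0.8 -- a stronger belief, built on a censored sample
```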
Exhibit 2.3. Cognitive Biases.
| Cognitive Challenge | Solution | Risk |
|---|---|---|
| Too much data to process | Filter out everything except what we see as important and consistent with our current beliefs | Miss things that are important or could help us learn |
| Tough to make sense of a confusing, ambiguous world | Fill in gaps, make things fit with our existing stories and mental models | Create and perpetuate false beliefs and narratives |