Judgment Calls, by Thomas H. Davenport
Of course, as an evolving paradigm, organizational judgment can be found in various, often incomplete forms; it would be foolish to expect any single organization to have all of these ideas complete, perfect, and implemented at a 100-percent level. Our stories show the outlines that are emerging—and you and your own organization can fill in the details as you consider how decision making in your own work might benefit from the ideas of this new approach.

      Why You Should Read This Book

      If you think that you have a “golden gut,” that you always make excellent decisions on your own, that you are the only person whose opinions matter within your organization, and that social technologies are purely a waste of time, you probably won't be comfortable with this book—and you should drop it immediately. If you have only read this far, we are confident you can get your money back.

      But if you're still reading, that means you believe in the possibility that other people in your organization just might have expertise or opinions that could help in your decisions, and that evidence and data analysis might be helpful in decisions too. Maybe you'd simply like to get a better understanding of the iterative and deliberative decision processes that successful organizations employ. If you are a senior manager within your organization, with responsibility for making the organization better, you have really come to the right place. You naturally would like to help your firm or agency or school make better decisions over time. We hope to convince you that undertaking activities to improve your organization's overall and collective judgment is the best way to bring that about.

      If you are an individual contributor or educator or consultant, of course you make decisions too—and you can probably benefit from hearing about better ways to make them. You may not have a large organization that you are trying to get into shape, but everybody is a member of a social network (not Facebook, but the social relationships themselves) from whose wisdom you can benefit, and in the age of the Internet, anybody can gather and analyze some data to help with a difficult problem. We believe that knowing about the organizational context of judgment will help more junior or even free-floating individuals improve their ability to make decisions.

      No matter what your employment situation, we think you'll enjoy reading these stories about how organizations are making increasingly good decisions with the new and old tools at their disposal. So we invite you to read through our stories and see where this new world is headed.

      Part One

      Stories About the Participative Problem-Solving Process

      THE FIRST PART OF THE BOOK begins with a few stories of organizations that showcase the first two themes of the new pattern discussed in the introduction: framing decisions as an iterative problem-solving process, and engaging in a more self-consciously participative approach to getting to a good answer. The segment begins with the tale of NASA, and a difficult launch decision its personnel once had to make about the (recently retired) space shuttle. We'll see how NASA—an organization known for some bad decisions with tragic consequences in the past—learned from its mistakes and embraced a better approach to organizational judgment. After that we'll focus on an interesting and innovative small business whose leader turned a home-building business challenge into a value-added problem-solving process. And we'll conclude the segment with a case study of the global consulting firm McKinsey & Company—and a major decision its partners made about acquiring and developing their most precious of assets: their people.

      1

      NASA STS-119

      Should We Launch?

      IN FEBRUARY 2009, the engineers and scientists of NASA were wrestling with a grave, potentially life-or-death decision: whether to green-light the launch of mission STS-119, the next flight of the space shuttle Discovery. Every launch of a NASA manned spacecraft puts astronaut lives and millions of dollars of equipment on the line; reputation, political capital, and scientific standing also ride on a successful launch, but catastrophe can undo them all. For every mission, NASA would like the maximum possible certainty, but there are project pressures against endless debate and analysis to unravel every possible concern. Operational schedules are tightly wound and project milestones are critical. The issue here was whether STS-119 might have a faulty valve in the systems supplying fuel to the engines, integral to maintaining pressure in the all-important hydrogen tank. The previous mission (STS-126) had experienced such a problem, which had happily not affected the success of that flight. But NASA engineers do not bet on good luck—and the risk of possible disaster with this next mission was very real. The piece of equipment in question, no longer manufactured, could not be easily replaced—but it was buttressed by some system redundancy with other valves. Should STS-119 be launched? Could the flight readiness review team get to the right “go or no-go decision” with the appropriate level of confidence?1

      Looming over these critical questions was the history of NASA itself, some fifty years of pioneering scientific triumphs punctuated by a few, but heartbreaking, accidents—where an occasional bad decision led to historic tragedy. How to be sure that this decision didn't become another tragedy? The real story of the launch of STS-119 is not about what finally happened, but about the power of how NASA personnel finally decided what to do—a process of disciplined and iterative decision making, buttressed by a strong but pragmatic culture of inquiry, things NASA developed in the morning-after clarity and learning following some historic and very public errors in judgment.

      Learning from History

      As the whole world knows, the first of those errors in judgment resulted in the fireball in the sky on January 28, 1986, when the space shuttle Challenger exploded in its second minute of flight, killing its entire crew. Despite concerns that cold weather could reduce the effectiveness of the O-ring pressure seals at the joints of the space shuttle's solid rocket motors, NASA managers had approved the launch of Challenger on that day, when the temperature at the Kennedy Space Center was barely above freezing. The spacecraft was destroyed as the failure of an O-ring to seal its joint allowed a jet of hot flame to escape and breach the shuttle's external fuel tank, causing a fatal ignition of the liquid hydrogen and liquid oxygen it contained.

      Like every shuttle launch, the January 1986 mission was preceded by a flight readiness review (an FRR, in NASA acronymspeak), whose purpose was to evaluate issues that might threaten mission success and to withhold launch permission until those issues were resolved. Two weeks earlier, an FRR had certified Challenger ready for flight. Of course, participants in that meeting could not foresee how cold it would be two weeks hence. The day before the launch, NASA personnel became concerned about the weather; the solid rocket motor manager at Marshall Space Flight Center asked Morton Thiokol, the manufacturer of those motors, to review their safety in cold weather. In a series of teleconferences that evening, Thiokol engineers initially recommended against a low-temperature launch. But after their view was challenged by NASA shuttle managers, an offline “caucus” among engineers and managers at Thiokol reversed that recommendation. Challenger lifted off the next morning and was destroyed seventy-three seconds later.

      The presidential commission set up to examine the Challenger disaster found that pressure to maintain the shuttle program's launch schedule led managers to minimize the seriousness of engineers' concerns about the O-rings.2 The perceived need for shuttle “productivity” certainly contributed to the error in judgment. Sociologist Diane Vaughan's detailed study of the launch-approval process in The Challenger Launch Decision offers a fuller and more nuanced explanation. Vaughan points to what she calls “the normalization of deviance” as a key factor. Because earlier cold- or cool-weather flights that suffered O-ring problems did not result in disaster, that initially unexpected damage was gradually accepted as normal. FRR participants had come to view it as an acceptable risk. In other words, the success of nearly two dozen previous missions