The Little Black Book of Decision Making – Michael Nicholas

The U.S. House Committee on Science and Technology agreed with the Rogers Commission on the technical causes of the accident, but was more specific about the contributing causes:

       The Committee feels that the underlying problem which led to the Challenger accident was not poor communication or underlying procedures as implied by the Rogers Commission conclusion. Rather, the fundamental problem was poor technical decision-making over a period of several years by top NASA and contractor personnel, who failed to act decisively to solve the increasingly serious anomalies in the Solid Rocket Booster joints. 6

      The Problem with Hindsight

      In examining the events leading up to the Challenger accident, it would be completely understandable to have the urge to scratch your head and wonder how so many obviously intelligent people (we are talking about rocket science, after all) could have displayed such apparent ineptitude. How did NASA, an organisation that places such importance on safety, end up so flagrantly violating its own rules and appearing to have so little regard for human life?

      “Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.”

      – Daniel Kahneman, Nobel Prize-winning Professor of Psychology and international best-selling author on judgment and decision making

      When a decision has gone badly, the benefit of hindsight often makes the correct decision look as though it should have been blindingly obvious. But once you are aware of this bias, you'll see it everywhere – from the immediate aftermath of the horrendous terrorist atrocities in Paris in November 2015, where the press began questioning how intelligence services had failed to anticipate the attacks as soon as the “facts” leading up to them began to emerge, to football supporters who believe they have far greater expertise at picking the team than the manager, to the times when we second-guess our own decisions: “I should have known not to take that job”, “I knew the housing market would collapse/go up”, “I should have known that he was being unfaithful to me”, “I knew that if I trusted her she'd hurt me”, “I should have listened to my intuition”, and on it goes …

      This “hindsight bias” refers to the tendency for an uncertain outcome to seem far more likely once we know that it has occurred. Because of it, we are prone to view what has already happened as relatively inevitable and obvious, not realising how the information about the outcome has affected us.

      One of the first psychologists to investigate hindsight bias was Baruch Fischhoff who, together with Ruth Beyth, used President Richard Nixon's historically important 1972 diplomatic visits to China and Russia as the focus for a study. Before the visits took place, participants were asked to assign probabilities to 15 possible outcomes, such as whether the U.S. would establish a diplomatic mission in Peking or establish a joint space programme with Russia. Two weeks to six months after the visits had taken place, the same people were asked to recall what their earlier predictions had been. The results were clear. The majority of participants inflated their estimates for the outcomes that had occurred while remembering having assigned lower probabilities to those that had not. This bias also became stronger as the time between the initial prediction and the recall task increased. Many other events that captured public attention have since been studied, with similar results.

      The heart of the problem seems to be that once we adopt a new understanding of the world, we immediately find it difficult to reconstruct past beliefs with any accuracy. This inevitably causes us to underestimate our own level of surprise at past events and, on the flip side of the coin, explains why it is so easy to be surprised when others overlook the obvious, as NASA did in the run-up to the Challenger accident.

      Hindsight, because it is always 20/20, ensures that we feel on safe ground when criticising others’ irrationality or lack of foresight, while simultaneously reducing our ability to evaluate past decisions objectively (our own or those of others). It can have an extremely detrimental impact on both decision making and decision makers:

      • Decisions that don't work out are often punished, because the factors that were outside the decision maker's control are difficult to recognise after the event.

      • If decision makers come to expect that their decisions will be scrutinised with hindsight, they are much more likely to seek risk-averse and bureaucratic solutions.

      • Irresponsible risk seekers can be undeservedly rewarded when their decisions work out because it is hard to recognise their gamble, so they don't get punished for taking too much risk. Meanwhile, anyone who doubted them may get branded as conventional, over-cautious, or plain weak.

      • Perhaps most importantly, hindsight severely reduces our ability to learn from past decisions. We'll look at why this is so important in the next couple of chapters.

      We are all susceptible to hindsight bias, but it can be very difficult to recognise what is happening.

      Running on Instinct

      Psychologists use the term heuristics to describe the unconscious mental shortcuts that we take to arrive at judgments or solve problems. To date, dozens of them have been identified, hindsight bias being just one example. When we are faced with difficult questions, high complexity or ambiguity, or a need for speed, heuristics can help us to find answers or solutions that would otherwise be beyond conscious reach. However, because they evolved to help us cope with an evolutionary past in which we were living on the plains, hunting and gathering, they are imperfect, and the biases they introduce can lead to terrible mistakes.

      Mental shortcuts can even lead to inappropriate biases in life or death situations, as demonstrated by a study by Amos Tversky which looked at how the way that data is presented can affect doctors’ choices. All of the participants received the same data on the effectiveness of two interventions for lung cancer: surgery and radiation treatment. It indicated that radiation offered a much better chance of survival in the short term, but a lower life expectancy over the next few years.

      For half of the participants the data was presented in relation to survival rates, whilst for the others it was provided in terms of death rates; for example, the statistics for the surgical treatment of 100 patients were as follows:

      Clearly, from a mathematical/logical point of view, the two columns of data are exactly the same, yet 82% of the doctors presented with the survival data recommended surgery versus only 56% of those who were given the opposite perspective. Studies like this demonstrate the enormous influence that heuristics can have on our decision making; in particular, how difficult it is for us to divorce decisions from their emotional components.
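      To see why the two presentations are informationally identical, note that each mortality figure is simply 100 minus the corresponding survival figure. The short sketch below makes that equivalence explicit; the numbers in it are purely hypothetical placeholders, not the figures used in the study.

```python
# A minimal sketch (not from the book) of the framing idea described above:
# the same outcomes can be stated as survival rates or as mortality rates.
# The numbers are hypothetical placeholders, not the study's actual data.

# Hypothetical outcomes for 100 patients at three points in time.
survival_frame = {
    "immediately after treatment": 90,  # patients alive out of 100
    "after one year": 70,
    "after five years": 35,
}

# The mortality frame carries exactly the same information: deaths = 100 - survivors.
mortality_frame = {period: 100 - alive for period, alive in survival_frame.items()}

for period, alive in survival_frame.items():
    print(f"{period}: {alive} survive / {mortality_frame[period]} die per 100 patients")

# The two framings are logically equivalent, yet presenting one or the other
# shifted the doctors' recommendations markedly in the study described above.
```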

      Heuristics can be considered to be much like instincts. Animal instincts are easy to recognise; indeed, we assume that this is how animals do pretty much everything. As human beings, however, we generally prefer to think of ourselves as rational. We like to hang on to the evidence of our conscious experience, which suggests that our experience of the world is “accurate” and that we form beliefs and opinions based on the facts of the situation. Social psychologist Lee Ross called this conviction “naïve realism” – the belief that we have the ability to experience events as they are. It enables us to justify any opinion as reasonable, because if it wasn't we wouldn't hold it! Sounds great, doesn't it? And it is completely wrong. The logic of this kind of thinking does not bear scrutiny, but that's okay because it's an easy choice not to investigate …




6. U.S. House Committee on Science and Technology (29 October 1986), “Investigation of the Challenger Accident: Report of the Committee on Science and Technology, House of Representatives.”