Title: The Black Swan Problem
Author: Håkan Jankensgård
Publisher: John Wiley & Sons Limited
Genre: Securities, investments
ISBN: 9781119868163
Confirmation bias
This is one of the leading causes of Swan‐blindness discussed in The Black Swan, where Taleb refers to confirmation as ‘a dangerous error’ (Taleb, 2007, p. 51). It has to do with the general tendency to adopt a theory or idea and then start to look for evidence that corroborates it. When we suffer from this bias, all the incoming data seems, as if by magic, to confirm that the belief we hold is correct; that the theory we are so fond of is indeed true. Whatever instances contradict the theory are brushed aside, ignored, or re‐interpreted (tweaked) in a way that supports our pre‐existing beliefs. Out the window goes Karl Popper's idea of falsification, the true marker of science and open inquiry. Using falsification as a criterion, a theory is discarded once the evidence contradicting it becomes undeniable. In the specific context of managing risks, the confirmation bias is a problem because we will be too prone to interpret incoming observations of stability as suggesting that the future will be similarly benign.
The optimistic bias
Research has shown that humans tend to view the world as more benign than it really is. Consequently, in a decision‐making situation, people tend to produce plans and forecasts that are unrealistically close to a best‐case scenario.14 The evidence shows that this is a bias with major consequences for risk taking. In the words of Professor Daniel Kahneman (2011): ‘The evidence suggests that an optimistic bias plays a role – sometimes the dominant role – whenever individuals or institutions voluntarily take on significant risks. More often than not, risk takers underestimate the odds they face, and do not invest sufficient effort to find out what they are.’15 Pondering extreme and possibly calamitous outcomes will clearly not be a priority for an individual with an optimistic bent. Taking a consistently rosy view distorts expectations and therefore invites the Black Swan.
The myopia bias
Myopia, in the literature on the psychology of judgement, refers to the tendency to focus more on short‐term consequences than long‐term implications. Because of our desire for instant gratification, we tend to place much less weight on future gains and losses relative to those in the near term. Professors Meyer and Kunreuther call this the most ‘crippling’ of all biases, resulting in gross underpreparedness for disasters that could have been mitigated with relatively simple measures.16 This was the case, for example, with the tsunami in the Indian Ocean in 2004. Only a few years prior, in Thailand, relatively inexpensive mitigation measures had been discussed – and dismissed. Why? Among other reasons, there was a worry that such measures might cause unnecessary alarm among tourists. Minuscule short‐term benefits won out over preparing for an event with colossal consequences.
The overconfidence bias
Humans are prone to overrate their own abilities and the level of control they have over a situation. The typical way of exemplifying this tendency is to point to the fact that nearly everyone considers himself an above‐average driver. Taleb prefers the more humorous example of how most French people rate themselves well above the rest in terms of the art of love‐making (Taleb, 2007, p. 153). As for the effect of overconfidence on decision‐making, it is profound – and not in a favourable way. Professor Scott Plous (1993) argues that a large number of catastrophic events, such as the Chernobyl nuclear accident and the Space Shuttle Challenger explosion, can be traced to overconfidence. He offers the following summary: ‘No problem […] in decision‐making is more prevalent and more potentially catastrophic than overconfidence.’17 Overconfidence has been used to explain a wide range of observed phenomena, such as entrepreneurial market entry and trading in financial markets, despite available data suggesting high failure rates.
Considering the above, one is inclined to agree with Taleb when he remarks that ‘… it is as if we have the wrong user's manual’ (Taleb, 2007, prologue xxii) for navigating successfully in a world of wild uncertainty. We crave simple but coherent narratives. We value elegant theories and become committed to them. We think we are special and that the world around us is benign. We are equipped with a mind that was created for an existence with far fewer variables and more direct cause‐and‐effect mechanisms. Reflecting deeply about interconnected systems was not key to survival in our evolutionary past. In a somewhat shocking passage, Taleb says that ‘our minds do not seem made to think and introspect’ because, historically speaking, it has been ‘a great waste of energy’ (ibid.).
In fact, information, which potentially helps us rise above sucker status, is costly to acquire and process. Imagine that I bring up the possibility of nuclear terror affecting a major US city. Such a scenario involves hundreds of thousands of dead and an upheaval of life as we know it, before even considering what the countermeasures might be. Any firm with operations in the US is likely to be greatly affected by such a calamity. Now what is your gut reaction to this proposed topic of conversation? In all likelihood, your knee‐jerk reaction is to try to shut it down immediately. The sheer unpleasantness of the topic makes us not want to go there, even for a brief moment. It is too much to take in, and frankly too boring, so, to save ourselves the mental energy, we are perfectly willing to resort to the handy tactic of denial.
Extreme and abstract possibilities, remote from everyday practicalities, are not inspiring enough to energize us. They are out of sight and therefore out of mind. We are unable to maintain a focus on them for long enough. Our thoughts gravitate towards something more tangible, some action that yields a more gratifying sense of accomplishment here and now. It often takes a herculean effort to process remote possibilities, and we are rarely in the mood for it. They are therefore not necessarily ‘unknown unknowns’; rather, they can be thought of as ‘unknown knowables’. The term is meant to convey that it is within our reach to form an understanding of the possibility and most of its consequences, but we fail to do so out of laziness or disinterest. That makes it, for practical purposes, a Black Swan, on a par with the unknown unknowns. At least to some, that is, because others might be prepared to take up the challenge.
THE RELATIVITY OF BLACK SWANS
Earlier in this chapter, we noted that the popular view of Black Swans is that they strike quickly and unexpectedly. Yet nothing in the Black Swan framework says the event has to be sudden, or even happen within a reasonably short time period, like a few months. In fact, many of the examples discussed in Taleb's book are episodes that may seem like distinct and well‐delineated events in a history book, but were prolonged affairs with a long lead‐up. World Wars I and II are both in this category. The rise of Christianity is mentioned as another Black Swan event. A dominant Christianity would no doubt have appeared an absurd proposition to someone living around the time of the birth of Jesus. Its consequences were certainly immense, so it meets this criterion too. It also took centuries to gain a foothold and start making its impact felt. The rise of the internet and social media were mentioned earlier as examples of technology‐driven Black Swans. They too emerged gradually over many years, infiltrating our lives one small step after another. Therefore, from the viewpoint of a decision‐maker in the real world (which is the perspective that Taleb urges us to take), they were not instantaneous.
The fact that monumental changes can take a long time in gestation adds to the relativity of Black Swans. Those who are less wedded to specific ideas, and more open to rewriting the story they tell themselves, come around to change more quickly. This introduces a strategic dimension to Black Swans, massive agents of change as they are. The observation to make is that when others refuse, or are unable, to see a changing reality, the value of being a non‐conformist increases. The sucker status of those with whom you interact competitively is a variable of interest, a theme we will come back to many times in this book.
Apart from the biases that shape our thinking, the relativity of Black Swans is also a matter of information and knowledge in