The book, written by Rolf Dobelli, presents simple, clear and useful thoughts and ideas to shape the way we think – at work, at home, and in our everyday lives. The book includes examples of cognitive biases, simple errors we all make in our day-to-day thinking. The author claims that, by knowing what they are and how to spot them, we can avoid them and make better choices – whether dealing with a personal problem or a business negotiation.
The ten main 'biases' and decision-making mistakes that I would highlight from this book are:
- Confirmation bias (Part 1) – 'Facts do not cease to exist because they are ignored', said writer Aldous Huxley. However, we do exactly that, as super-investor Warren Buffett knows: 'What the human being is best at doing is interpreting all new information so that their prior conclusions remain intact'. The confirmation bias is alive and well in the business world. Confirmation bias (Part 2) – No professionals suffer more from the confirmation bias than business journalists. Often, they formulate an easy theory, pad it out with two or three pieces of 'evidence' and call it a day. For example: 'Google is so successful because the company nurtures a culture of creativity'. Once this idea is on paper, the journalist corroborates it by mentioning a few other prosperous companies that foster ingenuity. Rarely does the writer seek out disconfirming evidence, which in this instance would be struggling businesses that live and breathe creativity or, conversely, flourishing firms that are utterly uncreative. Both groups have plenty of members, but the journalist simply ignores them. If he or she were to mention even one, the storyline would be ruined… The Internet is particularly fertile ground for the confirmation bias. To stay informed, we browse news sites and blogs, forgetting that our favored pages mirror our existing values, be they liberal, conservative or somewhere in between. Moreover, a lot of sites now tailor content to personal interests and browsing history, causing new and divergent opinions to vanish from the radar altogether. We inevitably land in communities of like-minded people, further reinforcing our convictions – and the confirmation bias.
- Authority bias – Over the past decade, airlines have learned the dangers of the authority bias. In the old days, the captain was king. His commands were not to be doubted. If a co-pilot suspected an oversight, he wouldn’t have dared to address it out of respect for – or fear of – his captain. Since this behavior was discovered, nearly every airline has instituted ‘Crew Resource Management‘ (CRM), which coaches pilots and their crews to discuss any reservations they have openly and quickly. In other words: they carefully deprogramme the authority bias. CRM has contributed more to flight safety in the past twenty years than any technical advances have… Many companies are light years from this sort of foresight. Especially at risk are firms with domineering CEOs, where employees are likely to keep their ‘lesser’ opinions to themselves – much to the detriment of the business.
- Availability bias – The availability bias says this: we create a picture of the world using the examples that most easily come to mind. This is absurd, of course, because in reality things don't happen more frequently just because we can conceive of them more easily… People prefer information that is easy to obtain, be it economic data or recipes. They make decisions based on this information rather than on more relevant but harder-to-obtain information – often with disastrous results… We prefer wrong information to no information. Thus, the availability bias has presented the banks with millions in losses… Fend it off by spending time with people who think differently than you do – people whose experiences and expertise are different from yours. We require others' input to overcome the availability bias.
- Outcome bias – Never judge a decision by its outcome – doing so is precisely the outcome bias, especially when randomness or 'external factors' play a role. A bad result does not automatically indicate a bad decision, and vice versa. So rather than tearing your hair out about a wrong decision, or applauding yourself for one that may have only coincidentally led to success, remember why you chose what you did. Were your reasons rational and understandable? Then you would do well to stick with that method, even if you didn't strike it lucky last time.
- Forecast illusion – 'There are two kinds of forecasters: those who don't know, and those who don't know they don't know', wrote Harvard economist John Kenneth Galbraith… The problem is that experts enjoy free rein with few negative consequences. If they strike it lucky, they enjoy publicity, consultancy offers and publication deals. If they are completely off the mark, they face no penalties – either in terms of financial compensation or in loss of reputation. This win-win scenario virtually incentivises them to churn out as many prophecies as they can muster. Indeed, the more forecasts they generate, the more will be coincidentally correct… So what is predictable and what is not?... Be critical when you encounter predictions. Whenever I hear one, I make sure to smile, no matter how bleak it is. Then I ask myself two questions. First, what incentives does the expert have? If he is an employee, could he lose his job if he is wrong? Or is he a self-appointed guru who earns a living through books and lectures? The latter type of forecaster relies on the media's attention so, predictably, his prophecies tend to be sensational. Second, how good is his success rate? How many predictions has he made over the past five years? Out of these, how many have been right and how many have not? This information is vital yet often goes unreported. The media should not publish any forecast without giving the pundit's track record.
- Action bias – A study by Israeli researcher Michael Bar-Eli showed that, in a penalty situation in soccer, the ball takes less than 0.3 seconds to travel from the player who kicks it to the goal. There is not enough time for the goalkeeper to watch the ball's trajectory; he must make a decision before the ball is kicked. Players who take penalty kicks shoot one third of the time at the middle of the goal, one third of the time at the left and one third of the time at the right. Surely goalkeepers have spotted this, but what do they do? They dive either to the left or to the right. Rarely do they stay in the middle – even though a third of all balls land there. Why? Because diving looks more impressive, and feels less embarrassing, than standing rooted to the spot while the ball sails past. That is the action bias: in unclear situations we feel compelled to do something – anything – whether or not it helps.
- Omission bias – Suppose you are the head of the Food and Drug Administration. You must decide whether or not to approve a drug for the terminally ill. The pills can have fatal side effects: they kill 20% of patients on the spot, but save the lives of the other 80% within a short period of time. What do you decide? Most would withhold approval. To them, waving through a drug that takes out every fifth person is a worse act than failing to administer the cure to the other 80% of patients. It is an absurd decision, and a perfect example of the omission bias. Suppose that you are aware of the bias and decide to approve the drug in the name of reason and decency. Bravo. But what happens when the first patient dies? A media storm ensues, and soon you find yourself out of a job. As a civil servant or politician, you would do well to take the ubiquitous omission bias seriously – and even foster it… Such bias is very difficult to detect – after all, action is more noticeable than inaction. In the 1960s, student movements coined a punchy slogan to condemn it: 'If you are not part of the solution, you are part of the problem'.
- Twaddle tendency – Here, reams of words are used to disguise intellectual laziness, stupidity, or underdeveloped ideas. Sometimes it works, sometimes not… The twaddle tendency is especially rife in sport. Breathless interviewers push equally breathless football players to break down the components of the game, when all they want to say is: "We lost the game – it's really that simple." But the presenter has to fill airtime somehow – and seemingly the best method is by jabbering away and by compelling the athletes and coaches to join in. Jabber disguises ignorance… In conclusion: verbal expression is the mirror of the mind. Clear thoughts become clear statements, whereas ambiguous ideas transform into vacant ramblings. The trouble is that, in many cases, we lack very lucid thoughts. The world is complicated, and it takes a great deal of mental effort to understand even one facet of the whole. Until you experience such an epiphany, it's better to heed Mark Twain: "If you have nothing to say, say nothing." Simplicity is the zenith of a long, arduous journey, not the starting point.
- Information bias – In his short story 'Del rigor en la ciencia' ('On Exactitude in Science'), which consists of just a single paragraph, Jorge Luis Borges describes a special country. In this country, the science of cartography is so sophisticated that only the most detailed of maps will do – that is, a map of scale 1:1, as large as the country itself. Its citizens soon realize that such a map does not provide any insight, since it merely duplicates what they already know. Borges' map is an extreme case of the information bias, the delusion that more information guarantees better decisions… Forget trying to amass all the data. Do your best to get by with the bare facts. It will help you make better decisions. Superfluous knowledge is worthless, whether you know it or not. Daniel J. Boorstin put it well: "The greatest obstacle to discovery is not ignorance – it is the illusion of knowledge." So next time you are confronted by a rival, consider killing him – not with kindness but with reams of data and analysis.
- Why people prefer novels to statistics (Personification) – In a particular experiment, psychologist Paul Slovic asked people for donations. One group was shown a photo of Rokia from Malawi, an emaciated child with pleading eyes. Afterward, people donated an average of $2.83 to the charity (out of $5 they were given to fill out a short survey). The second group was shown statistics about the famine in Malawi, including the fact that more than three million malnourished children were affected. The average donation dropped 50%. This is illogical: you would think that people’s generosity would grow if they knew the extent of the disaster. But we do not function like that. Statistics don’t stir us: people do… So be careful when you encounter human stories. Ask for the facts and the statistical distribution behind them. You can still be moved by the story, but this way, you can put it into the right context. If, however, you seek to move and motivate people for your own ends, make sure your tale is seasoned with names and faces.
In the epilogue, the author recounts a question that the Pope asked Michelangelo: 'Tell me the secret of your genius. How have you created the statue of David, the masterpiece of all masterpieces?' Michelangelo's answer: 'It's simple. I removed everything that is not David.'
We do not know for sure what makes us successful. We can’t pinpoint exactly what makes us happy. But we know with certainty what destroys success or happiness. This realization, as simple as it is, is fundamental: Negative knowledge (what not to do) is much more potent than positive knowledge (what to do).
Thinking more clearly and acting more shrewdly means adopting Michelangelo’s method: don’t focus on David. Instead, focus on everything that is not David and chisel it away. In our case: eliminate all errors and better thinking will follow.
The Greeks, Romans, and medieval thinkers had a term for this approach: via negativa. Literally, the negative path – the path of renunciation, of exclusion, of reduction. Theologians were the first to tread the via negativa: we cannot say what God is; we can only say what God is not. Applied to the present day: we cannot say what brings us success. We can pin down only what blocks or obliterates success. Eliminate the downside, the thinking errors, and the upside will take care of itself. This is all we need to know.