Author: Stuart Sutherland
This book is a critical evaluation of why humans so frequently make irrational decisions. It reviews the wide spectrum of settings, both mundane and professional, in which illogical thinking thrives, the author noting that ‘…the decisions of doctors, generals, engineers, judges, businessmen and others are no more rational than those made by you or me…’ (page vii). The author defines irrationality as ‘… any thought process that leads to a conclusion or a decision that is not the best that could have been reached’ (page 7).
The book’s main theme could be summarised as ‘the failure of intuition’, as the author focuses on the biases that distort human decision-making. As an example, he illustrates the consequences of the availability bias with the large number of erroneous diagnoses doctors make under the influence of the most recent conditions they have seen (page 17). Illustrating the dangers of overconfidence in professionals, he says ‘…doctors, engineers, financial advisers and others have an unwarranted confidence in their judgments’ (page 176). He shows the consequences of faulty risk assessment with accounts of the Three Mile Island nuclear reactor disaster (page 179) and the sinking of the Herald of Free Enterprise (page 182). He demonstrates the widespread influence of the halo effect on the outcome of job interviews and other forms of assessment (page 22). The book also explores in detail other important generic biases such as representativeness and anchoring.
The book also focuses on biases that stem from faulty statistical reasoning. These have particular relevance to professionals, who often draw wrong correlative or causal associations. The author gives examples of irrational thinking in medical research which he says explain why ‘… so many false theories flourish in medicine’ (page 134). He blames doctors’ poor understanding of conditional probability for frequent breast cancer misdiagnoses following mammograms (page 124). The book also reviews other statistics-based biases at length, including base-rate neglect and regression to the mean.
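The conditional-probability point can be made concrete with a short Bayes’ theorem sketch. The figures below are illustrative screening numbers of the kind commonly used in the heuristics-and-biases literature, not taken from the book; they show why, when the base rate of a disease is low, even a fairly accurate test yields mostly false positives.

```python
def positive_predictive_value(prevalence, sensitivity, false_positive_rate):
    """P(disease | positive test) computed via Bayes' theorem."""
    true_positives = sensitivity * prevalence
    false_positives = false_positive_rate * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# Hypothetical figures for illustration only: 1% prevalence,
# 80% sensitivity, 9.6% false-positive rate.
ppv = positive_predictive_value(0.01, 0.80, 0.096)
print(f"P(cancer | positive mammogram) = {ppv:.1%}")  # roughly 8%, not 80%
```

Neglecting the 99% of women without the disease, among whom nearly all positives arise, is exactly the base-rate neglect the book describes.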
The foundations of the book are the research works of key decision-making psychologists. A prominent example is Solomon Asch and his work on the primacy error, which explains the irrational influence of first impressions on decision-making (page 18). Asch also researched conformity and how it sways opinions in favour of the majority view. The author cites the work of Stanley Milgram on blind obedience and how it explains mass atrocities such as those committed by the Nazi regime in the Second World War. He also refers to Irving Janis’s work on groupthink and the corporate adoption of extreme decisions, illustrated by John F. Kennedy’s decision to launch the disastrous Bay of Pigs invasion of Cuba (page 46) and Lyndon B. Johnson’s decision to escalate the Vietnam War (page 47).
The author uses other war-related and medical stories to illustrate confirmation bias, or ‘distorting the evidence’. He says this phenomenon was first recognised by the philosopher Francis Bacon, who described it as the ‘…tendency to distort evidence that is inconsistent with one’s own beliefs’ (page 105). The author attributes several accidents and disasters to confirmation bias: the outcome of the Japanese attack on Pearl Harbor, for example, was probably the result of the Pacific Fleet commander’s irrational attachment to a previous viewpoint. Doctors show the same inflexibility when they cling to a first diagnosis despite the emergence of contradictory evidence (page 97).
The book also looks at the factors that exacerbate irrational decision-making. The most important are strong emotions and stress, which ‘… reduce flexibility of thinking and lead to irrational behavior’ (page 89). Less dramatic emotions may also be relevant, as the author illustrates with boredom, a factor he argues was responsible for accidents such as the Chernobyl nuclear reactor disaster (page 92).
The author gives tips throughout the book on how to avoid irrational thinking, listed as ‘morals’ at the end of each chapter. He advises, for instance, ‘…don’t take important decisions when under stress or strong emotion’ (page 93) and ‘if you are forming a committee, ensure that different viewpoints are represented’ (page 54). His overall argument is that people should base decisions on actuarial prediction rather than human intuition (page 197).
This is a relatively old book, written before the concepts of heuristics and biases became widely known, but age has not lessened its relevance. It is well written and the chapters are appropriately named. Some of the statistical concepts were difficult to grasp, and the chapter on ‘Utility’ was too academic and philosophical for the general reader. My main criticism, however, is the author’s over-critical and sometimes condescending attitude to intuitive decision-making; this is difficult to justify today in the light of works such as Gerd Gigerenzer’s Gut Feelings and Gary Klein’s Sources of Power. The book is nevertheless an accurate representation of most human decision-making.
This is an extensive review of an important subject and the author, a psychology professor, brought his knowledge to bear on it. His humorous approach and anecdotes made this a thoroughly enjoyable read. The book addresses significant decision-making errors which are relevant to doctors and I recommend it.
- Publisher, place, date: Pinter and Martin, London, 2007
- Number of chapters: 23
- Pages: 258
- ISBN: 978-1-905177-07-3
- Price: £21.19
- Star rating: 4