We all might be a little wiser if we read Daniel Kahneman’s study of how decisions are made …
Penguin Books, May 2012
Paperback, pp 512, £8.99
Among the most enjoyable features of Daniel Kahneman’s best-selling Thinking, Fast and Slow are the examples which tend to undermine the reputations of those whose judgement the general public are frequently urged to admire. It is rather gratifying to find that their vaunted decision-making skills often turn out to be illusory. For instance, when Kahneman carried out a statistical analysis of investment outcomes for twenty-five wealth advisers over eight years, he was surprised to find a zero correlation between the yearly performances. In other words, there was no statistical evidence for the existence of any real and persistent difference in skill between the advisers. Instead, the results resembled what would have been obtained by a similar analysis of scores in a dice-rolling contest – where, of course, everything depends on luck.
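Kahneman’s dice-rolling analogy is easy to demonstrate. The sketch below is a minimal simulation with made-up numbers (twenty-five notional “advisers” whose yearly score is the sum of twenty dice rolls), not the firm’s actual data: when results are pure luck, the correlation between one year’s ranking and the next hovers around zero.

```python
import random

random.seed(0)  # fixed seed so the illustration is repeatable

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# One "year" of results for 25 advisers whose score is just the sum of
# twenty dice rolls -- pure luck, with no underlying skill at all.
def yearly_scores(n_advisers=25):
    return [sum(random.randint(1, 6) for _ in range(20))
            for _ in range(n_advisers)]

year1, year2 = yearly_scores(), yearly_scores()
r = pearson(year1, year2)
print(f"correlation between two luck-only years: {r:+.2f}")
```

With only twenty-five data points the sample correlation will not be exactly zero, but rerunning the draw shows it wandering around zero with no persistent pattern – exactly what Kahneman found in the real performance figures.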
An even more piquant corollary is that when Kahneman presented these findings to the company which provided the data he sensed they were falling on deaf ears. The management’s prevailing a priori belief in the individual skill of the wealth advisers seemed not to be shaken by a piece of mere statistical evidence; and the next set of annual bonuses was almost certainly allocated (just as in previous years) to reward luck as if it were skill. A similarly entrenched faith in the existence of financial talent that must be cherished at all costs surely lies behind warnings that new regulations or higher taxes might upset certain business “stars” and cause them to take their skills elsewhere. Such exaggerated respect is maintained (in some quarters) in spite of the evidence of numerous poor judgements leading up to the financial crash that began in 2007-8.
A second example of this human tendency to make, and stick to, decisions in defiance of relevant facts is provided by a study of railway development projects across the world between 1969 and 1998. The initial business cases for such projects have consistently overestimated the numbers of passengers who will use the new rail systems, and hence economic benefits have usually been non-existent or far smaller than forecast. The key observation is that governments and developers appear to have learned nothing from anyone else’s experience even though a steadily growing body of evidence of the dangers of over-optimism has been widely available. Over recent decades in Great Britain it seems pretty clear that a similar sort of wilful blindness to past experience has also beset a number of public-sector IT and data-management projects which have ended in costly failure.
Thinking, Fast and Slow sets out to explain the causes of such all-too-familiar human inadequacy at making decisions, choices and forecasts. It draws on the author’s considerable experience and expertise as both an academic psychologist and a Nobel Laureate in economics. Reports of well-conducted laboratory experiments are used to illustrate how people actually make choices – and this often conflicts with the assumptions about human behaviour and motivation made by some economists and advertisers. The consistency and rationality ascribed to human decision-making are sometimes greater than is truly merited. (This applies to an assessment of our own decisions as well as those of others.) As the above examples show, people can be unreasonably defensive of an existing but erroneous point of view even to the extent of ignoring respectable new evidence. There is an inertia effect that produces inflexibility – particularly in regard to attitudes shared with (and reinforced by) a peer group. Kahneman calls this the What You See is All There Is (WYSIATI) fallacy. Enthusiasm for (and emotional investment in) our own particular endeavour can make us disinclined even to look at similar experience from elsewhere because we already “know” that this project – our project – is different! Kahneman is frank enough to admit that he has himself fallen into this particular trap.
Much of the book is concerned with studying how people make decisions involving risk. It is, for instance, a pillar of certain economic models that people will behave rationally in order to maximize the (financial) value of the expected outcome of their decisions. This anticipated cash payback is supposed to be the sole determinant governing a gamble or investment decision. But such a theory ignores something that most of us recognize intuitively – namely that willingness to take a risk depends asymmetrically on whether the result can be perceived as a win or a loss. We will usually accept longer odds on a chance of avoiding a loss than we will on obtaining a gain. Here is one of Kahneman’s examples. Consider the following two invitations:
(1) You have been given £1000 and now have to choose between (a) getting an extra £500 for certain or (b) tossing a coin to get an extra £1000 for ‘heads’ or nothing for ‘tails’.
(2) You have been given £2000 and now have to choose between (a) losing £500 of it for certain or (b) tossing a coin to lose nothing for ‘heads’ or lose £1000 for ‘tails’.
It is easy to see that the cash outcomes 1(a) and 2(a) are identical (a net gain of £1500), as are the outcomes 1(b) and 2(b) (a net gain of either £2000 or £1000, depending on ‘heads’ or ‘tails’). Therefore we would expect a financially rational person to respond in an identical way to both invitations. However, when invitation (1) is put to a group of people the majority will choose to accept (a) – the ‘sure thing’; but when faced with invitation (2) the majority prefer the gamble, option (b). This shows that people do not simply look at the end result when choosing how to act: the starting conditions under which a choice is made act as a reference point which gives an emotional as well as a financial perspective to the outcome.
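The arithmetic can be checked in a few lines (sketched here in Python, using the review’s own figures): the end states under the two invitations are identical, and only the framing differs.

```python
# Invitation 1: start with £1000; (a) an extra £500 for certain, or
# (b) a coin toss: an extra £1000 for heads, nothing for tails.
inv1_sure = 1000 + 500
inv1_gamble = {"heads": 1000 + 1000, "tails": 1000 + 0}

# Invitation 2: start with £2000; (a) lose £500 for certain, or
# (b) a coin toss: lose nothing for heads, lose £1000 for tails.
inv2_sure = 2000 - 500
inv2_gamble = {"heads": 2000 - 0, "tails": 2000 - 1000}

# The cash outcomes are identical in every case.
assert inv1_sure == inv2_sure == 1500
assert inv1_gamble == inv2_gamble == {"heads": 2000, "tails": 1000}
```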
There are other ways in which preference can be influenced by how a choice is presented – the so-called “framing effect”. An example (that Kahneman says is his favourite) involves the amount of tax relief which is available on a per-child basis for all families regardless of family income. When students were asked the question “Should the amount of relief per child be larger for the rich than for the poor?” the overwhelming majority (unsurprisingly) said a firm “no”. However this question is “framed” by the unquestioned acceptance of how the tax system works in (arbitrarily) setting the reference point as a childless family. If the reference point had been a two-child household then the existing taxation system would have operated as applying a “surcharge” to childless and one-child families. In that case the previous question would have been “Should the childless poor pay as large a surcharge as the childless rich?” When presented with this question the students were just as quick to give a vehement negative answer. But it takes only a little thought to see that one cannot logically reject both proposals since in financial terms they amount to the same thing.
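The equivalence of the two framings can be verified directly. In this hypothetical sketch (the per-child relief and base tax figures are invented for illustration, not taken from any real tax code), the reference point only changes the label – “relief” or “surcharge” – while the net tax every household pays is unchanged.

```python
RELIEF_PER_CHILD = 100   # hypothetical flat relief per child
BASE_TAX = 1000          # hypothetical tax bill before relief

def tax_relief_frame(children):
    # Frame A: the childless family is the reference point, and
    # families with children receive "relief".
    return BASE_TAX - RELIEF_PER_CHILD * children

def tax_surcharge_frame(children):
    # Frame B: the two-child family is the reference point, and
    # smaller families pay a "surcharge" per "missing" child.
    reference_tax = BASE_TAX - RELIEF_PER_CHILD * 2
    surcharge = RELIEF_PER_CHILD * (2 - children)
    return reference_tax + surcharge

# Both frames produce the same bill for every family size.
assert all(tax_relief_frame(k) == tax_surcharge_frame(k)
           for k in range(5))
```

Rejecting larger relief for the rich while also rejecting an equal surcharge on the childless poor is therefore rejecting the same arithmetic twice over, once under each label.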
These few selected illustrations from the book show that understanding how human beings make choices and predictions is a complex matter. Kahneman argues that it is helpful to imagine that there are two “systems” underlying our decision-making. System 1 is somehow associated with our “old” brain and it does the “fast” thinking, attaching much importance to emotional and physical signals. Our System 2 “slow” thinking (which possibly resides in more developed areas of the brain) is supposed to take a more calculating and analytical view of the decisions thrown up by System 1. Unfortunately, System 2 appears to be rather lazy and is quite likely to let System 1’s snap judgements go unchallenged more often than it should.
In many ways System 1 is very good at what it does – which is just as well because we cannot “turn it off”! It guides all the unconscious and almost-unconscious rapid decisions that enable us to walk down the street without bumping into obstacles and to react appropriately in routine encounters with other people – e.g. will I get across the road before that car comes? should I believe this salesman? can I trust that stranger? But System 1 has its limitations. We have already seen that it can be emotionally attached to the status quo and hence unmotivated to look for evidence beyond what is immediately to hand. System 1 is also more inclined to look at single moments within an experience rather than considering cumulative effects. Moreover, its objectivity can be easily lost when the way in which a choice is framed touches on some existing prejudice. Thus the conflicting answers to the two questions about child tax relief indicate that the students had a System 1 bias towards favouring the poor over the rich.
Other System 1 weaknesses explored by Kahneman include an ability to invent spurious causes behind consecutive events and a tendency to make a false connection between the truth of a conclusion and the ease with which it is reached. It is also inclined to substitute a similar but simpler question for the difficult one that is really at hand. Thus a System 1 prediction of whether or not something is likely to happen may be little more than a report on how good or bad it might feel if the event did happen.
Clearly we need an effective System 2 to be alert for shortcomings in our System 1 decisions so that we can correct at least some of them. Unfortunately System 2 can quite easily be distracted by basic physical symptoms like tiredness and hunger – something we probably all recognize from our own personal experience. Kahneman does make some suggestions about how we might keep our System 2 in better shape: there is experimental evidence, for instance, that the simple act of frowning when dealing with a problem can improve the vigilance of System 2. But, for me at least, the book is valuable not so much as a manual giving advice about how to make better choices but rather as a fascinating and illuminating catalogue of surprisingly naive human behaviour. It has certainly helped me to understand why so many choices do not turn out the way they were meant to. And it also shows why different people can come to deeply-felt but opposing views about the same issue, perhaps depending on the way in which the issue was first presented to them.
At one level, nothing in the book should come as a surprise since, if we are honest, we will all recognize that we have made mistakes similar to the ones described. At another level, however, the sheer number and variety of shortcomings that beset human reasoning might leave us wondering how it is that any substantial human plans have ever come to fruition!
This is a very readable book, well-written and authoritative. The author does not exempt himself from the failings he describes – embarrassing though they may be! If one or two of the anecdotes and examples seem to duplicate each other, it should be remembered that repetition can also be a factor in poor decision-making – as with the man who could claim that he learned so much from his mistakes that he could be sure of reproducing them perfectly!
In this short review I have so far mentioned only a few of the many topics that Kahneman deals with. Other intriguing ideas which await new readers of Thinking, Fast and Slow include: the notion of “halo effect”; an account of the “Moses illusion”; an introduction to the different points of view of our “experiencing self” and “remembering self”; and an explanation of how “regression to the mean” appears to justify a teaching strategy that favours blame over praise. Most striking of all perhaps is the distinction between “Econs”, the theoretical perfectly rational beings that appear in economic models, and “Humans”, the real-life faulty decision makers Kahneman has shown us to be.
Perhaps some curiosity has now been aroused! If so, it is just as well because the book has a sting in its tail. There are considerable political implications if government (or business) chooses to believe citizens (or customers) are Econs rather than Humans. The Econ is assumed to make choices based on all available evidence and with clear goals in mind. And for many organisations this is a very convenient assumption: since an Econ (unlike a Human) cannot be distracted or misled by the way that a choice is framed, any way of presenting a consumer choice is theoretically as good as another. Thus an organisation can justify putting important information in the small print by saying that an Econ should and would read it. Moreover, since the Econ does not make mistakes(!), there is no obligation to provide any safeguards or compensation for wrong choices. The Econ model of human decision-making thus underpins the more extreme forms of libertarian capitalism; and it is to be hoped that Thinking, Fast and Slow will help to discredit it once and for all.
(c) Michael Bartholomew-Biggs, 2012