Thinking, Fast and Slow

Thinking, Fast and Slow is a book by the eminent psychologist Daniel Kahneman that presents his view of how the mind works. It draws on recent developments in cognitive and social psychology, and includes as an appendix his and Amos Tversky's article "Judgment under Uncertainty: Heuristics and Biases"; for his part in this research on judgement and decision-making under uncertainty, which also produced prospect theory, he was awarded the Nobel Memorial Prize in Economic Sciences. The word fast in the title refers to "system 1" thinking, which operates automatically, with little or no effort and no sense of voluntary control. The word slow refers to "system 2": mental activities that require concentration, effort and self-control. The book examines the evidence concerning circumstances under which system 1 supplies false information to system 2. Its style is narrative rather than didactic.

Part I. Two systems
Part I presents the basic elements of Daniel Kahneman's two-systems approach to judgement and choice. Its purpose is to introduce a vocabulary for thinking about the mind.

The cognitive effort and self-control of system 2 are shown to draw upon a limited resource of "mental energy", and even to deplete the glucose in the blood. The concept of "cognitive strain" is introduced as a response to effort and unmet demands; its absence is termed "cognitive ease". Cognitive ease is shown to be both a cause and a consequence of pleasant feelings: when in a good mood, people become more intuitive and more creative, but also less vigilant and more prone to logical errors.

System 1 is seen as conserving mental energy while maintaining and updating a model of its possessor's personal world by forming associations between regularly occurring events and outcomes. It operates on the assumption that "what you see is all there is" (WYSIATI), constructing the best story it can from the information that is available and making no allowance for the existence of information that it does not have. When information is scarce - which it often is - it acts as a "machine for jumping to conclusions", putting together a coherent story without reservations about the quality or quantity of the information on which it is based. Much of the time the coherent story it creates is close enough to reality to provide a reasonable basis for action, but its dependence upon WYSIATI can lead to a wide variety of errors of judgement and choice.

Part II. Heuristics and biases
Part II explores some of the ways in which judgements and choices can be distorted by interactions between system 1 and system 2. The distortions are attributed either to system 2's "laziness" in resorting to an uncritical dependence upon system 1, or to its "ignorance" in being unaware of the shortcomings of system 1.

The deliberate use of heuristics to get rough-and-ready answers to difficult questions is a well-known system 2 strategy. An example, suggested by the eminent mathematician George Pólya, is the substitution of an easier question (for example, responding to the question "Who will win next year's presidential election?" by giving the answer to the question "Who has been doing best in this year's polls?"). As an example of the involuntary (system 1) use of a similar strategy, Kahneman cites Paul Slovic's "affect heuristic", in which people let their likes and dislikes determine their beliefs about the world, as well as his own research (with Amos Tversky) on the "anchoring" and "availability" heuristics.

The concluding chapters of Part II are concerned with the problems of statistical inference. Fresh light is thrown on the finding that, although people can make useful intuitive judgements of such matters as time and distance, their judgements of probability are almost invariably wrong. Nevertheless, such judgements are often accepted with confidence, and system 1 is usually willing to predict rare events from the weakest of evidence.
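A central example of such errors, discussed in these chapters, is base-rate neglect: intuitive judgements give too little weight to prior probabilities. As a minimal illustration (the numbers below are invented for exposition, not taken from the book), Bayes' rule shows how even fairly diagnostic evidence for a rare event should be tempered by its base rate:

\[
P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B \mid A)\,P(A) + P(B \mid \neg A)\,P(\neg A)}
\]

With a base rate $P(A) = 0.01$, a hit rate $P(B \mid A) = 0.8$ and a false-alarm rate $P(B \mid \neg A) = 0.1$, the posterior probability is

\[
P(A \mid B) = \frac{0.8 \times 0.01}{0.8 \times 0.01 + 0.1 \times 0.99} = \frac{0.008}{0.107} \approx 0.07,
\]

so the rare event remains improbable. System 1's confident prediction of rare events from weak evidence amounts to ignoring the base-rate term in the denominator.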

Part III. Overconfidence
Part III explores some of the forms of overconfidence that arise from the inability of the WYSIATI approach to allow for information that it does not have. An "illusion of understanding" occurs when the mind bases its understanding of the world upon a plausible "story" that system 1 has created from the information available to it. It is an illusion because the story fails to take account of things that did not happen.