The reader might think that being prone to so many different kinds of mistakes naturally teaches people humility. Having been wrong many times in the past, people ought to learn to hold opinions cautiously and to be skeptical about their own ability, and that of others, to predict the future. The reality, however, is that people’s cognitive shortcomings nudge them toward overconfidence.

System 1, the storyteller who sees causality where there is none, minimizes the role of luck in the success or failure of a person or a business venture. Successful corporate CEOs are thought to possess exceptional skills when in fact other, equally skilled CEOs did less well for reasons no one could foresee or control.

On top of that, storytelling System 1 favorably rewrites people’s memories of the past. They will remember predicting the outcome of an election, or a murder trial, when in fact they made no such prediction; this is called “hindsight bias.” A variant of this mistake is holding experts to unrealistically high standards: doctors, baseball coaches, and politicians are blamed for failing to anticipate events that no one could have predicted except by a lucky guess. The experts themselves are convinced of their competence because, after all, they have special training and skills and are exerting considerable effort. Surely, they feel, that adds up to something. Thus, financial analysts and media pundits convince themselves that they have insight into the course of future events even when their track record shows that they don’t. Social and commercial incentives reinforce this illusion: no one wants to hear from a pundit or investment advisor whose standard take on the future is “Who knows?”

Having explained why people tend toward overconfidence, Kahneman considers possible correctives. One of these is to rely less on human judgment and more on algorithms, simple procedures for generating judgments mechanically. Example: The future price of a wine can be predicted more accurately by a simple formula involving the grapes’ growing conditions than by an expert who tastes the wine. Expert opinion need not be ignored entirely; it can be incorporated into the algorithm as an additional variable. And some expert opinions are, for objective reasons, more valid than others.
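To make the wine example concrete, here is a minimal sketch of such an algorithm, loosely modeled on the kind of linear formula Kahneman describes (economist Orley Ashenfelter’s Bordeaux equation is the real-world case). The variable names, coefficients, and weights below are illustrative assumptions, not the published formula:

```python
# Illustrative linear scoring rule for predicting a wine's future price.
# Loosely modeled on Ashenfelter-style formulas; the coefficients and
# the expert-rating weight are made up for demonstration.

def predict_wine_quality(winter_rainfall_mm: float,
                         growing_season_temp_c: float,
                         harvest_rainfall_mm: float,
                         expert_rating: float = 0.0) -> float:
    """Return a quality score that tracks a vintage's future price.

    The first three terms are the mechanical "algorithm"; the optional
    expert_rating shows how expert opinion can be folded in as one
    more weighted variable rather than replacing the formula.
    """
    return (12.0                              # baseline (illustrative)
            + 0.00117 * winter_rainfall_mm    # wetter winters help
            + 0.614 * growing_season_temp_c   # warm growing seasons help
            - 0.00386 * harvest_rainfall_mm   # rain at harvest hurts
            + 0.10 * expert_rating)           # expert view as a variable

# Example: two hypothetical vintages scored purely from growing conditions.
print(predict_wine_quality(600, 17.1, 120))
print(predict_wine_quality(550, 16.2, 290))
```

The point of the mechanical rule is consistency: given the same inputs, it always produces the same judgment, which human experts notoriously do not.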

A professional environment characterized by regular causal processes, in which decisions generate quick, error-correcting feedback, fosters the development of sound expert intuition. An environment dominated by chaotic processes, where decision-makers receive little or no immediate feedback, discourages it. So anesthesiologists should be listened to; stock pickers should not.

Another corrective for overconfidence is to compare the plans and projections for a project (the inside view) against the experiences of others who have attempted similar projects (the outside view). Example: Kahneman was part of a team that expected to finish a writing project within two and a half years. Similar teams had taken anywhere from seven to ten years, and some had disbanded without finishing. Kahneman’s team ended up taking eight years.
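A minimal sketch of that correction, using the numbers from Kahneman’s anecdote; the reference-class durations and the 50/50 blending weight are illustrative assumptions, not a method prescribed in the book:

```python
# Outside-view correction: temper an inside-view estimate with the
# track record of a reference class of comparable projects.

inside_estimate_years = 2.5            # the team's own projection
reference_class_years = [7, 8, 9, 10]  # assumed durations of similar teams
# (some comparable teams never finished at all, which this sketch ignores)

outside_view = sum(reference_class_years) / len(reference_class_years)

weight = 0.5  # how much to trust the inside view (an assumption)
adjusted = weight * inside_estimate_years + (1 - weight) * outside_view

print(f"Inside view:  {inside_estimate_years:.1f} years")
print(f"Outside view: {outside_view:.1f} years")
print(f"Adjusted:     {adjusted:.1f} years (actual outcome: 8 years)")
```

Even this crude blend lands far closer to the eight years the project actually took than the team’s confident inside-view estimate did.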

Focusing on inside-view data and ignoring outside-view data is a common error Kahneman and Tversky dubbed the “planning fallacy.” Since the inside-view picture of a project tends to be a best-case scenario (desired steps envisioned in detail, complications imagined only dimly, if at all), the planning fallacy typically produces overly optimistic forecasts. Socially, this is not an entirely bad thing: the world needs entrepreneurs who ignore the fact that most small businesses fail and who believe, for no good reason, that they are smarter than their competition.

For planners who want a check on their upbeat outlook (and who perhaps lack outside-view data relevant to their situation), Kahneman recommends an exercise called a “premortem,” which he borrows from psychologist Gary Klein. Planners are asked to imagine themselves at a future date, to assume that their project has failed, and to write an account of why it failed. This exercise engages planners’ imaginations in the search for threats to the project and encourages them to be more realistic about its chances of success.