There are many ways for any plan to fail, and although most of them are too improbable to be anticipated, the likelihood that something will go wrong in a big project is high.
Daniel Kahneman, Thinking, Fast and Slow
Are you considering a major building project? An addition to your home? An information technology overhaul for your business? If so, be acutely aware of the planning fallacy — and try to avoid its pitfalls.
The planning fallacy is simply the overly optimistic forecast of the outcome of any project. One need not think too hard for examples. Close to home, our recent kitchen renovation exceeded budget by about 30%. Oxford University’s Saïd Business School provides this classic list of cost overruns:
- Sydney Opera House, 1,400 percent
- Concorde supersonic airplane, 1,100 percent
- Boston’s Big Dig, 275 percent
- Denver International Airport, 200 percent
- Copenhagen metro, 150 percent
- Northeast Corridor rail line, 130 percent
- Channel Tunnel, 100 percent
Blame it on optimism bias and beware the “black swan”
Why are we prone to the planning fallacy? Often, it’s just a consequence of optimism bias. According to Daniel Kahneman in Thinking, Fast and Slow: “Most of us view the world as more benign than it really is, our own attributes as more favorable than they truly are, and the goals we adopt as more achievable than they are likely to be. We also tend to exaggerate our ability to forecast the future, which fosters optimistic overconfidence. In terms of its consequences for decisions, your optimistic bias may well be the most significant of the cognitive biases.”
At other times, the cause is less innocent. Managers exploit their board of directors’ propensity for optimism bias and aggressively push to get a project approved by presenting the best-case scenario as if it were the likely base-case scenario. There is no analysis of the probability that this “likely” scenario will come to fruition and no calculation of the risk of a worst-case scenario.
Falling victim to the planning fallacy, says Kahneman, executives “make decisions based on delusional optimism rather than on a rational weighting of gains, losses, and probabilities. They overestimate benefits and underestimate costs. They spin scenarios of success while overlooking the potential for mistakes and miscalculations. As a result, they pursue initiatives that are unlikely to come in on budget or on time or to deliver the expected returns — or even to be completed.”
And then there is the threat of a “black swan,” a rare, unpredictable, high-impact event that in retrospect seems not so improbable.
In “Why Your IT Project May Be Riskier Than You Think,” Bent Flyvbjerg and Alexander Budzier report that the average information technology cost overrun is 27%. “But that figure masks a far more alarming one. Graphing the projects’ budget overruns reveals a ‘fat tail’ — a large number of gigantic overages. Fully one in six of the projects we studied was a black swan, with a cost overrun of 200%, on average, and a schedule overrun of almost 70%.” No organization considering a major information technology project should proceed without careful consideration of these findings.
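The way a fat tail can hide inside a modest-looking average is easy to demonstrate. The sketch below uses hypothetical overrun figures (not Flyvbjerg and Budzier’s data), echoing their one-black-swan-in-six finding:

```python
# Hypothetical overrun percentages for six projects -- illustrative only,
# not the study's data. One project in six is a black swan at 200%.
from statistics import mean, median

overruns = [0, 5, 10, 15, 20, 200]

print(f"mean overrun:   {mean(overruns):.1f}%")    # 41.7% -- dragged up by the tail
print(f"median overrun: {median(overruns):.1f}%")  # 12.5% -- the typical project looks tame
```

The single outlier pulls the mean far above the median, which is exactly why an “average overrun” figure can understate the real risk.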
Avoiding the planning fallacy
To avoid the planning fallacy, first be cognizant of its dangers. Then, with great discipline, question all key project assumptions. Don’t blindly accept a best-case scenario as the likely case, and if you are a board member, don’t capitulate to one. Do your research. Find objective information on similar projects in order to articulate worst-, best-, and likely-case scenarios. Imagine the consequences of a black swan and determine whether or not you can absorb them.
Kahneman offers this three-step protocol for managing the pitfalls of optimism bias and the planning fallacy:
- Identify an appropriate reference class (e.g., school building project, IT project, family room addition, etc.).
- Obtain the statistics of the reference class (e.g., the percentage by which expenditures exceeded budget, project delays, cost per square foot, etc.). Use this objective research to generate a baseline prediction.
- If, despite your disciplined efforts, you believe optimism bias is still at play, adjust the baseline prediction as necessary.
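Kahneman’s three steps lend themselves to a simple calculation. Here is a minimal sketch; the function name, the reference-class figures, and the $50,000 estimate are all hypothetical:

```python
from statistics import mean

def baseline_prediction(initial_estimate, reference_overruns_pct, adjustment_pct=0):
    """Reference-class forecast: scale an initial estimate by the average
    cost overrun observed in comparable past projects (steps 1-2), plus an
    optional extra uplift if optimism bias still seems at play (step 3)."""
    avg_overrun = mean(reference_overruns_pct)
    return initial_estimate * (100 + avg_overrun + adjustment_pct) / 100

# Hypothetical reference class: five comparable renovations, percent over budget.
past_overruns = [20, 30, 25, 40, 35]               # average: 30% over
print(baseline_prediction(50_000, past_overruns))  # 65000.0
```

The point of the exercise is that the baseline comes from what similar projects actually cost, not from what you hope this one will cost.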
To guard against a black swan, Flyvbjerg and Budzier propose a “stress test”: management should not proceed on a risky project unless it believes it can absorb a 400% cost overrun with only “25% to 50% of the project benefits realized.” That may seem like a project-killing threshold. But with a one-in-six chance of a black-swan IT project, it is a prudent exercise, indeed.
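As a back-of-the-envelope version of that stress test, consider the sketch below. The 400% overrun and 25% benefits-realized figures come from the article; the function shape and the dollar amounts are our own hypothetical example:

```python
def passes_stress_test(budget, expected_benefits, max_absorbable_loss,
                       overrun_pct=400, benefits_realized_pct=25):
    """Can the organization survive the black-swan case: the budget plus a
    400% overrun, with only 25% of the expected benefits realized?"""
    worst_case_cost = budget * (100 + overrun_pct) / 100
    worst_case_benefits = expected_benefits * benefits_realized_pct / 100
    return worst_case_cost - worst_case_benefits <= max_absorbable_loss

# Hypothetical project: $1M budget, $2M expected benefits.
# Black-swan case: $5M cost, $0.5M benefits -- a $4.5M loss.
print(passes_stress_test(1_000_000, 2_000_000, max_absorbable_loss=3_000_000))  # False
print(passes_stress_test(1_000_000, 2_000_000, max_absorbable_loss=5_000_000))  # True
```

If the worst-case loss exceeds what the organization can absorb, the test says don’t proceed.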