In his post Cause of Death: Invalid Assumptions, my colleague Mark Moore observed that project risk management often excludes consideration of the underlying assumptions on which event probabilities and prospective impacts are based. Obviously, we cannot operate without relying on what we know, or we would have to reinvent the wheel every time we had to go somewhere. On the other hand, failing to challenge what we believe we know, or to consider the possibility that there are relevant factors about which we have no idea (so-called “unknown unknowns”), can result in vastly underestimating risks or missing opportunities. This article raises more questions than it answers, but it does suggest that some changes in PM discipline can help reduce the risks our assumptions create.
Measures of Risk in Traditional Project Risk Management
In project planning (or investment management, for that matter), a risky outcome is estimated as a single probability-weighted figure: the sum, over all possible results, of each result’s value multiplied by its probability of occurrence. For instance, an investment that has a 25% chance of returning -20%, a 50% chance of returning +10% and a 25% chance of returning +25% can be budgeted to return 6.25%, the weighted sum of the probability multiplied by the return of each possible outcome.
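As a quick check of the arithmetic, here is a minimal sketch in Python; the scenario table simply restates the figures above:

```python
# Probability-weighted (expected) return for the investment example above.
outcomes = [
    (0.25, -0.20),  # 25% chance of a 20% loss
    (0.50, 0.10),   # 50% chance of a 10% gain
    (0.25, 0.25),   # 25% chance of a 25% gain
]

# Sanity check: the scenario probabilities must cover all outcomes.
assert abs(sum(p for p, _ in outcomes) - 1.0) < 1e-9

expected_return = sum(p * r for p, r in outcomes)
print(f"Expected return: {expected_return:.2%}")  # prints: Expected return: 6.25%
```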
In his post, Mark gave the example of estimating the time required to get to his workplace and to the airport, both of which are close to where he lives. Most days, the trip to either place is predictable, but occasionally bad weather or an accident causes a delay. If he is delayed by a few minutes getting to work, Mark can live with that. Being late for a flight, however, carries a much higher cost, so Mark leaves a good bit earlier for a flight than he does for work.
In a project context, we are usually focused on the time and manpower required to execute the work packages in the project schedule. Work packages are defined at a very granular level specifically to make them easy to estimate with high reliability, thereby reducing the variance of the overall project budget and schedule estimates. This is a sensible strategy, provided that our estimates are reasonably accurate, which requires knowledge, experience and valid underlying assumptions. Assumptions, however, are where problems arise: we don’t always question what we think we know, and we get stung when an assumption proves to be misguided. In the words of the quote attributed to Mark Twain, it’s “what we know that just ain’t so.”
Rote Application of Risk Techniques Can Increase Risk
Very often, project managers address risks by rote, based on probability and impact, which is what PMI and others say they should do. So, for example, we might see something like this: a work package is estimated to take five working days, but there is a 20% chance that it will require seven days, so we budget 5.4 days to accommodate the possible overrun. For one thing, this doesn’t actually provide the protection we need, because the overrun on a single instance is either two days or nothing; if we experience the two-day overrun, our 0.4-day reserve won’t come close to covering it. Our budget would be more accurate if we had a population of similar work packages, one or two of which might run over while the remainder completed on time.
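To make that concrete, here is a small, hypothetical simulation; the ten-package portfolio size, the trial count and the random seed are my assumptions, not part of the original example. It shows that the expected-value budget tracks the average outcome of a population of similar work packages, while a single package either finishes on time or overshoots its 0.4-day reserve by the remaining 1.6 days:

```python
import random

# Each work package nominally takes 5 days, with a 20% chance of a 2-day
# overrun, giving the 5.4-day expected duration from the example above.
BASE_DAYS, OVERRUN_PROB, OVERRUN_DAYS = 5, 0.20, 2
N_PACKAGES = 10  # hypothetical portfolio of similar packages
BUDGET = N_PACKAGES * (BASE_DAYS + OVERRUN_PROB * OVERRUN_DAYS)  # 54.0 days

random.seed(42)  # reproducible runs
TRIALS = 100_000
totals = []
for _ in range(TRIALS):
    total = sum(
        BASE_DAYS + (OVERRUN_DAYS if random.random() < OVERRUN_PROB else 0)
        for _ in range(N_PACKAGES)
    )
    totals.append(total)

breaches = sum(t > BUDGET for t in totals)
print(f"Budget: {BUDGET} days, average actual: {sum(totals) / TRIALS:.1f} days")
print(f"Portfolio exceeds budget in {breaches / TRIALS:.0%} of trials")
# A single package, by contrast, blows through its 5.4-day budget in a full
# 20% of cases, and by 1.6 days each time it does.
```

Note that even across the portfolio, the total still exceeds the expected-value budget in roughly a third of the runs; an expected-value reserve covers the average, not the bad cases, which is exactly why the basis of the probability estimates matters.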
So, we need to give some consideration to the number of occurrences of a risk in the work plan. We also need to consider that we may figure out a way to eliminate the risk for later occurrences, based on our experience with the earlier ones. More importantly, we need to consider the basis for our estimates of the time required and the probability of a late completion. On what assumptions have we based these? The tools to be employed in executing the work package? The experience of the team members working on it? The complexity of the work package? What about our own experience in making estimates and judging risks, in similar circumstances?
We often rely on our intuition, don’t really articulate or examine our assumptions and get away with it, until we don’t. Our “knowledge” can become a major source of project risks.
Ignoring Others’ Assumptions Increases Risk, Too
One of the biggest risks we face is that we cannot completely control the impact others have on the risk profile of an initiative. Where this most often shows up is in the comprehensiveness and stability of project requirements. If we undertake to build a four-bedroom, center-hall colonial home of a specific size, we may be able to construct a very accurate estimate of the work required. When, after we’ve dug and poured the foundation, the customer decides it should instead be a three-bedroom contemporary with a pool and a pool house, all our planning and the original project’s risk profile go out the window.
Now, the homeowners must have thought they knew what they wanted. After all, they signed the contract, which is our primary risk mitigation mechanism for home construction projects. We assumed the build would be the colonial, and now we’re wrong. It’s not our fault, but while there is a change control process in the contract, there probably isn’t anything specifically covering the contingency of a total plan makeover. We’re clearly entitled to compensation for work already performed and for whatever it takes to rework the plan and re-estimate the project, but are we really likely to preserve our profit margin now?
One common practice in planning a software development project is to enlist representatives of the team that will actually execute the work packages to help estimate the time and effort required. While this has the advantage of giving them some ‘skin in the game,’ it may also introduce variance into the work estimate, either consciously, through padded estimates, or unconsciously, as a result of people’s differing knowledge levels, experience and opinions. If we truly want to understand the basis for these estimates, we will have to untangle our team’s assumptions as well as our own.
So?
We must be cognizant of where our assumptions, and those of others, can cause us to mis-estimate probabilities or impacts, or to miss risks entirely. The 9/11 Commission Report cited ‘failure of imagination’ (which may be read as a reliance on assumptions) as a major contributing factor in our failure to predict, detect and stop the attacks. The Wikipedia article on the subject also identifies the Apollo 1 fire and the sinking of the RMS Titanic as potentially resulting from the same underlying cause.
The projects we manage may not carry consequences on the scale of these events, but they have a lot in common with them with respect to the threats embodied in assumptions that don’t prove out. Our only protection against such blindness is to develop intuition about how our assumptions figure into our planning and design work, and to commit to identifying and validating them before they can kill us.