With Flawless Execution, anyone can make predictions like an “expert”
Philip Tetlock is a bit of a legend. Twenty years ago he published a groundbreaking study that exposed pundits and experts as being only slightly better at predicting things than the general public…ouch! No one really wanted to hear that, and that's a problem. We love to hear experts tell us what they think, but as Tetlock showed, their expertise doesn't count for much. And beyond three to five years into the future, they are no more accurate than a "dart-throwing monkey."
However, there are a few individuals who are very good at predicting. Tetlock calls these people 'superforecasters.' They don't always get it right, but they get it right more often than not, and they are able to sustain their predictive accuracy over long periods of time.
But when he and Dan Gardner, his co-author of Superforecasting: The Art and Science of Prediction, write about prediction, they mean something different from what we typically hear pundits talk about. For superforecasters to achieve accuracy in their predictions, the questions they are asked to predict must be highly defined. Most 'experts' frame their predictions in ambiguous language so that, even if proven wrong, they can respond with "yes, but what I meant when I said…was…so you see, I was actually right." This is what fortune tellers and horoscopes do, not superforecasters.
Superforecasting requires specificity so that the prediction can be evaluated over a short period of time. So you don't ask a superforecaster if we will soon have flying cars. Instead, you might ask, "What is the probability that the Ford Motor Company will announce a joint venture with Northrop Grumman in the next six months?" That's a lot less sexy, but it's measurable…and superforecasters can be scored and tracked for accuracy.
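How does that scoring work? Tetlock's research graded forecasters with the Brier score, which measures the gap between the probabilities a forecaster assigned and what actually happened. Here is a minimal sketch in Python of one common binary-outcome form of that score (the forecast numbers are made up purely for illustration):

```python
def brier_score(forecasts, outcomes):
    """Mean squared gap between predicted probabilities and outcomes.

    forecasts: probabilities between 0 and 1.
    outcomes: 1 if the event happened, 0 if it didn't.
    Lower is better; 0.0 is a perfect record.
    """
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# A forecaster said 80%, 30%, and 90% on three questions;
# the first and third events occurred, the second did not.
score = brier_score([0.8, 0.3, 0.9], [1, 0, 1])
print(round(score, 3))  # prints 0.047
```

Because every question is sharply defined and time-bound, scores like this can be accumulated over many forecasts, which is exactly what lets superforecasters be identified and tracked.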
This book provides considerable insight into forecasting and ties it directly to Auftragstaktik, or mission command – what in Flawless Execution® we call the strategic corporal concept and leader's intent. Fundamentally, the authors argue that creating a plan is like forecasting, and that the plan is less valuable than the act of planning itself. Planning is a collaborative effort that creates alignment and agility…and a better plan.
Flawless Execution practices for Superforecasting
How do you ‘superforecast?’ This book outlines “Ten Commandments” for better forecasting. Let’s look at a few of those that are most relevant for practitioners of Flawless Execution.
Strike the right balance between inside and outside views. First ask what the base rate is: how similar is this situation to others? This is the 'outside' view. Then look at how the situation differs from the base rate; this is the 'inside' view. This is great advice for a team that is involved in the planning process and trying to estimate or identify threats and resources. These two views also highlight the value of Red Teaming. Red Teaming separates the team's narrowed 'inside' view, which comes from being too close to and involved with a plan, from the 'outside' view, which looks at things from a detached perspective. The outside view provides a less-biased judgment of a plan by comparing it to similar plans and experiences.
Bring out the best in others and let others bring out the best in you. Superforecasters do best when they share information with other superforecasters, even when they arrive at different forecasts! This is a testament to the power of collaboration. Collaboration creates a better overall solution than any superforecaster can reach in isolation. In Flawless Execution, we call collaboration Teamstorming™. Teamstorming builds upon the best ideas to improve the overall plan or decision.
Master the error-balancing bicycle. Learning requires doing – even for superforecasters. To become a superforecaster you have to do it, fail, and then learn from your failures to become more proficient. It is the result of deliberate practice. Superforecasters, like practitioners of Flawless Execution, must stop and debrief frequently in order to improve.
Look for the errors behind your mistakes, but beware of rearview-mirror hindsight bias. Own your failures and conduct debriefs to discover where you went wrong. Learn from your mistakes. There is an important distinction between this commandment and the one above about the error-balancing bicycle: what worked in the past doesn't always work in the present or the future. Things change. That's why lessons learned, the outputs of proper debriefing, are so important to updating and adapting standards and other best practices.
Use Superforecasting to Build Your Agile Team
There is much more in Superforecasting that will help you think more clearly and avoid some of the common errors of estimation and prediction. It's a worthy exercise and an entertaining read. If you want more help using superforecasting to build your agile team, contact our team below for more information!