Do you learn from your mistakes? Most of us believe that we do, but do we really? Black Box Thinking by Matthew Syed takes a critical view of our frequent inability to learn and provides some solutions that support the Flawless Execution® processes, particularly the S.T.E.A.L.T.H Debrief℠.
In case you were wondering: yes, the black box referred to in the title is the recording equipment found in commercial and military aviation. The capacity these devices provide to refer back to empirical evidence, and to attend to objective truth however difficult, lies at the foundation of learning. Ignoring the truth, or turning a blind eye to it, reinforces our ignorance and condemns us to repeat the same mistakes.
Creating a learning culture
The author’s thesis is simple, yet profound. Experts can be a problem in learning cultures because they act as gatekeepers of truth. Yet a commitment to confronting the truth, seeking the root causes of error, and making systemic changes from the lessons gleaned has transformed open cultures like commercial and military aviation. The contrast between closed, expert cultures like medicine and open, learning cultures like aviation is staggering. The author compiles a few simple statistics to demonstrate the difference. In one powerful example, Syed compares figures from 2013. That year, an estimated 44,000 to 400,000 Americans died as a result of preventable medical errors. In contrast, of the three billion passengers who flew on 36.4 million commercial flights worldwide, only 210 died. There were no deaths due to commercial aviation errors in the United States that same year.
Afterburner highly recommends Black Box Thinking to help readers understand the incredible power of iterative learning and a culture that is open to examining failures as a means to improve systems rather than issue blame. Syed differentiates the thought processes behind effective and ineffective learning as open-loop and closed-loop thinking, respectively.
Closed-loop Thinking vs. Open-loop Thinking & Flawless Execution
To demonstrate the failure inherent in closed-loop thinking, the author explores the antiquated medical practice of bloodletting that predominated for thousands of years. It was believed that draining blood from patients cured them of their ills. It seems ridiculous in retrospect, but the trap of closed-loop thinking looked something like this: if the patient lived, their survival was attributed to bloodletting, but if they died, the physician merely excused the death on the grounds that the patient was too far gone for the bloodletting to be effective. In closed-loop thinking, expertise reinforces itself without empirical testing. Open-loop thinking, however, seeks empirical evidence to understand the root causes of error and then tests hypotheses by setting up experiments with both control and treatment groups. Open-loop thinking also does not seek blame. Rather, it seeks systemic failure and, therefore, systemic solutions to inevitable human error. The result of open-loop thinking is incredible success in the face of almost impossible odds – for example, the vanishingly small number of deaths in airline accidents in spite of the inherent danger of transporting people in a small metal tube at hundreds of miles an hour through an atmosphere incapable of sustaining life.
Part of the success of aviation in nearly eliminating accidents is the notion of marginal gains. Marginal gains is a strategy of attacking big problems or challenges by making small improvements and aggregating them over time. This same strategy is built into the iteration of frequent Flawless Execution missions. Syed expands upon this form of rapid, iterative learning as a catalyst for innovation. “Creativity,” says the author, “is a response [to failure].” It’s a response to a problem or challenge. Contradicting what most of us learned in school about collaborative brainstorming, politely accepting others’ ideas is less effective than criticizing each other. We need to be challenged and criticized in order to respond creatively to failure.
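The arithmetic behind marginal gains is worth making concrete. A quick sketch illustrates how small improvements compound when aggregated over many iterations; the 1% figure and the `compound` helper below are illustrative assumptions, not numbers from the book:

```python
# Illustrative only: compare making no improvement with compounding a
# hypothetical 1% gain per iteration over a year of daily iterations.

def compound(rate: float, iterations: int) -> float:
    """Return the overall multiplier after applying `rate` each iteration."""
    return (1.0 + rate) ** iterations

baseline = compound(0.00, 365)  # no improvement: stays at 1.0x
marginal = compound(0.01, 365)  # 1% better each day for a year

print(f"No improvement:  {baseline:.2f}x")
print(f"1% daily gains:  {marginal:.2f}x")  # roughly 37.8x
```

The point of the sketch is that no single 1% improvement is impressive on its own; the aggregation is what transforms performance, which is why frequent, iterative debriefs matter more than occasional large overhauls.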
Fixed Mindset vs. Growth Mindset
Finally, Syed explores some recent research into brain function that classifies people into two categories – fixed mindset and growth mindset. Fixed mindset individuals believe their intelligence and talent are fixed; that they cannot be changed. Growth mindset individuals, however, believe that their talents can be developed through hard work. What neuroscientists have discovered is that our brain pays special attention when we fail, but that there is a measurable difference in brain activity between the fixed and growth mindsets. EEG tests record what is known as a Pe signal when individuals experience failure. The Pe signals in response to failure are three times greater in growth mindset individuals than in fixed mindset individuals. Fixed mindset individuals tend to ignore failure signals, whereas growth mindset individuals pay special attention to them.
At the team or organizational level, learning from mistakes and continuously improving comes down to a simple principle: “We must institutionalize access to the error signal.”
By Matthew Syed