The great British economist John Maynard Keynes is reputed to have quipped, “When my information changes, I alter my conclusions. What do you do, sir?”
In his book Black Box Thinking, journalist and author Matthew Syed builds on the spirit of Keynes’ famous quote to explore one of the most common limiting tendencies in human behaviour: our profound reluctance to face, scrutinize and learn from our mistakes. This is especially true in politics, where learning from past failures is generally avoided.
Syed argues that one of the most important components of success, whether for individuals or organizations, is a progressive attitude towards failure. Mistakes, he asserts, are a crucial part of the learning process, which, if properly examined, can lead to correctives that might ensure later success. Indeed, he finds it is one of the most effective ways of producing answers. But for a few reasons, including the power of self-esteem and cultures that stigmatize and punish errors, we ignore or cover up our mistakes and thereby lose countless opportunities to adjust our approaches for the better.
“A failure to learn from mistakes has been one of the single greatest obstacles to human progress,” the author writes.
Syed juxtaposes the differing attitudes to error within the airline and healthcare industries to make his point. The aviation sector, he asserts, has an astonishingly good safety record because mistakes are learned from rather than concealed; after all, when they are not, everyone involved is at mortal risk. The information in an airplane’s black box is therefore never covered up in the aftermath of a mishap. Instead, the data is analyzed, and experts figure out exactly what went wrong. Then the facts are published and procedures are changed, so that the same mistakes won’t happen again. The result is the incredible safety record of aviation today.
The medical profession, which, like the aviation industry, takes many lives into its hands every day, has a completely different approach to failure. In an industry racked by malpractice suits, and where physicians are simultaneously expected to have a perfect record, an attitude prevails that little is to be gained from transparency when things go wrong. Mistakes tend to be covered up and not learned from. As a result, more people die from errors made by doctors and hospitals than from traffic accidents.
This thinking pattern exists in varying degrees throughout society, across numerous disciplines. Syed identifies a host of other behavioural and psychological causes as well, including our culture of blame and punishment. All are linked to our desire to avoid the inconvenience of change and the blow to our self-image that comes with admitting we were wrong.
“When we are confronted with evidence that challenges our deeply held beliefs we are more likely to reframe the evidence than we are to alter our beliefs,” Syed writes. “We simply invent new reasons, new justifications, new explanations. Sometimes we ignore the evidence altogether.” Interestingly, this is especially the case with more successful and intelligent individuals, whose self-esteem makes admitting error all the harder.
This tendency to avoid learning from failure and absorbing new information also occurs regularly in the fields of politics and diplomacy. At The Conciliators Guild, we see an absolute need for “black box thinking” in international relations and have made one of our key mandates the encouragement of a rigorous and sincere learning process based on the examination of past failures. This includes scrutinizing assumptions about which approaches work and which do not in diplomacy, and weighing them against new ideas that can improve the effectiveness of the craft. Our goal is to help bring excellence to the field and thereby change the future of a profession whose value has recently been called into doubt.
As Syed recommends for all individuals and collectives seeking greater success, we ultimately endorse the creation of a new system and culture within the diplomatic field that will encourage governments and organizations to learn from errors rather than feel threatened by them. Contrary to instinct, it is one of the most organic and effective ways of learning and meeting our needs.