Gregory (Scotland Yard detective): “Is there any other point to which you would wish to draw my attention?”
Holmes: “To the curious incident of the dog in the night-time.”
Gregory: “The dog did nothing in the night-time.”
Holmes: “That was the curious incident.”
– A. Conan Doyle, “Silver Blaze”
Our first-year MBA students are facing a battery of exams this week. I chatted with one student who is concerned about Decision Analysis and was going in for some tutoring. Always a good idea to ask for help when you need it. I wished her great success. Exams are a form of assessment; they give you feedback on what you know. The meaning you make of the results is often more important than the results themselves. Let me explain.
This morning, I attended a meeting in which Michael Morell, the recently retired Deputy Director of the CIA, spoke. He described at some length the operation to find Osama Bin Laden. It took nine years of arduous sleuthing. A breakthrough occurred with the identification of Bin Laden’s likely courier, Abu Ahmed. Tracking Ahmed’s movements led the CIA to a compound in Abbottabad, Pakistan, that seemed unusual for its large size relative to the rest of the neighborhood, its high walls, barbed wire, and extensive security. It also seemed odd that the inhabitants didn’t socialize: the children didn’t go to school; the compound had no phone or internet connection; and the compound burned its own garbage, unlike the neighbors, who set theirs out for collection. On the strength of its assessment of these and other facts, the CIA recommended an operation to seize Bin Laden, who they believed was inside. The President approved, and on May 2, 2011, Navy SEALs took Bin Laden.
What is interesting about this story is that the CIA had no certainty. (Note the hedged words in the preceding paragraph: “likely,” “seemed,” “believed.”) It’s a story quite like the classic Sherlock Holmes yarn, “Silver Blaze,” in which Holmes deduces that the theft of a race horse was an inside job because of the silence of a watch-dog. Like the dog that didn’t bark in the night-time, the CIA inferred Bin Laden’s presence from what was not happening around the compound.
The problem is that Sherlock Holmes enjoys a degree of intellectual certainty that the CIA does not. For instance, one of the stark events of the last decade was the CIA’s inference of the presence of weapons of mass destruction (WMD) in Iraq. A couple of years ago, a Darden conference hosted George Tenet, former Director of Central Intelligence, who discussed the WMD finding. Tenet had been quoted in 2002 as saying that the evidence provided a “slam-dunk case” that there were WMDs in Iraq. This became a pillar of justification for the subsequent invasion of Iraq. When no WMDs were found, Congress investigated. At our conference, Tenet denied that he had offered certainty about WMDs and said that his phrase, “slam-dunk case,” and the CIA’s evidence had been conflated by others in the Executive Branch to justify the invasion. Congressional investigators seemed to agree: Senator Bob Graham said, “The administration wasn’t using intelligence to inform their judgment; they were using intelligence as part of a public relations campaign to justify their judgment.”
So, today I asked Michael Morell what the CIA had learned from the WMD episode and how that contributed to the seizure of Bin Laden. He said that the WMD episode “triggered a revolution in intelligence. We’ve learned that we must not only discuss our assessments of evidence and our recommendations. We must also discuss our confidence in our assessments and recommendations. We say four things:
- Here’s what we found.
- Here’s what we recommend.
- Here’s how confident we are.
- Here’s why we’re that confident.”
This struck me as useful advice for business professionals—and business students. Decision-makers know relatively little with any certainty. As a general manager of a relatively small enterprise (400 employees, 800 students, 14,000 alumni), my days are filled with advice from others, much of it unsolicited and delivered with strong assertion. Yet it is interesting how the emphasis melts a bit when I ask:
- What is your evidence? What do you infer from the evidence? (This corresponds to Morell’s “Here’s what we found.”)
- Specifically, what do you recommend? Why does this dominate other possible courses of action? How well does your assessment support your recommendation? What are the risks associated with your proposal and any alternatives? What if we took no action at all?
- How confident are you about your assessment and recommendation?
- What basis do you have for that confidence? These last two questions usually lay bare the depth and breadth of the recommender’s thinking.
Confidence is hugely important to leadership. The awful WMD episode is a stark reminder of the need to probe one’s confidence as part of a decision-making discipline. By “confidence” I refer not to the buoyant enthusiasm of the promoter or salesperson. Rather, as the Oxford English Dictionary suggests, confidence entails a belief in yourself and others, a sense of trust, and/or a grasp of predictability. Here’s where statistics and data analysis come in handy. Quantitative methods can help us assess uncertainty, build a confidence interval around our expectations, and thus help us make wiser decisions. Darden teaches data analysis because belief in oneself and others, trust, predictability, and wisdom are crucial to the development of business leaders.
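To make the idea concrete, here is a minimal sketch, in Python with entirely hypothetical numbers, of what “building a confidence interval around our expectations” can look like in practice. It computes a 95% confidence interval for a sample mean using the normal approximation; real decision analysis would, of course, use the methods your course prescribes.

```python
import statistics

# Hypothetical sample: ten analysts' forecasts of next quarter's
# revenue (in $000s). These numbers are illustrative only.
estimates = [412, 390, 405, 398, 420, 415, 388, 402, 410, 396]

n = len(estimates)
mean = statistics.mean(estimates)

# Standard error of the mean: sample standard deviation / sqrt(n)
std_err = statistics.stdev(estimates) / n ** 0.5

# 95% confidence interval via the normal approximation (z = 1.96)
z = 1.96
low, high = mean - z * std_err, mean + z * std_err

print(f"Point estimate: {mean:.1f}")
print(f"95% CI: ({low:.1f}, {high:.1f})")
```

The point of the exercise is the last line: stating not just the estimate, but how confident you are in it and why, which is exactly Morell’s discipline expressed numerically.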
To my friend, the First Year student who is uncertain going into her Decision Analysis exam: how confident are you about your uncertainty? Do you know what you don’t know? If you’re not very confident, go back and review; isolate those ideas that confuse you and dog them until you get them. Build your confidence. Hang in there. And trust the process.
And take a few minutes to enjoy Halloween.