Jan 2019 - Sept 2020
Yesterday I had my first Oxford case study. It was an epic fail, and I want to share what I learned from it.
The case study was a mock group decision under time pressure. The decision was whether or not to withdraw from a car race due to cold weather. The group was supposed to come to a decision within 45 minutes.
There was rather detailed information about the payoffs and drawbacks of various scenarios: joining the race and finishing in the money (racing lingo for placing among the top five), joining the race and having the engine fail, and withdrawing from the race outright.
Engine failure was a particularly large risk, since it had occurred a number of times in previous races and the mechanic had a hunch that it was due to cold weather. However, a quick scan of the scatter plot showed that those incidents had happened rather randomly across the temperature spectrum.
When the group meeting started, I suggested that we take a quick poll after reading the case. And to get the ball rolling, I volunteered my choice: I would absolutely join the race. Other group members came in either strongly or marginally in favor of joining the race.
Noticing that we had too much consensus, we assigned two members as devil’s advocates to persuade us why we should not join the race.
They went into detail with probability analysis and calculated the odds of each outcome and the expected payoff. The end result did not change any minds. In the end, we reached a consensus to join the race.
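The devil's advocates' expected-payoff analysis can be sketched as follows. The numbers below are invented for illustration; the case's actual probabilities and payoffs are not reproduced here.

```python
# Hypothetical scenario probabilities and payoffs -- NOT the case's real figures.
scenarios = {
    "race, finish in the money":  {"prob": 0.4, "payoff": 1_000_000},
    "race, engine fails":         {"prob": 0.3, "payoff": -500_000},
    "race, finish out of money":  {"prob": 0.3, "payoff": -100_000},
}

# Expected payoff of joining the race: sum of probability * payoff.
expected_join = sum(s["prob"] * s["payoff"] for s in scenarios.values())

withdraw_payoff = -50_000  # assumed fixed cost of withdrawing outright

print(expected_join)                    # 220000.0 with these made-up numbers
print(expected_join > withdraw_payoff)  # True -- so the analysis favors racing
```

With numbers like these, the arithmetic alone will always favor joining the race, which is exactly why it changed no minds: the flaw was in the probabilities being fed in, not in the calculation.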
So far so good, right? Or so it seemed. Alas, had we joined the race, the chance of engine failure would have been near 100%, based on the historical data. With all the smarts in our group, we came to the wrong decision. Even though engine failures occurred across the temperature spectrum, when the temperature was under 65 degrees Fahrenheit they happened 100% of the time, and as the temperature increased, the odds of engine failure diminished. Of course, this information was not provided to us initially; otherwise we would have arrived at the right decision. But all we needed to do was ask the professor for it, and yet none of us thought of that.
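The trap above can be made concrete: the scatter plot looked random only because no one conditioned on temperature. A minimal sketch with invented data (the case's real figures are not reproduced here):

```python
# Invented (temperature_F, engine_failed) pairs for illustration only.
races = [
    (53, True), (57, True), (58, True), (63, True),    # every cold race failed
    (66, True), (70, False), (72, True), (75, False),  # warm races: mixed
    (78, False), (80, False), (81, False),
]

def failure_rate(data):
    """Fraction of races in `data` where the engine failed."""
    return sum(failed for _, failed in data) / len(data)

overall = failure_rate(races)                         # looks unremarkable
cold = failure_rate([r for r in races if r[0] < 65])  # condition on cold days

print(round(overall, 2))  # 0.55 -- "failures happen at all temperatures"
print(cold)               # 1.0  -- below 65F, failure every single time
```

The unconditional rate hides the pattern; only the subset below 65 degrees reveals that a cold-weather race was effectively guaranteed to fail.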
This epic fail reveals several inherent biases in decision making: anchoring (I volunteered my choice first), groupthink (we converged on a consensus too quickly), and confirmation bias (even the devil's advocates' analysis changed no minds).
In my job, I deal with my wealth management clients' biases day in and day out, so I am very familiar with these biases…and yet I still fell for them.
At the end of the class, the professor revealed that the case was based on a real-life decision by NASA officials and engineers that eventually sent seven American astronauts to their deaths, on national television, in the Space Shuttle Challenger mission.
Yes! Our biases can have deadly consequences!
Michael Zhuang is an EMBA student at Saïd Business School. He is also principal of MZ Capital Management, a wealth advisory firm in Washington, DC.