Case 3: The experts were wrong
This post is the third in my series examining cases where "believing we are right" has led to bad, sometimes spectacularly bad, outcomes for leaders, teams and organizations.
For my upcoming book, Big Decisions: Why we make decisions that matter so poorly. How we can make them better, I have identified and categorized nearly 350 mental traps and errors that lead us into making bad decisions. In the many high-profile current and historical situations that I have examined, indications abound that the bad outcomes have been produced by mental traps and errors. My premise is that, at the least, if we recognize and admit that we don't know the answer, we will put more effort into looking for better decision options and limiting the risks stemming from failure when making important decisions.
In this case, conventional wisdom was trumped, in part because people were blinded by traps and errors.
"I'M WITH HER"
Hillary Clinton's campaign team believed that with Donald Trump as her opponent she would be elected President.
Winning the popular vote did not produce the victory that her team predicted: Clinton lost in the Electoral College.
Here are some mental traps and fallacies that likely led the Clinton team to err and predict victory:
Hillary's campaign team was tripped up by the Expert Problem (relying on "experts" when evidence abounds that people assumed to be experts in their field often do no better than non-experts at forecasting what will happen). Her advisors, at least to some degree, believed the predictions of the pollsters and commentators because "they were the experts."
Evidence suggests they fell under the sway of the Fundamental Attribution Error (attributing one's own actions to the situation while attributing others' behaviors to their personality or character). To some extent, they believed their campaign had been rightly directed at "what it takes to win," while they attributed Trump's campaign strategy and actions to Trump's "craziness," the actions of someone who "can't be elected President!"
Team members' individual tendencies to believe that "Hillary has this election won" were likely reinforced by the Availability Cascade (the self-reinforcing process in which a collective belief gains more and more plausibility and support merely through its public repetition).
Hubris is believing we know when we do not. Failing to recognize the limits of knowledge, seeing cause and effect where none exists, and elevating hopes to certainties ensnared the Clinton campaign, just as it ensnares many of us when we try to predict an outcome in a complex and confused situation.
I expect to publish my new book, Big Decisions: Why we make decisions that matter so poorly. How we can make them better, later this year. It will be an antidote to bad individual and organizational decision making. You can help me get it published and into the hands of decision makers whose decisions affect not only their own lives but all of ours.
Learn more about Big Decisions: Why we make decisions that matter so poorly. How we can make them better and my special half-price pre-publication offer. Thank you!