Case 2: The battle that didn’t go as expected


This post is the second installment in my series examining cases where "believing we are right" has led to bad, sometimes spectacularly bad, outcomes for leaders, teams and organizations.

For my upcoming book, Big Decisions: Why we make decisions that matter so poorly. How we can make them better, I have identified and categorized nearly 350 mental traps and errors that lead us into making bad decisions. In the many high-profile current and historical situations I have examined, indications abound that the bad outcomes were produced by these mental traps and errors. My premise is that if we at least recognize and admit that we don't know the answer, we will put more effort into looking for better decision options and into limiting the risks of failure when making important decisions.

Our second case is historical, a situation involving a revered leader who seemingly never failed.

CHARGE!

Confederate General Robert E. Lee believed that his troops could overrun the Federal front line at the Battle of Gettysburg.

He was wrong: Pickett's Charge, an assault by 15,000 Confederate soldiers against 6,500 entrenched Federals, resulted in over 6,000 Confederate casualties and ended Lee's last invasion of the North.

Here are several traps and errors that may have led Lee to believe the charge would succeed:

  • When Lee thought about past battles, what likely came to mind was Confederate victories, along with several stalemates and almost no losses. With this winning tableau in mind, Lee may have fallen victim to the Availability Heuristic (when one assesses the probability of an event by the ease with which related instances or occurrences come to mind, rather than by carefully examining the situation and alternatives).

  • As the idolized, highly successful general, Lee was deferred to and supported by his officer corps: his officers tended not to express concerns or alternative ideas. Thus, Lee was hurt by Confirmation Bias (subconsciously dismissing information and ideas that threaten our worldview and surrounding ourselves with people and information that confirm what we think): he was not hearing from his circle that the charge was a bad idea - except from his top corps commander, Lt. General Longstreet, whom Lee discounted as overly worried and advancing unworkable alternatives. This lack of a differing view was exacerbated by the absence of cavalry Major General J.E.B. Stuart, who, through another chain of wrong belief, had led his troops on a long trek around the Union Army that took them out of communication with Lee and yielded no useful intelligence about the situation.

  • The underlying premise guiding Lee and his officers was that the Confederate Army was better and generally won. Lee certainly knew that the last day of the Gettysburg battle presented a different situation, with the Rebels charging and the Federals entrenched - in their home territory, no less. But he resolved the Cognitive Dissonance (discomfort from holding two contradictory beliefs in mind) between "we are better" and "this is different" by acting on his going-in belief that his was the better army and would prevail.

Leading an organization that thinks it is superior poses a double trap, as General Lee found out: the leader can actually know less because of the power and isolation of his or her position, and the organization's past successes can blind it to the risks of failure.

I expect to publish my new book, Big Decisions: Why we make decisions that matter so poorly. How we can make them better, later this year. It will be an antidote for bad individual and organizational decision making. You can help me get it published and into the hands of decision makers whose decisions affect not only their own lives but all of ours.

Learn more about Big Decisions: Why we make decisions that matter so poorly. How we can make them better and my special half-price pre-publication offer. Thank you!
