Flight 370 and strategic thinking

The tragedy of a lost Boeing 777 airliner and its passengers has dominated the news, starting with 24/7 coverage by CNN.  This news coverage has spawned and forwarded many theories about just what happened to Malaysia Airlines Flight 370, some simple, some complex, some as far-fetched as flying into a black hole.


The cacophony over the loss of an airplane echoes an incident that dominated the news nearly 80 years ago. We are as drawn to figure out Flight 370's fate as our parents and grandparents were to learn the fate of aviation pioneer Amelia Earhart when her Lockheed Electra disappeared over the Pacific Ocean during her attempt to circumnavigate the globe.

While the phenomenon is not new, the tremendous divergence of ideas, claims, hypotheses and theories about what transpired with Flight 370 is striking, and even disturbing if the standard by which to judge the thinking is logic.

The torrent of conversation about Flight 370 offers an opportunity to explore how strategic thinking can be used to best assess a complex situation.  The applicable points for organizational strategists seeking to find the best path to a better future include situation/SWOT analysis, strategy formulation and risk assessment.

We can't begin to claim that we have expertise and knowledge to sort out what happened to Flight 370 and why it happened.  But we can offer powerful principles and approaches, mostly ignored by the media, that can help those with facts and expertise sort out and assess what happened to Flight 370.  We trust that the "best and the brightest" of the authorities, companies and consultants who are charged with understanding the situation are using these principles and approaches.  

The point we seek to make is that these same powerful principles and approaches can be used by organizational leaders to improve the effectiveness of their strategy making.

PARSIMONY

According to The Free Dictionary, parsimony is:

Adoption of the simplest assumption in the formulation of a theory or in the interpretation of data.

Thus, applying the principle of parsimony, whatever you think might have happened to Flight 370, use the simplest explanation.

Likewise, when you are looking at an organization's opportunities and threats, go to what's simplest and appears to be most likely.

This leads straight to Occam's Razor.

OCCAM'S RAZOR

From Wikipedia:

"Occam's razor (also written as Ockham's razor from William of Ockham (c. 1287 – 1347), and in Latin lex parsimoniae) is a principle of parsimony, economy, or succinctness used in problem-solving. It states that among competing hypotheses, the hypothesis with the fewest assumptions should be selected. Translated to common English: if there are multiple possible explanations for an event or result, the simplest is almost always correct.

The application of the principle often shifts the burden of proof in a discussion. The razor states that one should proceed to simpler theories until simplicity can be traded for greater explanatory power. The simplest available theory need not be most accurate."

Occam's Razor challenges us to stay with the simplest explanation of what happened to Flight 370 - but to be ready to incorporate new evidence that supports a less simple but more comprehensive explanation.

In the case of an organizational situation or SWOT analysis, when dissecting what the evidence suggests, stay with what's the simplest explanation - but be ready to change the explanation based on new information.

DEGREES OF FREEDOM

From TutorVista.com:

Degrees of freedom is the number of values which are involved in the final calculation of a statistic that are expected to vary...the independent part of data used in calculation. It is used to know the accuracy of the sample population used in research. The larger the degrees of freedom, the larger the possibility of the entire population [being] sampled accurately.

Relationship of degrees of freedom to parsimony, from Talkstats.com:

The principle is that we want our model to provide an accurate, but parsimonious description of the data. As the number of parameters in the model approaches the number of data points, the model will be better able to accurately fit any arbitrarily complex dataset, but the tradeoff is that the model is also less and less parsimonious. In the limit where there are just as many parameters as data points, all the model has really done is provide a verbatim redescription of the dataset, so that it's really not clear if we've learned anything. In practical terms, when the ratio of parameters to data points becomes too high, the generalization error of the model (i.e., the ability of the model to predict data points not found in the original data set from which parameters were estimated) suffers.   

The concept of degrees of freedom further supports simplicity, lest we "overfit" the data we have and produce a predicted outcome that in reality is not valid given the evidence at hand.  With the limited data set available on what happened to the plane while in the air, whatever "model" is created to explain its fate needs to be simple.

For understanding plane disappearances and projecting an enterprise into the future for the purpose of developing strategies for the best outcome, complex interpretations built on limited information - that is, non-parsimonious models with few degrees of freedom using limited data - are more likely to fall apart as more information is introduced.
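To make the overfitting risk concrete, here is a minimal sketch in Python (using NumPy and made-up illustrative data - none of it drawn from the Flight 370 record).  A model with as many parameters as data points fits those points perfectly, yet its prediction for a new point is far less trustworthy than that of the simpler model:

```python
import numpy as np

# Five made-up observations of a roughly linear trend (illustration only).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.3])

# Parsimonious model: a straight line (2 parameters, 3 degrees of freedom left over).
simple = np.polyfit(x, y, deg=1)

# Non-parsimonious model: a degree-4 polynomial (5 parameters, 0 degrees of freedom).
overfit = np.polyfit(x, y, deg=4)

# In-sample, the overfit model reproduces the data exactly...
print(np.round(np.polyval(simple, x) - y, 2))   # small residuals
print(np.round(np.polyval(overfit, x) - y, 2))  # essentially zero residuals

# ...but for a point neither model has seen, the simple model stays near the
# underlying trend (about 12), while the overfit model lands around 18 here,
# because it has memorized the noise rather than learned the signal.
print(np.polyval(simple, 6.0))
print(np.polyval(overfit, 6.0))
```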

Next, consider Bayes's theorem.

BAYES'S THEOREM

According to Encyclopaedia Britannica, Bayes’s theorem is:

A means for revising predictions in light of relevant evidence.

The website askdefine.com states:

Philosophers and scientists who follow the Bayesian framework for inference use the mathematical rules of probability to find [the] best explanation. Bayesianists identify probabilities with degrees of beliefs, with certainly true propositions having probability 1, and certainly false propositions having probability 0. To say that "it's going to rain tomorrow" has a 0.9 probability is to say that you consider the possibility of rain tomorrow as extremely likely. Through the rules of probability, the probability of a conclusion and of alternatives can be calculated. The best explanation is most often identified with the most probable.

According to Wise Geek:

Bayesian probability...views likelihoods as probabilities rather than frequencies. Bayesians emphasize the importance of Bayes' theorem, a formal theorem that proves a rigid probabilistic relationship between the conditional and marginal probabilities of two random events. Bayes' theorem puts great emphasis on the prior probability of a given event -- for instance, in evaluating the probability that one patient has cancer based on a positive test result, one must be sure to take into account the background probability that any random person has cancer at all.
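The cancer-screening example in that passage is the classic illustration of why the base rate matters.  Here is a minimal sketch of the calculation in Python; the 1% prevalence, 90% sensitivity and 9% false-positive rate are assumed numbers chosen only to make the point, not figures from the quoted source:

```python
# Bayes' theorem: P(cancer | positive test) =
#     P(positive | cancer) * P(cancer) / P(positive)

p_cancer = 0.01             # background (prior) probability - assumed for illustration
p_pos_given_cancer = 0.90   # test sensitivity - assumed
p_pos_given_healthy = 0.09  # false-positive rate - assumed

# Total probability of seeing a positive result, across both possibilities.
p_pos = p_pos_given_cancer * p_cancer + p_pos_given_healthy * (1 - p_cancer)

# Posterior: despite the "90% accurate" test, a positive result implies only
# about a 9% chance of cancer, because the low base rate dominates.
p_cancer_given_pos = p_pos_given_cancer * p_cancer / p_pos
print(round(p_cancer_given_pos, 3))  # ~0.092
```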

Bayesian inference instructs us to start with a base rate - the single most salient piece of known information that points to a probable answer.  In the case of Flight 370, this table from PlaneCrashInfo.com summarizing 60 years of worldwide commercial airline accident data offers a great starting point for a base rate:

[Table from PlaneCrashInfo.com: causes of fatal commercial airline accidents over 60 years of worldwide data]

In the case of Flight 370, the applicable base rate might be that 50% of plane "events" in which passengers suffer fatalities - 542 or so of the 1,085 accidents summarized in the table - are the result of pilot error of one sort or another.  We would thus hypothesize a 0.5 probability that pilot error was the cause of the loss of Flight 370.

Note that we do not start with hijacking as the premise.  The data show that "sabotage" caused only 9% of plane accidents with fatalities - about 98 incidents.  In support of the infrequency of hijackings, whether resulting in fatalities or not, my count of Wikipedia's list shows only 122 hijackings of commercial airplanes since 1932, with the most in the 1970s-1990s. After 2001, the rate of hijacked airliners dropped dramatically, presumably due to increased airport and airplane security.

Then, using Bayesian inference, we would look for new evidence with which to modify the base rate. As Robert Hagstrom writes in The Warren Buffett Portfolio:

Bayesian analysis gives us a logical way to consider a set of outcomes of which all are possible but only one will actually occur. It is conceptually a simple procedure. We begin by assigning a probability to each of the outcomes on the basis of whatever evidence is then available. If additional evidence becomes available, the initial probability is revised to reflect the new information.

Bayes’s theorem thus gives us a mathematical procedure for updating our original beliefs (which had resulted from what he called a prior distribution of information) to produce a posterior distribution of information. In other words, prior probabilities combined with new information yield posterior probabilities and thus change our relevant odds.

One factor that appears not to have played a role in the fate of Flight 370 is weather. Bayesian analysis would have us revise our base rate 0.5 probability of pilot error upward to reflect the elimination of weather as a cause. The table on accidents with fatalities shows 12% attributed to weather.  If we remove weather as a cause for Flight 370, our probability estimate that pilot error was the cause grows to 0.57.  (The math, while unimportant in understanding the effect, is: 0.5 pilot error base rate / (1.00 all factors - 0.12 weather factor) = 0.5681818 revised probability of pilot error.)
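For readers who like to see the bookkeeping, here is the same update written out in a few lines of Python.  The 50%, 12% and 9% shares come from the table cited above; lumping everything else into a single "all other causes" bucket of 29% is our simplifying assumption:

```python
# Base rates: shares of fatal commercial airline accidents by cause.
priors = {
    "pilot error": 0.50,       # from the PlaneCrashInfo.com table
    "weather": 0.12,           # from the table
    "sabotage": 0.09,          # from the table
    "all other causes": 0.29,  # remainder, lumped together for simplicity
}

# New evidence: weather apparently played no role, so drop that hypothesis
# and renormalize the remaining probabilities so they again sum to 1.
del priors["weather"]
total = sum(priors.values())          # 0.88
posteriors = {cause: p / total for cause, p in priors.items()}

print(round(posteriors["pilot error"], 4))  # 0.5 / 0.88 = 0.5682
```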

Similarly, other evidence can be incorporated to modify the base rate prediction - perhaps even to justify moving to another cause as more probable.

But why not instead start by considering something unique and unforeseen, be it flying into a black hole, capture by extraterrestrials or a meteor strike?  (These and many more wild theories were put forth in the wake of the plane's disappearance.)  

Could the loss of Flight 370 be the result of a true "Black Swan" event?

BLACK SWAN

Nassim Nicholas Taleb writes in The Black Swan:

A Black Swan is an event with the following three attributes.  First, it is an outlier, as it lies outside the realm of regular expectations, because nothing in the past can convincingly point to its possibility. Second, it carries an extreme impact.  Third, in spite of its outlier status, human nature makes us concoct explanations for its occurrence after the fact, making it explainable and predictable.

. . .

How is it that some Black Swans are overblown in our minds when...we mainly neglect Black Swans? The answer is that there are two varieties of rare events: a) the narrated Black Swans, those that are present in the current discourse and that you are likely to hear about on television, and b) those nobody talks about since they escape models...  I can safely say that it is entirely compatible with human nature that the incidences of Black Swans would be overestimated in the first case, but severely underestimated in the second one.

How do convoluted hijacking theories strike you?  As seemingly "likely" Black Swans because the media have forwarded them?  Here are two, combined: The plane was hijacked by a passenger who figured out how to use a mobile phone to hack into the plane's flight management system to change the plane's speed, altitude and direction, and who then flew it in the shadow of another airplane to escape radar detection.

While a Black Swan event indeed could have caused the loss of Flight 370, the likelihood - and especially the predictability - of a true Black Swan occurrence in conjunction with Flight 370's fate is extremely low.  That is why such events are labeled "Black Swans."

SYSTEM 1 AND SYSTEM 2, AGAIN

A reason we don't typically think like Bayesians - starting from a base rate and reweighting it as new data emerge - is also a reason why we miss seeing the true, however remote, possibility of Black Swan events.  When presented with a situation, we are programmed to immediately "fit" the information at hand to our experience - magnified by what is most current to us and what instinctively seems "like" or relevant to the situation - whether it is really applicable or not.

As described in our last post, Often wrong but never in doubt, in his book Thinking, Fast and Slow, Nobel Prize-winning psychologist Daniel Kahneman describes research that he and his late partner, Amos Tversky, conducted on judgment and decision-making.  His premise is that our minds operate through two systems: System 1, which is automatic and intuitive, and which through associative memory instantly fits incoming sensory input into the framework that best matches our previous experience; and System 2, which is controlled and deliberate.  Using System 2 to identify a base rate and modify it with new evidence demands attention, choice, concentration and effort to overcome System 1's intuitions and impulses.

What was the first thing you thought of when you heard Flight 370 was missing?  My mind (System 1) jumped to terrorism.  Odds are yours did, as well, given the years-long focus we have in the U.S. and elsewhere on terrorists commandeering airplanes. Yet, we have already demonstrated that the way we are programmed to think led us astray, at least from what Bayesian inference would have us consider as the most likely initial hypothesis.

BACK TO THE FUTURE

The intent of this exercise in applying statistical methods to Flight 370 is to highlight seemingly powerful approaches that can improve strategy development.  The extent to which strategic thinkers, analysts and leaders in organizations are using these methods is not apparent.  (Perhaps we can start with a base rate estimate of 0.01 and look for evidence of use!)

Yet the possibilities offered by this analytical thinking are fascinating and maybe even compelling.  Consider:

  • Increasing confidence about the rate of adoption of a given technology which the organization is developing or can develop.

  • Weighting the likelihood of various organizational threats - weather-related, competitor-based, consumer-attitudinal, whatever.

  • Positing a key strategy and then playing out its most likely impact using Bayesian inference (a sketch follows this list).

  • Elevating one scenario or even vision for the organization over others proposed based on its parsimony, "fit" and degrees of freedom.
This approach is far from the "chasing every shiny object" mode of operating that far too many organizations pursue.  Maybe "slow thinking" is true "strategic thinking."
