Tuesday, December 31, 2013

Incomplete List of Decision Traps



“The situation has provided a cue; this cue has given the expert access to information stored in memory; and the information provides the answer. Intuition is nothing more and nothing less than recognition” - Herbert Simon

Intuition is neither always wrong nor always right; it can be grounded in experience and practice. In one well-known example, a fire chief ordered his team out of a house just after a kitchen fire had been put out, moments before the floor collapsed. His trained intuition had sensed that the real source of the fire was in the basement.

Decision making is prone to biases and other traps. In this post we review some of them and identify possible strategies to avoid the traps.

Law of Small Numbers
  • Making generalizations or decisions based on insufficient evidence
  • Big Data may help: large datasets from multiple sources may provide stronger evidence
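As a rough illustration (a hypothetical simulation, not from the original post), the sketch below shows why small samples invite over-generalization: for a fair 50/50 process, a sample of 10 produces an "extreme" result (70% or more one way) quite often, while a sample of 1,000 almost never does.

```python
import random

def extreme_fraction(sample_size, trials=10_000, threshold=0.7):
    """Fraction of samples of a fair 50/50 process whose observed
    rate reaches the 'extreme' threshold."""
    extreme = 0
    for _ in range(trials):
        hits = sum(random.random() < 0.5 for _ in range(sample_size))
        if hits / sample_size >= threshold:
            extreme += 1
    return extreme / trials

random.seed(0)
print(extreme_fraction(10))    # roughly 0.17: extreme results are common
print(extreme_fraction(1000))  # essentially 0: large samples rarely mislead
```

A reader who saw only a handful of cases and concluded "70% of accidents are caused by X" may simply be looking at sampling noise that a larger dataset would wash out.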


Anchoring
  • Using an initial piece of information to make subsequent judgments
  • It could be part of an adjustment process from a baseline condition one is familiar with or priming/suggestion
  • Baseline example: Investment decision to avoid a certain type of accident may be influenced by the most recent, high-cost accident (error in baseline definition)
  • Priming example: in answering the second question below, decision makers tend to settle near the value suggested in the first:
    • Would you spend $1 million (or, on another occasion, $1 billion) per year to avoid human factor accidents?
    • How much would you spend annually to avoid human factor accidents?
  • "Thinking the opposite" (e.g., considering different perspectives, opportunity costs, or a comprehensive cost-benefit analysis) may reduce the anchoring effect
  • In team discussions, requesting a short memo (a paragraph or so) from each member before the discussion can avoid group anchoring on the more vocal, opinionated, or influential members

 

Availability Effect
  • The tendency to overestimate the likelihood of events with greater "availability" in memory, which can be influenced by how recent the memories are or how unusual or emotionally charged they may be.
  • Other than frequency, factors causing availability bias include high-profile or dramatic events and personal experiences
  • E.g., commuter train ridership may drop after a major train accident, resulting in more highway traffic and more car accidents
  • Availability cascade: a self-sustaining chain of events that leads to irrational fear of, and reaction to, a minor issue
  • Strategies similar to those for reducing the anchoring effect may also help reduce the availability effect



Statistics Over Causality
  • Emphasizing abstract statistics, e.g. the percentage distribution of different accident causes, without explaining the causal relationships
  • Statistics presented with causal interpretations have a stronger effect on decision makers' thinking
  • Data from unreliable sources may be dismissed as irrelevant instead of being discounted in proportion to their reliability
  • Bayesian inference, which combines a base rate with new evidence weighted by the reliability of its source, will improve decision making
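The Bayesian update above can be made concrete with the taxi-cab example from *Thinking, Fast and Slow* (the numbers below are from that example): 15% of a city's cabs are Blue, a witness identifies colors correctly 80% of the time, and the witness says the cab in an accident was Blue. Bayes' rule combines the base rate with the witness's reliability:

```python
def bayes_update(base_rate, hit_rate, false_alarm_rate):
    """Posterior probability that the hypothesis is true given a
    positive report, combining the base rate with the report's
    reliability (Bayes' rule)."""
    evidence = base_rate * hit_rate + (1 - base_rate) * false_alarm_rate
    return base_rate * hit_rate / evidence

# 15% Blue cabs (base rate); the witness says "Blue" and is right
# 80% of the time (so wrongly calls a Green cab "Blue" 20% of the time).
posterior = bayes_update(base_rate=0.15, hit_rate=0.80, false_alarm_rate=0.20)
print(round(posterior, 2))  # ≈ 0.41, well below the witness's 80% reliability
```

The posterior of about 41% shows how much the low base rate should temper an apparently reliable report; ignoring the base rate and trusting the 80% figure is exactly the trap.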


Hindsight or Outcome Bias
  • Assessing the quality of a decision not by whether the process was sound, but by whether the outcome was good or bad
  • Instead, use the best information available at the time to decide and act for the best expected outcome

 
Narrative Fallacy
  • Flawed stories of the past shape our views of the world and expectations for the future
  • Identify real causality, use data to debunk fallacies


Planning Fallacy
  • Plans and forecasts that are unrealistically close to best-case scenarios
  • Relevant during the initial phase of planning a risk management process and during implementation of specific risk reduction strategies
  • Can be mitigated by consulting the statistics of similar past cases (the "outside view")
  • Premortem: after a decision is made but before implementation, a brief exercise that imagines the plan has failed (say, a year in) and discusses the causes of that failure, so that all threats are considered
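To make "consulting the statistics of similar cases" concrete, here is a minimal sketch of an outside-view estimate (all project names and durations below are hypothetical): instead of trusting the best-case inside view, look up where similar completed projects actually landed.

```python
def outside_view_estimate(past_durations, percentile=0.5):
    """Reference-class estimate: a chosen percentile of the durations
    of similar completed projects, rather than the best-case plan."""
    data = sorted(past_durations)
    k = max(0, min(len(data) - 1, round(percentile * (len(data) - 1))))
    return data[k]

# Hypothetical durations (months) of ten similar past projects.
past = [14, 18, 9, 22, 30, 16, 25, 12, 19, 27]
inside_view = 8  # optimistic best-case plan
median_outside = outside_view_estimate(past)
print(inside_view, median_outside)  # the reference class runs far longer
```

Raising the percentile (say, to 0.8) gives a more conservative planning figure; the point is simply that the distribution of real outcomes, not the best-case scenario, should anchor the forecast.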


Sunk-Cost Fallacy
  • A decision to invest additional resources in a failing endeavor because of irrational attachment to the costs already sunk
  • Strategies similar to those for reducing the anchoring and availability effects might help avoid the sunk-cost fallacy


Acknowledgements
  • Kahneman, D. (2011). Thinking, Fast and Slow.
  • Dilbert by Scott Adams


Wednesday, December 11, 2013

"You'll know only if you try"

Sometimes, a death can be inspiring. Reading Augusto Odone's obituary would certainly do that. He was an economist and a persistent father who taught himself biology to save his son's life; he invented Lorenzo's Oil. He spent nights reading research journal papers and reports to find a cure for his son's rare genetic disease. Initially met with cynicism, his invention was later embraced by the scientific community and saved lives. The title above was his motto.

Long live the virtues of persistence and hard work! "Verily, after harshness there will be ease, after harshness there will be ease".