Saturday, January 18, 2020

The Art of Critical Decision Making by Michael A. Roberto



The Art of Critical Decision Making by Michael A. Roberto covers many difficult decisions, often focusing on man-made disasters. This is a worthwhile course for managers or anyone responsible for a safety culture. He dissects the Columbia space shuttle accident: managers adopted the wrong frame of reference, setting a high bar for proving the shuttle was not safe rather than requiring proof that it was safe. The leader of the group also did not actively seek out dissenting views.

There are two responses to a threat. A confirmatory response actively discounts the problem, presuming that our knowledge is complete and that nothing should change unless there is overwhelming evidence of a problem that needs a solution. An exploratory response, by contrast, actively attempts to understand the threat. In the face of an ambiguous threat, learn by doing and conduct experiments.

The Three Mile Island nuclear incident is an example of a complex, tightly coupled system. These failures often involve a series of adverse events lining up to cause a problem, like the holes in several slices of Swiss cheese lining up.

He also reviews the failed Bay of Pigs invasion of Cuba and the resolution of the Cuban Missile Crisis. The Bay of Pigs was an insular process in which differing opinions were suppressed. In the Cuban Missile Crisis, a diversity of opinions was sought out, including that of former President Dwight Eisenhower, who asked questions about the process of decision making. The problem-solving groups were deliberately set up as opposing teams: one group offered solutions, and the other attempted to poke holes in the proposal and request more information. This sharpened the solutions and gave President Kennedy better options to review.

There was a failure to connect the dots with respect to the 9/11 terrorists. Both the Arizona and Minneapolis field offices had reports of people taking pilot lessons on large commercial aircraft with little previous flight experience. Obstacles to information sharing include differentiation without integration when dealing with an uncertain environment, and the fact that power is derived from control of intelligence information. Work on both formal and informal sharing networks. The intelligence community has since adopted password-protected wikis that allow information sharing among multiple agencies.

High Reliability Organizations (HROs) - Design enterprises to deal with interactive complexity and tight coupling (processes that are highly dependent on each other).

  1. Preoccupation with failure: a small failure may be a symptom of a larger problem. Actively seek out what might go wrong rather than dwelling on success. Failures are not confined to a small area but may be systemic; revealing problems allows people to work on them. 
  2. Reluctance to simplify interpretations: do not put problems in a small box.
  3. Sensitivity to the front line: invest time in understanding front-line staff. Medicine used to operate on ABC: Accuse, Blame, and Criticize. Develop a culture of blameless reporting instead. 
  4. Commitment to resilience: resilient people don't just anticipate problems; they focus on mitigating risk to build resilience. 
  5. Deference to localized, specialized expertise: actively seek out localized, specialized knowledge to find problems. 

Focus on concrete processes to improve safety, and on how to be preoccupied with failure. For example, rapid response teams, unlike code blue teams, are dispatched to diagnose patients exhibiting ambiguous symptoms that may predict a cardiac event. High reliability combines a preoccupation with failure and resilience.

How do you become a problem finder? Seek out problems like Winston Churchill, who foresaw the rise of Hitler and the rise of the Iron Curtain. How do you do this? Become a voracious learner, go out in the field, and ask questions. Discard the notion that you have all of the answers. The process matters.





Being a devil's advocate by Michael Roberto - 13 min. 
