[Infowarrior] - Can you envision a “successful failure”?

Richard Forno rforno at infowarrior.org
Fri Jul 13 06:14:17 CDT 2012


(c/o MM)

July 13, 2012

Can you envision a “successful failure”?

Filed under: Catastrophes, Risk Assessment, Strategy — by Philip J. Palin on July 13, 2012

http://www.hlswatch.com/2012/07/13/can-you-envision-a-successful-failure/

In the movie Apollo 13 — recounting the nearly deadly 1970 moon mission — the heroic NASA flight director says, "Failure is not an option."

The real hero — Gene Kranz — never said this.   It’s a scriptwriter’s creation.   After the movie’s success, Mr. Kranz did use the phrase as the title of his memoir.

Failure is always an option.  We recently received several reminders of this reality:

The final report on Air France Flight 447 found that “the crew was in a state of near-total loss of control” because of inconsistent data reports.

A  Japanese parliamentary commission found the Fukushima nuclear emergency was a “profoundly man-made disaster.” (See a good summary from the BBC.)

Last week, from Columbus, Ohio to Charleston, West Virginia to Washington DC, the best-laid plans of intelligent people and competent organizations unraveled before an unexpectedly strong storm.

There was failure.   There was passivity, fear, denial, selfishness and greed.

At Fukushima and in response to the derecho there was also creativity, courage, patience,  generosity, self-sacrifice and resilience.  We don’t know enough about what happened over the South Atlantic to be sure, but I expect even in those horrific 3 minutes, 30 seconds the full range of humanity could be found.

Across all these situations there was uncertainty.   Some level of uncertainty is innate to nearly every context.  But we are increasingly adept at self-creating even more.

Responding to the Air France Final Report, William Voss, President of the Flight Safety Foundation, told The Guardian: "Pilots a generation ago would have… understood what was going on, but [the AF447 pilots] were so conditioned to rely on the automation that they were unable to do this. This is a problem not just limited to Air France or Airbus, it's a problem we're seeing around the world because pilots are being conditioned to treat automated processed data as truth, and not compare it with the raw information that lies underneath."

It’s a problem well beyond commercial aviation.  We organize much of our lives around the assumption that automated processes will persist and critical information will be available.  We expect to be warned of a threat, informed of the location and condition of our family and friends, and told when a crisis will be over.  We expect to be able to access our credit and cash accounts.  We expect to be able to travel from here to there to purchase what we need and reunite with those we love.  If necessary, we expect to be able to call 911 and quickly get professional help.  Over the last two generations, everyday life has — increasingly — demonstrated that these are reasonable expectations.

We are habituated to success.

But like the Air France pilots, when our information habit is not being fed, our response can be self-destructive.  In the absence of information we tend to continue as usual or to focus on restoring access to information.  Both behaviors can significantly increase our risk by ignoring rapidly changing conditions and/or delaying thoughtful engagement with changed conditions.

The Apollo 13 Review Board found that the accident “…resulted from an unusual combination of mistakes, coupled with a somewhat deficient and unforgiving design.”

The deficient and unforgiving design that many of us — private citizens as well as public safety agencies — have adopted is dependence on just-in-time information.

My twenty-something children  seldom pre-plan in any significant way. They expect cell phones, text messaging, Facebook, and email to allow them to seize the best opportunities that unfold.   It works and I envy them.  Except when it does not work.  Except when these digital networks fail.

Much of our consumer culture is built around the same approach. We have become an economy, a society optimized for just-in-time. It can be a beautiful dance of  wonderful possibilities emerging in a moment and rapidly synchronized across time and space.  Until the music stops.

In the three examples above (not all catastrophic) there is a shared overconfidence in the fail-safe capabilities of protective design and effective communications.  In each of these cases the design bias increased risk exposure, communications were confusing or worse, and both the design and the communications protocols complicated effective human response once risk was experienced.

There are several contending definitions of resilience.  Something that all the definitions I have encountered share is an expectation of failure.  Resilience is in many cases the learned response to failure.  If it doesn’t kill you, you can learn from it.  The good news — and the bad news — is that catastrophes are sufficiently rare that we don’t get many opportunities to learn about catastrophic resilience.  What is a “forgiving design” for encountering catastrophe?

In April 2010, Jim Lovell, the commander of Apollo 13, called the mission a “successful failure.”  Lovell explained that while Apollo 13 never reached the moon, there was “a great success in the ability of people to take an almost certain catastrophe and turn it into a successful recovery.”

Envision a complete blackout of telecommunications (voice and data) across a region, say, extending from the mouth of the Susquehanna River south to the Potomac River and from about the Bull Run Mountains in the West to the Chesapeake Bay in the East.  This encompasses roughly 5 million residents.

Such a blackout for any sustained period is “an almost certain catastrophe.”  Can we envision how to “turn it into a successful recovery”?  What could be done?  What should be done?  What does the mental exercise (more?) tell us about our dependencies, our operational options, mitigation opportunities, and creativity?

I know, I know… such an event is wildly unlikely… nearly unimaginable.  Just about as silly as a bad thermostat undoing a mission to the moon.



---
Just because I'm near the punchbowl doesn't mean I'm also drinking from it.


