Opportunity in Ambiguity
Ambiguity is something that I really respond to. I like the complexity of it. — Robert Redford
While actor Robert Redford might like ambiguity, it is generally not something that technically minded people enjoy. On the contrary, those drawn to aviation vocations or avocations strongly prefer the kind of “numbers don’t lie!” certainty that one of the lead characters in the “Hidden Figures” movie asserted. We take pride in basing our actions on data. The language of aviation is replete with the imagined certainty of binary go/no-go decisions.
After being immersed in aviation for over 25 years, I understand the appeal. As a liberal arts major, though, the alleged certainty of data often makes me squirm. The zeros and ones of binary code may seem solid, but we now focus so heavily on risk management precisely because we recognize that people, policies, and realities more often lie somewhere among the infinite fractions between zero and one. You can't even permanently pinpoint which fraction, because circumstances shift continuously, whether minutely or by magnitudes.
Fractions
The classic aviation scenario of the go/no-go decision illustrates the point. You get weather data. It can be “good” (VFR), “bad” (IFR), or somewhere in between (MVFR). That’s one level of ambiguity. The weather data becomes information when you put it in the context of a specific pilot, passenger(s), plane, and plan. Each of these elements has multiple facets, any of which can change in a heartbeat. So, it’s never a one-and-done decision. As more recent training practices acknowledge, it’s really a continuous process of putting new data (e.g., updated weather) in the context of the pilot-passenger-plane-plan elements and using that information to evaluate and manage the resulting risk(s).
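The “good / bad / in between” split described above can be sketched in a few lines of code. This illustration is not from the article; it assumes the standard FAA flight-category thresholds (ceiling in feet AGL, visibility in statute miles), and it deliberately ignores the pilot-passenger-plane-plan context that turns this raw data into a real go/no-go decision:

```python
def flight_category(ceiling_ft, visibility_sm):
    """Classify reported conditions as VFR, MVFR, or IFR.

    Illustration only: real go/no-go decisions also weigh the pilot,
    passenger(s), plane, and plan, and must be re-evaluated continuously
    as conditions change.
    """
    if ceiling_ft < 1000 or visibility_sm < 3:
        return "IFR"    # "bad": below basic VFR minimums
    if ceiling_ft <= 3000 or visibility_sm <= 5:
        return "MVFR"   # "in between": marginal VFR
    return "VFR"        # "good"

print(flight_category(4500, 10))  # VFR
print(flight_category(2500, 6))   # MVFR
print(flight_category(800, 2))    # IFR
```

Even this toy version shows why the categories are ambiguous at the edges: a ceiling of 3,100 feet is nominally “good,” yet only a hundred feet of deterioration away from marginal.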
Here’s another level of ambiguity and complexity. People in general and pilots in particular take pride in being rational, and in making decisions based on facts. But what about those “gut feelings” we all sometimes experience?
One of my favorite books is Malcolm Gladwell’s Blink, which explores the reasoned underpinnings of so-called snap judgments and gut feelings. The core idea is that human beings take in a great deal more data than we can consciously, or “rationally,” process. Nevertheless, other parts of the brain do note, process, and catalog data that might eventually be served up in the form of eye-blink conclusions, or in a gnawing sense of unease. The book explains that we have to work to separate the signal from the noise in such cases. But the opportunity to manage the risk of this ambiguity starts with accepting that “all available information” includes those “doesn’t look right” observations and “doesn’t feel right” instincts.
Actions
Circling back to Mr. Redford’s affinity for ambiguity, I suspect he might love aviation. I also think we aviators have more in common with the improvisational stage than we realize. We might think we prefer to operate from a carefully memorized script, using hard data to know exactly what will happen as we move through each flight-phase “scene” toward the grand finale of arrival at the planned destination. But aviation is more like improvisational theatre: we are constantly challenged to adapt, accepting and incorporating new data into information that shapes the next move.
Improvisational theatre works because it uses the scaffolding of its “yes, and” prime directive to safely manage the ambiguity and complexity of unscripted action. Risk management offers the same kind of scaffolding to aviators, enabling us to use ambiguity for growth and discovery while staying safe for many encore performances. (FAA Safety Briefing, Sep/Oct 2020)