Building Blocks & Safety Circles
There are many jokes about how airplanes of any size can legally fly only when the weight of the paperwork equals the weight of the airplane. It can appear that way sometimes, giving rise to the idea that if you adhere faithfully to all the rules and regulations for flying, you will be safe. In an ideal world, it really would be that simple. But as you have likely discovered, we do not live, work, or fly in an ideal world.
Consider a practical example involving one of my students. As required by school rules, he carefully checked the Cessna 152’s maintenance and airworthiness records before heading out to the airplane. All maintenance and airworthiness paperwork was in apple-pie order. Reaching the plane, my student discovered a sheen of oil on the nosewheel fairing. When he bent down for a closer look, he noticed that it was fresh and growing larger from the steady drip-drip-dripping of oil droplets escaping from somewhere in the engine compartment. While the paperwork established compliance with maintenance and airworthiness requirements, my student correctly concluded that a bleeding airplane was not in a condition for safe flight. We went back inside for coffee and wound up using the scheduled lesson time to discuss safety rules, safety realities, and the concept of safety risk management.
Bricks and Mortar. There is no question that following regulations is a vital part of aviation safety risk management. The macabre truth is that many regulations evolve from serious or fatal accidents, or are prompted by known safety concerns. The rules, therefore, provide an essential foundation for aviation safety. They are meant to direct the pilot’s path toward practices that contribute to safe operation and away from activities that undermine it. The problem is that while regulations are necessary, they are not sufficient in and of themselves. They offer comprehensive and sometimes exquisitely detailed treatment of individual issues. Still, regulations are simply not designed to cover the nearly infinite number of possible combinations of circumstances that can undermine safety. In this respect, regulations alone are like bricks without mortar.
Enter the system-safety approach. The term is admittedly abstract and it has a formal (and somewhat formidable) official definition. But, as the characters repeatedly assert in the slapstick Airplane! movies, “that’s not important right now.” To make the concept more concrete, think of system safety as the mortar needed to bind individual regulatory bricks together and build a sturdy barrier to accidents.
You know about the bricks, so let’s focus on the mortar. A system can be defined as a combination of people, procedures, equipment, facilities, software, tools, and materials that operate in a specific environment to perform a specific task or achieve a specific purpose. GA flight operations clearly constitute a complex system with many variables:
- Pilots have different levels of knowledge, skill, experience, ability, and discipline.
- Procedures, such as instrument approaches, can be very complex.
- Equipment, including airframes and avionics, is changing rapidly.
- Services, such as those provided by airports and air traffic control, vary widely and will change significantly as Next Generation Air Transportation System technologies are deployed in the national airspace system.
- The flight environment, including weather, is a critical factor in the safety of every flight.
- External factors can have a substantial impact, especially if the pilot doesn’t consciously recognize them.
Systematic Safety Management. A key part of the system-safety approach is risk management: a decision-making process designed to methodically identify hazards, assess the degree of risk, and determine the best course of action. To put risk management to work in your personal aviation safety system, you need to be familiar with some of the basic concepts:
A hazard is a present condition, event, object, or circumstance that could lead to or contribute to an unplanned or undesired event. For example, a ¼-inch nick in the propeller is a hazard.
Risk is the future impact of a hazard that is not controlled or eliminated.
Using the earlier Cessna 152 example, the oil leak is a hazard, but it becomes a risk if the airplane is flown. In a risk-assessment matrix, the level of risk posed by a given hazard is measured in terms of severity (extent of possible loss) and probability (likelihood that the hazard will cause a loss). Exposure (number of people or resources affected) can also be considered in assessing risk. The hazard presented by the nick in the propeller, in the example above, poses a risk only if the airplane is flown. If the damaged prop is exposed to the constant vibration of normal engine operation, there is a high risk it could fracture and cause catastrophic damage to the engine and/or airframe and, by extension, to the airplane’s occupants. Using such a matrix, you can assess the level of risk as low, medium, or high.
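To make the idea concrete, here is a minimal sketch in Python of how a risk-assessment matrix lookup might work. The category names and the low/medium/high cutoffs below are illustrative assumptions, not an official FAA matrix, and any real assessment should also reflect your own personal minimums.

```python
# A minimal, illustrative risk-assessment matrix lookup. The severity and
# probability categories and the low/medium/high cutoffs are assumptions
# for demonstration only; published matrices and personal minimums vary.

SEVERITY = ["negligible", "marginal", "critical", "catastrophic"]   # extent of possible loss
PROBABILITY = ["improbable", "remote", "occasional", "probable"]    # likelihood of loss

def assess_risk(severity: str, probability: str) -> str:
    """Return 'low', 'medium', or 'high' for a hazard's severity/probability pair."""
    score = SEVERITY.index(severity) + PROBABILITY.index(probability)
    if score <= 2:
        return "low"
    if score <= 4:
        return "medium"
    return "high"

# The leaking Cessna 152: serious potential loss, and quite likely to
# cause trouble if the airplane is flown anyway.
print(assess_risk("critical", "probable"))   # -> high
```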
PAVE the Way to Safe Operations. To make system safety and risk management more practical for real-world GA operations, the FAA Safety Team developed a simple three-step process:
1. Perceive the hazards listed on the well-known PAVE checklist:
- Pilot: experience, recency, currency, physical and emotional condition
- Aircraft: fuel reserves, experience in type, aircraft performance, aircraft equipment
- enVironment: airport conditions, weather (VFR and IFR requirements), runways, lighting, terrain
- External factors: allowance for delays and diversions, alternative plans, personal equipment
2. Process by evaluating the level of risk, in terms of severity and probability, posed by each hazard you identified in step one.
3. Perform for safety by finding ways to eliminate or mitigate the severity, probability, and/or exposure of each of the identified hazards; a brief sketch of the full cycle follows below.
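As a purely illustrative sketch, the three-step cycle can be written out as a short Python routine. The hazard entries, the numeric severity and probability scales, and the suggested actions below are assumptions for demonstration, not items from an official checklist.

```python
# Illustrative sketch of the Perceive-Process-Perform cycle over hazards
# noted while running the PAVE checklist. Scales and actions are assumed.
from typing import NamedTuple

class Hazard(NamedTuple):
    category: str       # Pilot, Aircraft, enVironment, or External factors
    description: str
    severity: int       # 1 (negligible) through 4 (catastrophic)
    probability: int    # 1 (improbable) through 4 (probable)

def process(hazard: Hazard) -> str:
    """Step 2: Process -- rate the risk posed by a perceived hazard."""
    score = hazard.severity + hazard.probability
    return "low" if score <= 3 else "medium" if score <= 5 else "high"

def perform(hazards: list[Hazard]) -> None:
    """Step 3: Perform -- decide how to handle each hazard."""
    for hazard in hazards:
        level = process(hazard)
        action = ("acceptable; keep monitoring" if level == "low"
                  else "eliminate or mitigate before flight")
        print(f"{hazard.category:16} {hazard.description:38} risk={level:6} {action}")

# Step 1: Perceive -- hazards noted while working through PAVE.
perform([
    Hazard("Pilot", "short on sleep the night before", 3, 3),
    Hazard("enVironment", "gusty crosswind at the destination", 2, 2),
])
```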
With consistent use, running through the three-P cycle can become a habit that is as smooth, continuous, and automatic as a well-honed instrument scan of cross-check, interpret, and control. (FAA Safety Briefing – Jan/Feb 2011)