Some Ruminations On Aeronautical Decision-Making

by Harold Green

Please be advised that this discussion includes some whining and preaching! No panacea absolving pilots of fault for bad decisions is offered, and no single solution is put forth. Rather, it is suggested that we assume greater individual responsibility for our own safety, along with a realistic recognition of the possible consequences of our decisions.

Year after year, as the general aviation accident statistics are published, we see the same thing: the vast majority of general aviation accidents would be preventable if only pilots used better judgment. That has been true for as long as records have been kept. Virtually every pilot knows this. Yet, we keep doing the same things year after year. If an individual keeps doing the same thing over and over while expecting different results, we call that “insanity.”

Reflect on this quote attributed to Wilbur Wright by AOPA: “In flying, I have learned that carelessness and overconfidence are usually far more dangerous than deliberately accepted risks.” How far back does that put the issue? For further perspective, consider that in World War II, more aircraft and pilots were lost to operational and training accidents than to combat. Those pilots operated in conditions that would be abhorrent to us, but the point is that the known scary stuff did not kill as many people as the relatively routine did.

My opinions are based not only on experience as a civilian pilot, but also on experience as an enlisted U.S. Air Force airborne crewmember in the mid-1950s. Having a direct vested interest in the outcome, I paid close attention to the actions of the pilots. (Some were very good and some were not.) On high-risk missions, and some were very high risk, everyone knew the risk in advance, so we planned and prepared for emergencies. The result was a few tense moments, but no gut-wrenching fear. In routine flying, however, some situations became downright scary because vigilance tended to relax, and with it preparation and planning. Most of those scary instances were caused by ego, stupidity, or simply bad judgment in getting into the situation in the first place.

The key is knowing and acknowledging in advance that there is risk. Think about your own flying and the scary times you created for yourself. Probably the scariest were when something happened that you had not anticipated, or when you were woefully wrong in your assessment of the risk. Why did they happen? Ignorance and/or a wrong assessment of the risks were probably significant factors.

There is a plethora of intellectual devices and acronyms to help us make smart decisions, yet many of us choose either not to use them or to ignore the results if we do.

It is doubtful that any of us take off with the intent of getting into trouble just to see if we can get out of it. We all know that an accident is rarely the result of one bad decision, but rather the result of a series of small errors in judgment.

So how do we fix this? I don’t have any magic answers, but experience and observation have convinced me of this: no matter what, if the pilot does not accept and use the answers, nothing will change. Currently, we analyze each flight with the idea that we wish to conclude the flight safely, and we tend to ask questions in the vein of “How can we do this?” That is a good thing to do. But should our flight planning also include telling ourselves that there is risk of death in this flight? If not, why not? Maybe an additional question we should ask ourselves is “How do I avoid dying on this flight?”

The advent of modern aircraft design, avionics, and data communication capability has added another dimension to this age-old issue. While giving general aviation additional navigation and situational awareness capability, it has also created the need for additional training and another level of decision-making. We now have weather depiction in the cockpit, a moving map display, and facility data at the pilot’s fingertips, to say nothing of a capable autopilot coupled to all our navigational aids. Along with this capability comes the need to train more extensively to recognize the information presented to us and to act upon it with decisiveness, and an increased demand for correct risk analysis and aeronautical decision-making.

Recently, a Cirrus SR-20 accident in the Midwest received considerable publicity, including the recorded conversations between the pilot and the control tower. Without second-guessing or judging his actions, it appears the pilot did not use either of two key resources available to him: the built-in parachute or the autopilot. Even though it was an early-model SR-20, all Cirrus SR-20 and SR-22 aircraft have been equipped with both since the outset. The autopilot, coupled to the GNS-430 GPS, would have flown an ILS approach if set up and activated. Even so, the parachute would probably have saved him if deployed. Why these things did not happen, we will probably never know.

The following is not a plug for the Cirrus Aircraft Parachute System (CAPS), but rather recognition that as things change, we should also change our perspective.

According to Rick Beach of the Cirrus Owners & Pilots Association (COPA), there has been no fatality when CAPS has been deployed within the operational limitations in the Pilot’s Operating Handbook. In the three cases in which there were fatalities, the chute had been deployed outside the recommended operational conditions: either too low or when going too fast, way too fast! In short, when deployed as recommended, there has been a 100 percent survival rate. Yet, in all too many cases, people could have deployed the chute but, for whatever reason, did not. I have heard everything from “I don’t think it works” to “The chute just gives you a false sense of confidence.” If the pilots of those aircraft had recognized the true situation with respect to CAPS and simply said in advance, “I WILL activate the chute if…,” they might still be alive.

In a parallel and more traditional situation, the fatality rate in twins would improve significantly if pilots treated engine failures on takeoff as though they were flying a single-engine airplane and simply landed under control, straight ahead, rather than attempting to continue the takeoff. Is it possible that these pilots place too much emphasis on saving the airplane and not enough on preventing death?

As a final note, the U.S. Air Force Auxiliary, the Civil Air Patrol (CAP), has implemented a preflight risk analysis system that enables any review level to cancel a flight if it is not satisfied with the risk analysis. The U.S. Air Force and Navy have such systems as well, and the result has been a dramatic reduction in operational accidents. While we do not have a review level to approve our flights, maybe we should develop split personalities and review our own risk analyses. Perhaps our attitude should be that the risk defines the probability of success in flying the mission. Then we review it with two questions in mind: How might I die on this flight, and how do I avoid that?

In summary, we would be well advised to accept that we can die flying and to plan to prevent that from occurring. It is necessary to plan how we can complete our flight, but we also need to plan what to do if things go dramatically and drastically wrong, and then be prepared to act accordingly. In doing so, we can use the risk management decision-making tools available to all of us. They are important tools, but that is all they are. A tool is only as good as the person using it, and his or her willingness to use it.

EDITOR’S NOTE: Harold Green is a Certified Instrument Flight Instructor at Morey Airplane Company, Middleton, Wisconsin.
