Silas is a safety inspector overseeing manned and unmanned aircraft operations. He holds a Master of Science degree in Aeronautics.
How To Determine Why Pilot Error Happened
Pilot error is a term used to describe the cause of an aircraft accident. We have heard news broadcasts and read National Transportation Safety Board (NTSB) reports stating that pilot error caused an accident. For example, Atlas Air flight 3591 crashed in Texas, killing everyone on board, and several statements suggest that human error played a role in the fatal Amazon Air cargo crash.
This article focuses on an investigation aid, the HFACS model, which identifies the potential errors behind a pilot's mistake.
The term "pilot error" has become the buzzword for the root cause of an event. A better perspective is that the pilot was simply the last person involved in the accident. So what happened before the accident, and why did the pilot act in a manner that caused a crash? In other words, what led up to the pilot error? Did the pilot's judgment result from organizational influences, unsafe supervision, or preconditions for an unsafe act? To investigate the root cause, a safety model presents a path to gather the details. This article will analyze the cause of pilot error using the Human Factors Analysis and Classification System (HFACS).
Significance of the Human Issue
Understanding an accident is a complicated endeavor that requires insight into human behavior. We associate behavior with the actions and conditions that cause the accident. Responses and mistakes accumulate, increasing the likelihood of an accident. The best way to understand the cause of an accident is to view the events that led up to the crash (Martinussen & Hunter, 2018).
The root cause of an accident is best understood as a series of events. Many accident reports cite the last event, the one that happened at the crash site, as the root cause (Martinussen & Hunter, 2018). In most cases, the pilot was at the airplane's controls and was, therefore, named the cause. However, the investigation must show why a human error occurred by searching for the events that led to the pilot's mistake.
Actual and Potential Errors
Two types of human error cause disasters: the actual error and the potential error (Gordon, 1998). Actual errors are almost immediate, while potential errors may lie dormant in the system and set up the actual error. The actual error is the trigger event that creates the accident; potential errors develop from the manufacturer, managers, and personnel removed from direct control of the airplane (Gordon, 1998). To uncover the potential errors that led up to the accident, safety models allow one to understand the entire chain of events. Potential errors reside in organizational oversight, training, and responses to weather events. The HFACS model helps identify the potential errors that lead to the actual error.
Human Factors Analysis and Classification System (HFACS)
The Human Factors Analysis and Classification System (HFACS) is a framework for analyzing the human factor aspects of potential contributing factors (Wiegmann & Shappell, 2003). Humans are intelligent beings, capable of adapting to the environment using creative thought and awareness. However, none are immune to aviation accidents. James Reason's Swiss Cheese Model is perhaps the most well-known framework for identifying potential errors (Reason, 1990). This model uses barriers to prevent accidents. The figure below illustrates the Swiss Cheese Model.
Swiss Cheese Model
The Swiss Cheese Model has guided investigations over the past few decades. Each barrier represents a defense that prevents the accident from occurring. Latent conditions represent the areas noted as potential errors; these failures occur before the flight that ends in the accident. Potential errors exist as organizational influences, unsafe supervision, and preconditions for unsafe acts. Next, the operational failure represents the unsafe act committed by the pilot. The pilot is the last defense against the accident. However, the goal remains to prevent the potential errors, or latent conditions, that lead up to the human error.
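The model's logic can be illustrated with a short calculation: an accident requires every barrier to fail at once, so, if the barriers are treated as independent, the overall probability is the product of the individual failure probabilities. This is a minimal sketch; the barrier names follow the HFACS tiers, and the failure probabilities are hypothetical.

```python
# Swiss Cheese Model sketch: an accident occurs only when every
# defensive barrier fails at the same time. Assuming independent
# barriers, P(accident) is the product of the failure probabilities.
# The probabilities below are hypothetical, for illustration only.

def accident_probability(barrier_failure_probs):
    """Probability that all barriers fail simultaneously."""
    p = 1.0
    for prob in barrier_failure_probs:
        p *= prob
    return p

barriers = {
    "organizational influences": 0.05,
    "unsafe supervision": 0.10,
    "preconditions for unsafe acts": 0.20,
    "unsafe act (pilot)": 0.30,
}

p_accident = accident_probability(barriers.values())
print(f"P(accident) = {p_accident:.6f}")  # 0.05*0.10*0.20*0.30 = 0.000300
```

The point of the sketch matches the model: strengthening any single layer (lowering its failure probability) reduces the overall accident probability, which is why prevention efforts need not focus on the pilot alone.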
Determining why the pilot made a mistake requires a safety model such as the Human Factors Analysis and Classification System (HFACS). Often the barriers fail, leaving the pilot as the last line of defense against the accident. In this case, the goal is to identify what potential errors occurred and why.
Components of the HFACS Model
To bring the human factor element into the equation, the Human Factors Analysis and Classification System (HFACS) describes potential failures across four components: organizational influences, unsafe supervision, preconditions for unsafe acts, and the unsafe acts themselves (Wiegmann & Shappell, 2003). The analysis reviews each layer to determine the potential factors related to the human error that results in accidents. Each component of the HFACS is designed to uncover the potential errors behind the pilot's mistake.
HFACS Model Chart
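As a rough sketch, the four tiers and their subcategories can be represented as a simple lookup table that files an investigation finding under its HFACS tier. The subcategory names follow Wiegmann and Shappell (2003); the classify helper and the example findings are hypothetical.

```python
# A minimal sketch of the HFACS four-tier framework as a lookup table.
# Subcategory names follow Wiegmann & Shappell (2003); the classify()
# helper is a hypothetical illustration, not an official tool.

HFACS = {
    "organizational influences": [
        "resource management", "organizational climate", "operational process",
    ],
    "unsafe supervision": [
        "inadequate supervision", "planned inappropriate operations",
        "failure to correct a known problem", "supervisory violations",
    ],
    "preconditions for unsafe acts": [
        "environmental factors", "condition of the operator", "personnel factors",
    ],
    "unsafe acts": [
        "decision errors", "skill-based errors", "perceptual errors", "violations",
    ],
}

def classify(subcategory):
    """Return the HFACS tier that contains the given subcategory."""
    for tier, subcategories in HFACS.items():
        if subcategory in subcategories:
            return tier
    return "unclassified"

print(classify("inadequate supervision"))  # unsafe supervision
print(classify("perceptual errors"))       # unsafe acts
```

Walking each finding up the table in this way is what "connecting the dots" means in practice: a single unsafe act usually traces back to entries in the three tiers above it.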
Organizational Influences
Decisions of upper management affect supervisory practices and the conditions and actions of the operator. Potential errors revolve around causes of an accident related to resource management, organizational climate, and operational processes. Organizational errors often go unnoticed and stem from human resources, equipment, policy, and culture. A sound regulatory process includes procedures that standardize, define, instruct, and provide proper safety oversight (Wiegmann & Shappell, 2003).
Unsafe Supervision
Unsafe supervision comprises inadequate supervision, planned inappropriate operations, failure to correct a known problem, and supervisory violations. The role of any supervisor is to offer the opportunity to succeed. To do this, the supervisor must provide guidance, training, and leadership, and must motivate employees (Wiegmann & Shappell, 2003).
Supervisors must set the proper example by acting on known deficiencies and preventing the potential errors that may lead to an accident. One example is a pilot who performs improper landings while supervision allows the operation to continue uncorrected (Cohen, 2013). Thus, supervision must become proactive rather than reactive to aviation accidents.
Preconditions for Unsafe Acts
The preconditions for unsafe acts involve environmental factors, conditions of the operator, and personnel factors. Human limitations, such as poor vision or incomplete knowledge of each maneuver required to land an airplane, may hurt performance (Cohen, 2013). Human factors such as distraction, fatigue, loss of situational awareness, and general design elements all resemble preconditions for an unsafe act (Blum, 2017). Individuals and organizational leaders often ignore a problem until the accident occurs (FAA, 2007). Preconditions require a proactive approach: identifying potential errors and developing controls that reduce an accident's likelihood.
Unsafe Acts
Unsafe acts are errors of decision, skill, and perception. Violations refer to the routine bending or breaking of the rules (Cohen, 2013). Active failures associated with landing an airplane are errors and violations with an immediate negative effect caused by the pilot. However, this article emphasizes identifying the potential errors, such as a lack of resources or inadequate training, that set up the actual error. A proactive approach requires identifying the landing hazard before an accident. For example, landing requirements beyond the FAA minimum standard of three landings every 90 days would build proficiency and lower the risk of a landing accident.
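A proactive currency check of this kind can be sketched in a few lines. The 90-day window and three-landing minimum come from the FAA standard mentioned above; the stricter six-landing requirement and the sample dates are hypothetical.

```python
from datetime import date, timedelta

# Sketch of a proactive landing-currency check. The 90-day window and
# three-landing minimum reflect the FAA standard discussed in the text;
# the stricter required=6 example and the sample dates are hypothetical.

def landings_in_window(landing_dates, today, window_days=90):
    """Count landings that fall within the trailing window."""
    cutoff = today - timedelta(days=window_days)
    return sum(1 for d in landing_dates if cutoff < d <= today)

def is_current(landing_dates, today, required=3):
    """True if the pilot meets the landing requirement in the window."""
    return landings_in_window(landing_dates, today) >= required

today = date(2021, 6, 1)
landings = [date(2021, 3, 10), date(2021, 4, 2), date(2021, 5, 20)]

print(is_current(landings, today))              # True: meets the minimum
print(is_current(landings, today, required=6))  # False: fails a stricter standard
```

Flagging a pilot who barely meets the minimum, before any incident occurs, is exactly the kind of proactive control the HFACS preconditions tier calls for.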
An investigation strategy requires a model that highlights the probable cause and the events that led up to the mishap. Aircraft accident reports need more than a probable-cause statement that points to the pilot. Instead, more detail is necessary to find what led to the pilot error and how to prevent the next accident. The HFACS model helps connect all of the dots that led to the crash.
Connecting the dots requires a detailed look at organizational influences, unsafe supervision, preconditions for an unsafe act, and the unsafe act itself. Naming pilot error as the accident's root cause resembles a reactive measure of the 20th century. Instead, a proactive approach should name the events that describe the larger picture. The HFACS presents a proactive approach that identifies the events leading to the root cause of the accident.
Supporting and understanding an accident requires a systematic process to formulate the cause and develop a prevention strategy. Determining the relevant errors and adding prevention methods requires a guide. The HFACS process identifies the facts behind the accident, recognizing that humans make mistakes when performing complex tasks.
Accident prevention is a challenging business centered on what steered the pilot toward a mistake. Here, the HFACS model provides a framework for understanding the potential errors and targeting preventable mistakes. Thus, the goal is not to blame but to understand the underlying factors that led the pilot to err.
Blum, S. (2017). Aircraft automation policy implications for aviation safety (Doctoral dissertation). https://pqdtopen.proquest.com/doc/1881313198.html?FMT=AI
Cohen, T. (2013). A human factors approach for identifying latent failures in healthcare setting (Doctoral dissertation). https://commons.edu/edt/290
Federal Aviation Administration. (2007). Fatigue in aviation. https://www.faa.gov/pilots/safety/pilotsafetybrochures/media/Fatigue_Aviation.pdf
Gordon, R. (1998). The contribution of human factors to accidents in the offshore oil industry. Reliability Engineering and System Safety, 61(1–2), 95–108.
Martinussen, M., & Hunter, D. R. (2018). Aviation psychology and human factors.
Reason, J. (1990). Human error. Cambridge University Press.
Wiegmann, D., & Shappell, S. (2003). A human error analysis of commercial aviation accidents using the Human Factors Analysis and Classification System (HFACS). https://www.faa.gov/data_research/research/med_humanfacs/oamtechreports/2000s/media/0103.pdf