Colgan Flight 3407: Human Factors of the Accident Case Study


This paper summarizes the human factors that could have contributed to the crash of Colgan Air, Inc. flight 3407, a Bombardier DHC-8-400 (N200WQ) operating as Continental Connection Flight 3407, on approach to Buffalo-Niagara International Airport. The aircraft crashed into a residence in Clarence Center, New York, about five nautical miles northeast of the airport, killing the four crew members, all 45 passengers, and one person on the ground. The cause of the accident was attributed to the flight crew's failure to monitor airspeed in relation to the rising position of the low-speed cue, the flight crew's failure to adhere to sterile cockpit procedures, the captain's failure to effectively manage the flight, and Colgan Air's inadequate procedures for airspeed selection and management during approaches in icing conditions.

The Human Factors Analysis and Classification System (HFACS) is a tool developed for investigating human error as a cause of aviation accidents. It was first applied by the US military and has since been applied to commercial aviation accident records maintained by the National Transportation Safety Board (NTSB). HFACS is critical for identifying the safety issues that most need to be addressed and that warrant intervention research.

Unlike mechanical failures, the causes of human error are generally elusive and qualitative, and the techniques used to analyze them are less sophisticated than those used to investigate mechanical failures. This makes human error analysis a tedious practice, and as a result, the information entered in accident databases is often ill-defined. Because of the limited feedback this gives government and airlines, it becomes more difficult to determine the proper types of safety or research programs to sponsor. HFACS describes three types of errors: skill-based, perceptual, and decision errors. The most common form is the decision error (Reason, 1997).
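To make the taxonomy concrete, the three error types can be represented as a small classification structure for tagging accident-database entries. The Python sketch below is purely illustrative and not part of any HFACS tooling; the example records are hypothetical.

```python
from collections import Counter
from enum import Enum

class UnsafeActError(Enum):
    """The three error types HFACS describes at the unsafe-acts level."""
    SKILL_BASED = "skill-based"  # breakdown of practiced, automatic behavior
    DECISION = "decision"        # the action went as planned, but the plan was inadequate
    PERCEPTUAL = "perceptual"    # the action was based on degraded sensory input

# Hypothetical accident-database entries, each tagged with an error type.
records = [
    ("failed to monitor airspeed on approach", UnsafeActError.SKILL_BASED),
    ("continued the approach despite icing cues", UnsafeActError.DECISION),
    ("misjudged descent rate in poor visibility", UnsafeActError.PERCEPTUAL),
]

# Aggregating entries by category is the kind of feedback the text says
# government and airlines need in order to target safety programs.
counts = Counter(error for _, error in records)
for error_type, n in counts.items():
    print(f"{error_type.value}: {n}")
```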

Decision-making errors in aviation are considered errors in naturalistic contexts, meaning that the decision-maker acts according to their own understanding of the situation. There are, however, major problems in determining decision errors in such contexts. To begin with, there is no clear definition of "correctness": the best decision may not be as well defined as it would be in a structured laboratory test. Secondly, one cannot judge a decision simply from the outcome of the event (Reason, 1997); the situation may be beyond the decision-maker's control, resulting in an undesirable outcome despite a sound decision. It is therefore important to understand the nature of naturalistic decision making, especially in the aviation industry.

Naturalistic decision making involves a close analysis of the current situation and a definition of the nature of the problem and its contributing factors, from which a solution is derived and applied. Unfortunately, most decisions do not involve an exhaustive analysis of all of these steps.

Naturalistic situations tend to compel the decision-maker to "satisfice": they opt for a decision that meets their immediate goal. Models such as Multi-Attribute Utility Theory often cannot be applied because they require time to thoroughly investigate all options before arriving at a decision. In an emergency, the decision-maker is more likely to rely on their own expertise and flexibility (Geller, 2000).
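The contrast between exhaustive evaluation and satisficing can be made concrete. Under Multi-Attribute Utility Theory, each option a is scored as a weighted sum U(a) = Σ w_i · u_i(a) over all of its attributes, and the highest-scoring option is chosen; a satisficer instead accepts the first option whose score clears a threshold. The sketch below contrasts the two; the options, attribute scores, weights, and threshold are all hypothetical values chosen for illustration.

```python
# Contrast between exhaustive MAUT evaluation and satisficing. Options,
# attribute scores (0 to 1), weights, and the threshold are hypothetical
# values chosen for illustration.

options = {
    "hold and reassess":   {"safety": 0.7, "schedule": 0.5, "fuel": 0.5},
    "continue approach":   {"safety": 0.5, "schedule": 0.9, "fuel": 0.8},
    "divert to alternate": {"safety": 0.9, "schedule": 0.3, "fuel": 0.4},
}
weights = {"safety": 0.6, "schedule": 0.25, "fuel": 0.15}

def utility(scores: dict) -> float:
    """MAUT: U(a) is the weighted sum of an option's attribute scores."""
    return sum(weights[attr] * value for attr, value in scores.items())

# Exhaustive MAUT: score every option, then take the maximum.
maut_choice = max(options, key=lambda name: utility(options[name]))

# Satisficing: accept the first option whose utility clears a threshold,
# without evaluating the remaining options. Faster, but possibly suboptimal.
THRESHOLD = 0.6
satisficed_choice = next(
    (name for name, scores in options.items() if utility(scores) >= THRESHOLD),
    None,
)

print(f"MAUT choice:       {maut_choice}")        # divert to alternate (0.675)
print(f"Satisficed choice: {satisficed_choice}")  # hold and reassess (0.620)
```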

Decision-making involves two components: situation assessment and choosing a course of action. In situation assessment, the risk is identified, its level is determined, and the time available to solve the problem is evaluated. Once all of this has been established, a course of action is chosen. Colgan Air's inadequate procedures for airspeed selection and management during approaches in icing conditions fall under this category, because the problem should have been evident to someone with domain expertise.

Errors may thus occur in two ways: wrong interpretation of the situation and choosing the wrong course of action. Cues about the situation may be adequately detected but misinterpreted or ignored, the available time may be misjudged, or the risk level may be wrongly assessed. Course-of-action errors may occur because the decision-maker fails to retrieve the appropriate response from memory, or because the response is not known to them. The decision-maker may also retrieve only one response when there are actually several to choose from, or may fail to visualize the consequences of their response to the situation.
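The two stages, and the two corresponding places errors can enter, can be sketched as a simple pipeline. This is a schematic illustration of the model described above, not an operational system; the cue names, values, and response rules are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Assessment:
    problem: str             # the crew's interpretation of the cues
    risk: str                # assessed risk level
    time_available_s: float  # judged time available to resolve the problem

def assess_situation(cues: dict) -> Assessment:
    """Stage 1: turn raw cues into a problem statement, a risk level, and a
    time budget. Misreading or ignoring a cue here is a situation-assessment
    error."""
    if cues.get("airspeed_kts", 0.0) < cues.get("min_safe_kts", 0.0):
        return Assessment("low airspeed", "high", time_available_s=10.0)
    return Assessment("nominal", "low", time_available_s=float("inf"))

def choose_action(assessment: Assessment) -> str:
    """Stage 2: retrieve a response for the assessed problem. Retrieving the
    wrong response, or only one of several possible responses, is a
    course-of-action error."""
    responses = {
        "low airspeed": "add power and lower the nose",
        "nominal": "continue the approach",
    }
    return responses.get(assessment.problem, "no response retrieved")

cues = {"airspeed_kts": 130.0, "min_safe_kts": 140.0}  # hypothetical values
print(choose_action(assess_situation(cues)))  # -> add power and lower the nose
```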

A variety of factors contribute to decision errors. To begin with, a certain level of ambiguity exists when making decisions: the cues that indicate a problem may not be clear to the decision-maker, the situation may deteriorate gradually, and the decision initially made may prove ineffective. Some events occur so infrequently that the decision-maker lacks the experience to determine a course of action. The weather and aircraft systems are dynamic, that is, ever-changing, which can pose a great challenge to situation assessment. Some decisions carry significant consequences, and if the situation is ambiguous, changing the course of action may prove difficult. Such decisions include rejecting a take-off or delaying a flight (Geller, 2000).

Secondly, the risk may be underestimated. This often happens when the crew has faced a similar situation before and a particular course of action worked to their advantage; the crew takes the same course of action, hoping it will lead to the same favorable outcome.

Thirdly, organizational pressures and the overall safety of the flight may conflict. Most organizations advocate fuel savings, on-time arrivals, on-time departures, and the like, and the pilot may overlook the safety of the flight and take a risk in order to meet the organization's expectations. Social factors such as peer pressure may also encourage the pilot to engage in risky behavior that puts the flight at risk.

Fourthly, consequences the decision-maker had not anticipated may arise, especially under time pressure. The decision-maker may be unable to visualize the outcome of their course of action, or may realize their mistake too late. Under stress, the decision-maker may fail to retrieve the appropriate response even though they have sufficient knowledge to deal with the situation.

Skill-based errors are another category. People put little or no conscious effort into turning the steering wheel or changing gear in an automobile; similarly, pilots exercise skills such as stick-and-rudder movements and visual scanning without conscious thought. Unfortunately, these practiced, automatic behaviors are vulnerable to failures of attention and/or memory. The skill and manner with which the pilot flies the aircraft also affect the safety of the flight, and the pilot's failure to effectively manage the flight can be attributed to skill-based errors (Johnson, 2003).

Perceptual errors occur when sensory input is degraded. The aircrew may misjudge distances, altitude, or descent rates, or react inappropriately to a variety of visual or vestibular illusions (Johnson, 2003).

Other factors that contribute to aviation accidents are substandard conditions of the operator, which fall into three categories. The first is adverse mental states, where the operator's mental state is believed to affect performance; this includes fatigue, pernicious attitudes such as overconfidence, and loss of situational awareness, all of which contribute to unsafe acts. The second is adverse physiological states that undermine the safe conduct of a flight: factors such as intoxication, illness, and other pharmacological or medical conditions directly affect performance. The third is the physical and mental limitations of the operator: there are situations in which individuals lack the skill or time to deal safely with the available sensory information, or in which that information is simply unavailable to them.

Substandard practices of the operator, such as poor crew resource management, may lead to confusion and poor coordination in the cockpit. Crew resource management here refers to failures of both intra- and inter-cockpit communication, as well as a lack of communication between the crew and ATC or other ground personnel. Under substandard practices there are instances where crew personnel fail to work together as a team before, during, and after a flight. Fatigue, in cases where the crew is not allowed enough time to rest, may result in inadequate preparation for the flight. Personal readiness also accounts for instances where the crew does not observe good dietary practices or violates rules and restrictions.

Crews are entirely responsible for their own actions and are therefore held accountable for the consequences. However, there are latent failures that can be attributed to those who supervise the crew, resulting from breakdowns in the chain of command, whether through supervisory action or inaction. Crews are expected to be trained and given adequate professional guidance, oversight, and operational leadership; when these are not provided, crews end up feeling isolated as day-to-day operational risks continue to increase. Supervisory failure takes several forms. The first is planned inappropriate operations, which includes all aspects of improper crew scheduling and planning, such as crew pairing or crew rest. The remaining two forms are failure to correct known problems and supervisory violations; the two are similar but are treated as distinct under HFACS.

Cases where the supervisor is aware of deficiencies among individuals or in equipment but allows the flight to proceed are classified as failure to correct known problems; for example, the supervisor knowingly tolerates inappropriate behavior that fosters an unsafe atmosphere even though the behavior breaks no particular rule. Instances where strict rules and regulations are ignored or willfully violated by the supervisor are considered supervisory violations; for example, the supervisor allows the crew to fly without current or valid qualifications, setting the stage for an accident (Shappell & Wiegmann, 2001).

Organizational influences, such as unsound decisions by upper-level management, can directly affect supervisory practices, yet they usually go unnoticed or unreported by even the most skilled and honest accident investigators. These organizational failures are usually attributed to three causes: resource management, organizational climate, and operational processes. The management, organization, and allocation of organizational resources fall under resource management; corporate decisions in this area usually balance two objectives, safety and the on-time factor, and in times of fiscal austerity, safety is rarely spared when budget cuts have to be made. Organizational climate concerns the organization's treatment of individuals; communication and coordination are vital within the organization. Operational process concerns whether the procedures and oversight within the organization are observed. If these factors are managed poorly, they may have a negative impact on operator performance and system safety.
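Taken together, the four tiers discussed in this paper (unsafe acts, preconditions for unsafe acts, unsafe supervision, and organizational influences) can be used to organize the findings cited in the introduction. The sketch below is this paper's illustrative reading of where those findings would sit, not an official NTSB classification.

```python
# An illustrative mapping of the findings cited in the introduction onto
# the four HFACS tiers discussed above. The tier assignments are this
# paper's reading of the material, not NTSB classifications.
hfacs_tiers = {
    "Unsafe acts": [
        "failure to monitor airspeed relative to the low-speed cue",
        "failure to adhere to sterile cockpit procedures",
    ],
    "Preconditions for unsafe acts": [
        "fatigue and inadequate personal readiness",
        "poor crew resource management",
    ],
    "Unsafe supervision": [
        "improper crew scheduling, pairing, and rest planning",
        "failure to correct known problems",
    ],
    "Organizational influences": [
        "inadequate procedures for airspeed selection and management "
        "during approaches in icing conditions",
    ],
}

for tier, findings in hfacs_tiers.items():
    print(tier)
    for finding in findings:
        print(f"  - {finding}")
```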

References

Geller, E. (2000). Behavioral safety analysis: A necessary precursor to corrective action. Journal of Professional Safety, 22(2), 37-56.

Johnson, C. W. (2003). A Handbook of Accident and Incident Reporting. Glasgow: Glasgow University Press.

Reason, J. (1997). Managing the Risks of Organizational Accidents. New York: Ashgate Publishing Company.

Shappell, S., & Wiegmann, D. (2001). Applying Reason: The Human Factors Analysis and Classification System (HFACS). Human Factors and Aerospace Safety, 1(1), 59-86.
