Critical Evaluation of Workload Methodologies
Workload is an essential notion in the discipline of human resource management; in the field of aviation, it gains critical importance because it is one of the factors on which the safety of airport and aircraft personnel depends. Inappropriate levels of workload may inhibit the performance of pilots, dispatchers, or security personnel, leading to mistakes in aircraft control and air route management or to an increased risk of terrorism. It is therefore of paramount importance that airport staff are assigned an amount of work that remains within acceptable limits of workload (both upper and lower).
The workload can be defined as a “required demand for the human” (Wise, Hopkin & Garland 2009, p. 14-4); in other words, it is the amount of work that a certain employee needs to do. However, this definition depends only on the external source of the demand, the party requiring the work to be done, and largely ignores the state of the employee. It may therefore be preferable to use a different definition of workload, for instance one that identifies it as the ratio of the human resources available for performing a certain task to the amount of resources needed to perform it (Wise, Hopkin & Garland 2009).
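The ratio-style definition above can be illustrated with a short sketch. The function names, thresholds, and units below are hypothetical, introduced purely for illustration; they do not come from Wise, Hopkin and Garland (2009).

```python
# Illustrative sketch of the ratio-style definition above: workload as the
# ratio of resources available to resources needed. All names and the
# acceptable-limit thresholds are hypothetical, chosen for illustration only.

def workload_ratio(resources_available: float, resources_needed: float) -> float:
    """Return available/needed; values below 1.0 mean demand exceeds capacity."""
    if resources_needed <= 0:
        raise ValueError("resources_needed must be positive")
    return resources_available / resources_needed

def classify(ratio: float, overload_below: float = 1.0,
             underload_above: float = 3.0) -> str:
    """Flag overload/underload against hypothetical acceptable limits."""
    if ratio < overload_below:
        return "overload"      # not enough resources for the demand
    if ratio > underload_above:
        return "underload"     # far more resources than the task requires
    return "acceptable"

print(classify(workload_ratio(8, 10)))   # 0.8 -> overload
print(classify(workload_ratio(15, 10)))  # 1.5 -> acceptable
```

Under this formulation, both tails of the ratio are flagged, mirroring the point made throughout this essay that underload and overload are both hazardous.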
According to Stanton et al. (2010, p. 127), both excessively low and excessively high levels of workload can lead to adverse consequences, including stress, fatigue, mental saturation, high error rates, boredom, complacency, and lowered vigilance. To avoid such problems, it is crucial to ensure that employees’ workload levels are appropriate, and several methods exist for measuring workload. Three types of workload assessment have proved to be of the greatest utility, particularly in human-machine interface design: the measurement of performance, the assessment of subjective ratings, and the estimation of physiological parameters (Wise, Hopkin & Garland 2009).
Three main types of performance measurement are distinguished: primary-task measures, which assess reaction time, speed of performance, and so on; secondary-task measures, which assess an employee’s performance on secondary tasks while the employee is also engaged in executing a primary task; and reference task assessments, which estimate performance before and after the execution of the primary task and compare it with performance on a reference task (Wise, Hopkin & Garland 2009).
While these three performance measurements are useful, they are unsuitable for a vast array of situations because, at some levels of task difficulty, the performance of employees is not correlated with the amount of effort that they put into their work (Wise, Hopkin & Garland 2009). Therefore, it is also possible to employ subjective measures, such as the Subjective Workload Assessment Technique, or SWAT; however, it is not advised to utilise such methods on their own, for their results can be affected by factors other than workload (Wise, Hopkin & Garland 2009). Apart from subjective measures, physiological measures can also be employed (Lean & Shan 2012). Some of the most frequently used physiological measures include electrocardiography and heart rate measurements (Dahlstrom & Nahlinder 2009; Lean & Shan 2012).
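As a concrete illustration of a heart-rate-based measure, the sketch below computes RMSSD (the root mean square of successive differences of R-R intervals), a standard heart rate variability statistic; lower variability is often associated with higher mental workload. This is a generic computation, not a procedure taken from the cited sources, and the sample interval values are invented.

```python
# Illustrative sketch: RMSSD, a common heart rate variability statistic
# computed from consecutive R-R intervals (in milliseconds). The example
# interval sequences below are invented for demonstration purposes.
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences of R-R intervals."""
    if len(rr_intervals_ms) < 2:
        raise ValueError("need at least two R-R intervals")
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rest = [800, 810, 790, 805, 795]   # varied intervals: higher variability
load = [700, 702, 698, 701, 699]   # uniform intervals: lower variability
assert rmssd(load) < rmssd(rest)   # reduced variability under load
```

In practice, such a statistic would be computed over windows of real electrocardiographic data and interpreted alongside the performance and subjective measures discussed above, never in isolation.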
In the field of aviation, employee workload levels must be appropriate, for if there is an overload or, less commonly, an underload (Stanton et al. 2010), the lives of both staff and passengers may be threatened. Most obviously, this issue pertains to the personnel responsible for controlling flight paths and operating the aircraft, such as pilots and air communicators. For instance, if a pilot is mentally overloaded and has to deal with too much information, an instance of “load shedding” might occur in which the pilot forgets a piece of important information. However, the danger of overload also applies to the rest of the airport staff; for instance, security officers with inappropriate workload levels may suffer from reduced vigilance and miss an illegal or potentially dangerous object carried by passengers boarding the aircraft, increasing the threat of terrorism (Sweet 2008).
To ensure that the workload of airport security personnel is appropriate, certain steps can be taken in security management. For instance, Richardson, Gordon and Moore (2007, p. 83) stress that funding allocations for security measures should be made considerably more often than once a year, preferably monthly, and that this funding should be based on a transparent formula for determining workload. This formula should be precise; it ought also to be based on calculations carried out by several monitoring organisations, not only by airport management (Richardson, Gordon & Moore 2007). Similarly, the allocation of security personnel should be revisited frequently and, because situations in airports tend to change rapidly, monitored regularly to ensure that there is always enough staff to properly screen passengers and their luggage (Sweet 2008). Utilising innovative technology can also help reduce the workload of security personnel owing to the higher efficiency such technology grants (Richardson, Gordon & Moore 2007).
It is also important to emphasise that too little workload harms employee performance as well, as demonstrated by the Yerkes-Dodson Law (Corbett 2015). According to this law, arousal is related to performance in a manner that can be visualised as an inverted-U curve: as arousal increases, performance also increases, but only until a certain point of arousal is reached, after which performance starts to decrease. This turning point may differ between individuals (Corbett 2015). Thus, when an employee has too low a workload, especially one involving repetitive or routine tasks, his or her arousal is low, and performance will consequently be suboptimal. However, one should not try to increase arousal indefinitely, for doing so will also harm performance.
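The inverted-U relationship can be sketched with a toy quadratic model. The quadratic form and the optimum point `a_opt` (standing in for the individual turning point) are illustrative assumptions, not a model from Corbett (2015).

```python
# Toy inverted-U model of the Yerkes-Dodson relationship. The quadratic
# shape and the optimum 'a_opt' are illustrative assumptions, not fitted
# values from any study; arousal is on an arbitrary 0..1 scale.

def performance(arousal: float, a_opt: float = 0.5) -> float:
    """Peaks at a_opt (the individual turning point), declines on either side."""
    return max(0.0, 1.0 - ((arousal - a_opt) / a_opt) ** 2)

# Performance rises with arousal up to the turning point, then falls.
assert performance(0.1) < performance(0.5) > performance(0.9)
```

The symmetry of this toy curve is itself an assumption; the essential point it captures is only that performance is degraded at both the underload and overload extremes.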
Proper levels of workload in airports are paramount, not only because of possible harm to profits, but primarily because of the safety risks that inappropriate workload levels pose to both personnel and passengers. Several techniques for assessing workload exist and should be used regularly to estimate the workload levels of airport personnel; action should be taken if these levels are too high or too low. In particular, the workload of security officers should be measured and corrected more often so that their vigilance remains high and the risk of terrorism does not increase.
Case Study of CRM and Systemic Failures in an Accident
Abstract
The current paper provides an analysis of an aviation accident that took place in Canada in 2011 and considers some of the factors that caused it. The paper uses the official investigation report of the Transportation Safety Board of Canada as the source of information about the crash, together with a variety of scholarly sources (e.g., books and journal articles) for further theoretical analysis of the issue. It was found that the accident occurred, to a large extent, as a result of failed communication between the pilots, which in turn resulted from the lack of proper and precise phraseology and the absence of guidelines for emergencies in CRM protocols. It is concluded that, to avoid aircraft crashes for similar reasons in the future, it is essential to upgrade CRM protocols by adding standard phrases permitting fast, clear, and precise communication in emergencies, as well as overall guidelines for behaviour in emergencies.
Introduction
Crew resource management (CRM) is a set of management methods and systems which allow for maximising the effectiveness and efficiency of procedures in situations where close and precise coordination of crew members is of critical importance because their errors may lead to serious adverse consequences. CRM developed from cockpit resource management, which was introduced in aviation in 1978 and grew to be used in other branches of industry as well (Kanki, Helmreich & Anca 2010). However, CRM remains critical in aviation because it greatly increases the efficacy of collaboration between crew members; failures to adhere to the guidelines of CRM, or systemic errors in the principles of CRM in aviation, as will be shown, may lead to devastating and even lethal outcomes.
This paper considers an aviation accident which took place on 20 August 2011, involving a charter flight from Yellowknife Airport in the Northwest Territories (Canada) to Resolute Bay Airport in Nunavut (Canada). First, brief general information is provided about the accident, and the training and condition of the members of the crew are considered. After that, the communication problems that took place between the two pilots before the accident are scrutinised in detail; some of the most important instances of miscommunication are discussed. Next, these instances are analysed further and are associated with several systemic failures from which they probably resulted before the crash. After that, recommendations which might help avoid similar problems in the future are made. Finally, some overall conclusions related to the situation in question and the use of CRM in aviation are supplied.
The Accident: A General Description
According to the investigation, the situation was, on the whole, as follows. The aircraft, a Boeing 737-210C, was carrying 15 people: 11 passengers and 4 crew members; only 3 passengers survived the crash, sustaining serious injuries (Transportation Safety Board of Canada 2014b). It was concluded that the accident was a controlled flight into terrain (Transportation Safety Board of Canada 2014a).
The investigation revealed that the crash was caused by several factors: a failure to intercept the runway localiser beam, which led the aircraft to deviate to the right of the required course and, consequently, to crash into a hill; an unstable approach maintained until the moment of the crash; and a difference in the two pilots’ understanding of the deviation from the course, together with their failure to communicate effectively and take proper corrective action (Transportation Safety Board of Canada 2014b). It is also stressed that the pilots’ communication problems played a key role in the outcome, which implies that considerable improvements in the sphere of CRM need to be made to avoid accidents for similar reasons in the future (Transportation Safety Board of Canada 2014b).
Crew Training and Condition
The captain of the aircraft received his first training in CRM in 1997 and his last recurrent CRM instruction in November 2010; his fatigue was not judged to be a contributing factor to the crash (Transportation Safety Board of Canada 2014a, p. 7). The first officer (FO) obtained his first training in CRM in 2007 and finished his last recurrent training in March 2011; in his case, too, fatigue was judged not to be a contributing factor. Although both pilots received CRM training on multiple occasions, the training at the First Air company was probably shortened owing to the desire to “accommodate for other training requirements” (Transportation Safety Board of Canada 2014a, p. 135).
Communication Problems Between Pilots
The investigation revealed miscommunication between the captain and the FO before the crash (Transportation Safety Board of Canada 2014a, p. 112). On the day of the crash, during the 37-second period from 16:38:19 to 16:38:56, the captain was monitoring the autopilot as usual, which was supposed to carry out the turn onto the final approach; he was also directing the FO to begin configuring the aircraft for landing. At a certain point, the captain made a call for the landing gear, meaning that the FO was supposed to reach for the landing gear lever and switch it. The problem was that this call occurred just before 16:38:43, when the aircraft was passing through a 1-dot deviation from the appropriate course.
At that point, the FO should have called “Localiser”, and the captain should have responded “Correcting”; however, because the FO was busy with other tasks, including the operation of the lever, he missed the callout (Transportation Safety Board of Canada 2014a, p. 112). It is also noted that the captain decided that configuring the aircraft was more important than his colleague’s communication with the tower, a view that does not contradict the general principles of aviation; nevertheless, the result was that the FO’s workload increased greatly during the indicated 37-second period, which is probably why he missed calling out “Localiser” (he might not have noticed the deviation, or he might have assigned it a lower priority than the radio calls or configuration changes), and the aircraft continued deviating from the course (Transportation Safety Board of Canada 2014a, p. 112).
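The missed exchange can be sketched as a minimal model of the SOP callout logic described above: once the localiser deviation reaches one dot, the pilot monitoring calls “Localiser” and the pilot flying acknowledges “Correcting”. The function and parameter names are hypothetical; this is an illustration of the procedure as described in the report, not code from any real avionics or training system.

```python
# Hypothetical sketch of the deviation-callout procedure described above:
# at a localiser deviation of one dot or more, the pilot monitoring calls
# "Localiser" and the pilot flying acknowledges "Correcting". Names and
# the threshold parameter are illustrative only.

def deviation_callout(deviation_dots: float, threshold_dots: float = 1.0):
    """Return the expected callout exchange for a given localiser deviation,
    or None when the deviation is below the callout threshold."""
    if abs(deviation_dots) >= threshold_dots:
        return ("Localiser", "Correcting")  # (PM callout, PF acknowledgement)
    return None

print(deviation_callout(0.4))  # below threshold: no callout expected
print(deviation_callout(1.2))  # at/beyond one dot: exchange expected
```

The sketch makes the failure mode concrete: the procedure fires only if the pilot monitoring has the spare capacity to notice the deviation in the first place, which is precisely what the FO’s elevated workload prevented.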
There were further problems with the communication between the pilots. For instance, during the period from 16:39:13 to 16:39:30, the FO uttered five statements on the progressive deviation of the plane from the appropriate course, and the captain acknowledged only the first of them (Transportation Safety Board of Canada 2014a, p. 113). However, during the time from 16:39:33 to 16:39:36, the captain made two statements from which it is possible to conclude that he was satisfied with the tracking carried out by the autopilot; it is apparent that he was convinced that the autopilot was in VOR/LOC mode, while in fact it had previously switched to MAN and HDG HOLD mode (Transportation Safety Board of Canada 2014a, p. 114). The FO, meanwhile, most likely believed that the plane was still deviating from the desired course; this difference in beliefs was a sign that the two pilots did not share an awareness of the situation (Transportation Safety Board of Canada 2014a, p. 114).
Other instances of miscommunication occurred: the FO’s voiced concern at 16:40:11 that the plane was not on the localiser drew a response from the captain at 16:40:14 suggesting that the captain was convinced that the autopilot had captured the localiser (Transportation Safety Board of Canada 2014a, p. 116). The FO disagreed at 16:40:17, stating that the localiser was not captured, and pointed out that there was a hill near the airport (into which the plane eventually crashed); at 16:40:23, he repeated his concerns (Transportation Safety Board of Canada 2014a, p. 116). Thus, the FO was aware of the aircraft’s deviation from its course, but he voiced his concerns by pointing out the deviation rather than by identifying its possible consequences, and the captain ignored them.
Furthermore, each of the pilots was attempting to address a different problem. At 16:40:30, the FO proposed making a go-around but did so informally and unassertively, without actually using the term “go-around”; his suggestion failed to persuade the captain (Transportation Safety Board of Canada 2014a, p. 117). There was no standard phraseology in the Standard Operating Procedures (SOP) guidance that applied to the situation; furthermore, the SOP required the FO to supply information to the captain, with the captain making the decisions (Transportation Safety Board of Canada 2014a, p. 118). Therefore, although the FO acted somewhat unassertively, his behaviour was consistent with the protocol. Unfortunately, his actions did not persuade the captain to take action until it was too late: a go-around was initiated 2 seconds before the crash.
CRM Failures Apparent from the Case
Thus, several failures became apparent from this case. The most obvious one is the lack of standard phraseology in the SOP protocols. This is a systemic failure, and such phraseology is crucial for clear and exact communication between pilots (Kanki, Helmreich & Anca 2010, p. 127).
Another failure is related to the fact that in this critical situation, both pilots were experiencing increased workload and, most likely, stress; this combination of factors can lead to attentional narrowing and confirmation bias (Transportation Safety Board of Canada 2014a; Kanki, Helmreich & Anca 2010). Moreover, the CRM protocols used by the crew lacked proper instructions for extreme situations (Transportation Safety Board of Canada 2014a, pp. 135-137). Because of this omission, the captain probably perceived his course of action as desirable and the FO’s suggestions, which were not strong or assertive enough, as alternative and undesirable (LeSage, Dyar & Evans 2011). The captain was likely experiencing plan continuation bias (Moriarty 2015, p. 54); he failed to notice that his plan was no longer applicable.
In such a situation, a direct voicing of the possible consequences by the FO (e.g., “We will crash into that hill”, or a standardised statement with similarly direct content) might have “woken” the captain from his bias (Salas & Maurino 2010). However, the FO was expected only to provide statements of fact, which failed to affect the captain (Transportation Safety Board of Canada 2014a). The lack of a clear protocol for extreme situations undermined communication between the two pilots; in addition to their already increased workload, they also had to deal with communication problems (LeSage, Dyar & Evans 2011).
Avoiding Similar Accidents in the Future
Upon examining the crash investigation and analysing the contributing factors, it is possible to hypothesise that the accident could have been avoided if the CRM protocols had included additional means of communication and better instructions for emergencies. Standardised phrasing for indicating critical situations might have helped the FO wake the captain from the confirmation bias that had resulted from increased workload and stress (Moriarty 2015). Additionally, the FO’s referring directly to the possible consequences of following the captain’s course of action might have had the same effect (Salas & Maurino 2010). Therefore, CRM protocols must be upgraded to include both instructions and standardised language for extreme situations (LeSage, Dyar & Evans 2011). The training provided for aircraft crews should also be upgraded to ensure that all pilots receive a complete set of knowledge and skills during CRM training; such training should not be shortened, as was the case at the First Air company (O’Connor et al. 2008). Such upgrades are paramount if similar situations are to be avoided in the future.
Conclusion
Several factors caused the aircraft in question to crash, but the investigation revealed that one of the most important was related to problems in communication between the captain and the FO. On the whole, both pilots experienced increased workload and, most likely, stress, both of which hindered their communication. In particular, the captain failed to notice that the aircraft had deviated considerably from the intended course, and he disregarded the FO’s numerous warnings that the localiser was not captured, that the nearby hill posed a danger, that the aircraft should make a go-around, and more.
Simultaneously, the FO was using non-assertive phrases to state the problem, did not point out the possible danger directly, and did not make a direct suggestion of a go-around. He only informed the captain about what he noticed, and his warnings were ignored by the captain due to attentional narrowing as well as confirmation and plan continuation biases (Kanki, Helmreich & Anca 2010; Moriarty 2015). The FO was acting consistently with the CRM guidelines used in the company, for he was to provide information for the captain (Transportation Safety Board of Canada 2014a). However, the CRM protocols in use failed to provide instructions for emergencies and lacked a standardised language to describe the situation in which the crew found themselves, which resulted in the lack of understanding between the pilots.
Thus, it should be concluded that the CRM protocols used in the situation were not satisfactory and that they ought to be improved by adding standardised phrases to describe hazardous situations, including direct phrases that might “wake” a pilot from biases resulting from stress. Moreover, the protocols should be upgraded to enhance communication in emergencies, where workload and stress are excessive, and precise and quick communication is crucial. Also, all crew members must receive full training in CRM; as demonstrated by this tragic and avoidable crash, it cannot be shortened in favour of any other activities.
Reference List
Corbett, M 2015, “From law to folklore: work stress and the Yerkes-Dodson Law”, Journal of Managerial Psychology, vol. 30, no. 6, pp. 741-752.
Dahlstrom, N & Nahlinder, S 2009, “Mental workload in aircraft and simulator during basic civil aviation training”, The International Journal of Aviation Psychology, vol. 19, no. 4, pp. 309-325.
Kanki, BG, Helmreich, RL & Anca, J 2010, Crew resource management, 2nd edn., Academic Press, Burlington, MA.
Lean, Y & Shan, F 2012, “Brief review on physiological and biochemical evaluations of human mental workload”, Human Factors and Ergonomics in Manufacturing & Service Industries, vol. 22, no. 3, pp. 177-187.
LeSage, P, Dyar, JT & Evans, B 2011, Crew resource management: principles and practice, Jones and Bartlett Publishers, Sudbury, MA.
Moriarty, D 2015, Practical human factors for pilots, Academic Press, Waltham, MA.
O’Connor, P, Campbell, J, Newon, J, Melton, J, Salas, E & Wilson, K 2008, “Crew resource management training effectiveness: a meta-analysis and some critical needs”, International Journal of Aviation Psychology, vol. 18, no. 4, pp. 353-368.
Richardson, HW, Gordon, P & Moore, JE 2007, The economic costs and consequences of terrorism, Edward Elgar Publishing Limited, Cheltenham, UK.
Salas, E & Maurino, D 2010, Human factors in aviation, 2nd edn., Academic Press, London, UK.
Stanton, NA, Salmon, P, Jenkins, D & Walker, G 2010, Human factors in the design and evaluation of central control room operations, CRC Press, Boca Raton, FL.
Sweet, KM 2008, Aviation and airport security: terrorism and safety concerns, 2nd edn., CRC Press, Boca Raton, FL.
Transportation Safety Board of Canada 2014a, Aviation investigation report A11H0002. Web.
Transportation Safety Board of Canada 2014b, News release: TSB highlights worldwide problem with unstable approaches and calls for improved crew communications following 2011 crash in Resolute Bay, Nunavut. Web.
Wise, JA, Hopkin, VD & Garland, DJ 2009, Handbook of aviation human factors, 2nd edn., CRC Press, Boca Raton, FL.