Understanding the Three Mile Island Nuclear Meltdown through the Perspective of Human Factors


The 21st century will be defined not only by technology but also by complex organisations that leverage advances in information technology, computers, and modern management techniques. This is why a corporation can have its main office in London yet do business all over the globe 24 hours a day, seven days a week.

Aside from that, the needs of a growing population, such as urban centres holding tens of millions of people, necessitate the creation and maintenance of complex and gigantic infrastructures, such as those that deliver water, electricity, and gas. The intricate systems these structures contain multiply the probability of errors.

Although systems of checks are in place, these types of facilities cannot afford malfunctions or human errors. One good example is the nuclear facility at Three Mile Island, which experienced a meltdown. The cause of the malfunction is linked to what managers now call "human factors", and a clear understanding of what the term means can help prevent future problems of this magnitude and potential consequence.

Human Factors

Before going any further, it is imperative to get a clear understanding of the impact of human factors on the organisation, technology, work, and safety aspects of a nuclear plant. An excellent backdrop for this study is the definition of the concept, as seen below:

Human factors is a multidisciplinary field that draws on the methods, data, and principles of the behavioural and social sciences, engineering, physiology, anthropometry, biomechanics, and other disciplines to design systems that are compatible with the capabilities and limitations of the people who will use them.

Its goal has been to design systems that use human capabilities in appropriate ways, that protect systems from human frailties, and that protect humans from hazards associated with operation of the system (Moray & Huey, 1988).

This definition was the by-product of assiduous work in the field of nuclear regulatory research, a research initiative prompted no doubt by mishaps and other problems encountered by those working under the U.S. nuclear program. The compact definition has to be unpacked in order to appreciate the multidisciplinary aspect of human factors research.

Firstly, supporters of human factors research agree that it is simply impractical and unwise to focus on one or a few fields when designing a safe and efficient system, especially for critical operations such as those found in aviation or nuclear facilities.

Secondly, it is not enough simply to increase the scope of study. There is a good reason why the field integrates not only the social and behavioural sciences but also links these to engineering, physiology, anthropometry, and biomechanics.

Human factors research looks at the people behind the organisation as well as the technology required to accomplish what seems to be a humanly impossible task such as the monitoring and maintenance of critical systems.

And finally, the definition suggests that no matter how far technology has advanced, there will always be a need for human expertise; the human touch is required to keep an operation running smoothly. But this has another side to it. Because humans participate in the creation, implementation, and upkeep of a complex system such as a nuclear power plant, that system is prone to error.

Therefore, the purpose of human factors research is to design systems that are well suited to the capabilities and limitations of the staff and operators handling and maintaining them (Moray & Huey, 1988). This is expounded even further by the realisation that human talents, perception, and expectations must be factored into the design, or into the training strategies that prepare those who will use the system.

Another major purpose of human factors research is to develop a system that is protected from human frailties (Moray & Huey, 1988).

It is an acknowledgment that humans can sometimes outperform machines: machines only do what they are programmed to do and cannot adjust to changing conditions. At the same time, there are limits to what the human body can do. Machines are not prone to fatigue, but human limits can be measured and factored into the design of the system.

Finally, human factors research is not limited to manufacturing high-quality products and services; it is also about protecting humans from the inherent hazards of a particular system (Moray & Huey, 1988). In the case of Three Mile Island the potential hazard is the radioactive material within the two nuclear reactors.

If radiation leaks into the atmosphere, the first people to suffer from contact with radioactive materials are the staff and operators; in the event of a meltdown, the whole community near the nuclear plant would be affected, so their safety must also be included in the overall design.

Three Mile Island: Review of the Facility

In September 1978 the people of Pennsylvania took notice of a dedication ceremony for a newly completed nuclear power plant built on an island in the Susquehanna River; named after its location, it is known hereafter as the Three Mile Island Nuclear Power Plant (Derkins, 2003). It is located a mere 15 kilometres from Harrisburg, the capital of Pennsylvania (Derkins, 2003).

Americans were used to such announcements, and it can be argued that no one had a premonition of what would follow; before the tragic event that occurred months later, a nuclear power plant was deemed safe and efficient. There was usually no drama involved. But this one was different.

The nuclear power plant comprised two units, each with its own reactor, designated as follows: TMI-1 was constructed at the northern end of the island starting in 1968, while TMI-2 was constructed just south of the first unit, with work starting in 1969 (Derkins, 2003).

The first unit, TMI-1, was completed in 1974 and immediately began to generate electricity, but the second unit, TMI-2, encountered construction delays and did not start generating electricity until 1978 (Derkins, 2003). The delays should have been a warning to the operators, but the underlying problems were not uncovered because of the absence of an effective human reliability analysis (HRA) tool.

The Control Room and Plant Equipment Design

The control room was designed to alert the operators if there were problems, but in 1979 a complex system like this one could only do so much. So, in the midst of an impending crisis, the operators heard a loud horn and the control panels began to light up like a Christmas tree.

But there was no way to differentiate the trivial from the critical; the flashing lights added to the confusion but never guided the operators to the real cause of the problem.

If only they had known that a critical component, a relief valve, had malfunctioned. This valve controls the pressure of the reactor coolant, and by staying open it reduces the amount of reactor coolant in the system.

However, the only way for the operators to know this was to walk behind a seven-foot-high instrument panel to discover that the coolant was dangerously low; it was neither convenient nor practical to keep walking back and forth from the centre of the control room to the back of the control panel, so the reading was easy to miss (Walker, 2004). The operators relied instead on a less reliable system.

A signal light was installed in the control panel: when it was lit, the relief valve was taken to be open, and when it was dark the operators assumed the relief valve was closed (Walker, 2004).

The operators were unaware of the failure of the relief valve, so they assumed that the pressure in the reactor was still high and that there was therefore enough coolant within TMI-2 (U.S. NRC, 2010). This is why the reactor generated tremendous amounts of heat that in the end melted the core.
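To make the indication failure concrete, below is a minimal Python sketch (the names, and the reduction of the instrumentation to a single lamp, are mine and are not drawn from the cited sources) contrasting an indicator wired to the commanded state of the valve with one wired to its sensed position:

    # Hypothetical sketch: why an indicator driven by the *commanded* state of a
    # relief valve can mislead operators when the valve sticks open.
    from dataclasses import dataclass

    @dataclass
    class ReliefValve:
        commanded_closed: bool = True   # what the control system asked for
        actually_closed: bool = True    # true mechanical position

        def command_close(self, stuck_open: bool = False) -> None:
            self.commanded_closed = True
            # If the valve sticks, the mechanical position does not follow the command.
            self.actually_closed = not stuck_open

    def panel_light_from_command(valve: ReliefValve) -> str:
        # Lamp reflects only the command signal, as described in the text above.
        return "DARK (operators assume valve closed)" if valve.commanded_closed else "LIT"

    def panel_light_from_position(valve: ReliefValve) -> str:
        # Alternative indication: lamp reflects the sensed valve position.
        return "DARK" if valve.actually_closed else "LIT (valve still open!)"

    valve = ReliefValve()
    valve.command_close(stuck_open=True)      # valve commanded shut but stuck open
    print(panel_light_from_command(valve))    # DARK -> operators misled
    print(panel_light_from_position(valve))   # LIT  -> would reveal the fault

Under the assumption that the lamp follows only the command signal, a stuck-open valve looks exactly like a closed one, which is the trap the operators fell into.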

Aside from the human factor and the complex operational requirements of running the nuclear plant, another major factor that has to be considered in order to understand why the Three Mile Island debacle occurred is the use of a technology called the "pressurized water reactor", or "PWR" (Walker, 2004). This type of technology was the most common among reactors built before the 1980s (Walker, 2004).

For Babcock and Wilcox, the reactor manufacturer commissioned to build TMI-1 and TMI-2, there was no other way to build a reactor than to use the PWR design (Walker, 2004).

This is significant because a critical part of the design, the pilot-operated relief valve ("PORV"), sits on top of a forty-foot-high vessel called the pressurizer, and when it malfunctioned nothing had been installed to alert the operators that it had stayed open or closed (Walker, 2004). In other words, it was the weak link in the design.

This is how the PORV was supposed to work: if the pressure in TMI-2 rose rapidly and the pressurizer was overwhelmed by the sudden build-up, the valve was designed to open, release cooling water, and thereby reduce the pressure (Walker, 2004). At first this is what happened when other parts of the system malfunctioned at about 4:00 AM on March 28, 1979 (Walker, 2004).

However, about ten seconds later, when pressure and heat levels were back to normal, the PORV was supposed to close as designed; in this case it did not close, and coolant kept escaping from the reactor (Walker, 2004). The problem with this scenario is that this was not the first time a PORV had stuck open, including at TMI-2, and yet no one took the time and effort to fix the problem or redesign the system.

It was revealed later that this type of problem was common in nuclear reactors designed by Babcock and Wilcox. In fact, what happened at Three Mile Island had occurred two years earlier in a similarly designed system at a nuclear plant named Davis-Besse (Walker, 2004).

The only difference is that in that case an operator was aware that there was a problem with the PORV, because it stayed open, and he immediately took action to close the relief valve (Walker, 2004).

In the case of TMI-2 no one knew about this problem; if they had, they would have gone straight to the PORV indication and closed the relief valve when needed. Because of the error there was no adequate cooling for the reactor, and as a result the nuclear fuel overheated and the core began to melt (U.S. NRC, 2010).

Human Error Types and Models

There are three types of errors based on psychological error mechanisms, listed as follows: 1) error of omission; 2) error of commission; and 3) psychological error (Verma, Ajit, & Karanki, 2010).

This is a good starting point for understanding human errors because it clearly differentiates errors based on what the person did while working in a nuclear facility, with the third type covering errors due to lapses in memory, attention failure, and perception failure (Verma, Ajit, & Karanki, 2010). These categories explain some of the basic errors committed in the workplace, in a factory, or even in a nuclear facility.

When designing a fail-safe system, it is also helpful for planners to be acquainted with the different types of actions based on work complexity (Verma, Ajit, & Karanki, 2010). These are widely known as:

  1. Skill-based actions;
  2. Rule-based actions; and
  3. Knowledge-based actions (Verma, Ajit, & Karanki, 2010).

If these are linked to human errors, then slips and lapses usually occur in skill-based actions, during routine, well-practised tasks that do not require conscious effort (Strauch, 2004). In addition, there are also rule-based mistakes and knowledge-based mistakes.

In the case of the former, the operator or manager applied the wrong rule. In the case of the latter, the operator or manager did not possess the correct knowledge and did not have the experience to deal with the novel situation presented to him or her (Strauch, 2004). The last category can explain what happened in the Three Mile Island accident.
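Purely as an illustration of the taxonomy above (the structure and wording below are mine, paraphrasing the discussion rather than quoting Strauch or Verma et al.), the three action levels can be paired with the error forms typically associated with them:

    # Illustrative mapping of action levels to their typical error forms,
    # paraphrasing the taxonomy discussed in the surrounding text.
    ERROR_TAXONOMY = {
        "skill-based":     "slips and lapses during routine, well-practised tasks",
        "rule-based":      "mistakes from applying the wrong rule",
        "knowledge-based": "mistakes from lacking the knowledge or experience for a novel situation",
    }

    def likely_error(action_level: str) -> str:
        # Return the typical error form for a given action level.
        return ERROR_TAXONOMY.get(action_level.lower(), "unknown action level")

    # The discussion attributes the TMI-2 misdiagnosis chiefly to the
    # knowledge-based level.
    print(likely_error("knowledge-based"))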

The Three Mile Island accident was not the result of an error of commission. There was no deliberate act that resulted in failure; in fact, the operators had no idea what was going on. They were so inundated with information and system signals that they did not know what was trivial and what was critical, and so could not be guided to the next crucial step.

Nor was the incident the result of an error of omission, because the operators were all competent engineers and workers who had a clear understanding of how the system worked. For instance, if a supervisor had told them that the relief valve was stuck and needed to be closed, they would have known immediately that the PORV should be closed to restore the correct amount of coolant in the reactor. But they were not privy to this information.

Thus, the Three Mile Island incident was the result of a psychological error, in the sense that it was a failure of perception.

The engineers and operators thought they were dealing with a loss of water, a clogged condenser, and so on, when in truth there was only one critical problem to address, after which everything could have been reset to normal. Yet they were unaware of it; the limitations of their knowledge of the Babcock and Wilcox-designed nuclear plant brought them to the edge of a major disaster that could have taken hundreds, even thousands, of lives.

Human Reliability Analysis

The HRA tool best suited to analysing the weaknesses and potential problems of a nuclear power facility is what is commonly known as the Technique for Human Error Rate Prediction (THERP), because it combines task analysis and event tree analysis and, more than that, it was originally designed for the nuclear industry (Attwood & Crowl, 2007).

The effectiveness of THERP is seen in how thoroughly it evaluates the task at hand and in how it helps develop an event tree analysis that maps out, for the managers and operators of a nuclear facility, what will happen if a particular step is omitted or a particular function fails.

An event tree analysis, which also serves as a visual aid in the mastery of complex systems, provides an overview of the nuclear facility and its various operations.

This simply means that THERP will predict what will happen in the event that a particular action is committed or omitted, or if a fail-safe system does not function as designed. Other HRA tools are also effective, but some are reactive in the sense that they make recommendations based on what has happened in the past.
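As a rough sketch of the kind of calculation an event tree supports (the step names and human error probabilities below are invented for illustration and are not THERP's published values), each branch carries a probability of operator failure, and multiplying along a path ranks the failure sequences:

    # Toy event tree: enumerate success/failure paths through three safeguards,
    # multiplying branch probabilities to obtain each path's probability.
    from itertools import product

    # Hypothetical human error probabilities (HEPs) for three sequential steps
    # taken after an initiating event.
    steps = {
        "notice alarm":         0.01,   # probability the operator fails this step
        "diagnose stuck valve": 0.10,
        "close block valve":    0.05,
    }

    initiating_event_prob = 1.0  # condition on the initiating event having occurred

    for outcomes in product([True, False], repeat=len(steps)):
        p = initiating_event_prob
        for (_name, hep), failed in zip(steps.items(), outcomes):
            p *= hep if failed else (1.0 - hep)
        label = ", ".join(f"{name}: {'FAIL' if failed else 'ok'}"
                          for (name, _), failed in zip(steps.items(), outcomes))
        print(f"{p:.5f}  {label}")

    # The all-failure path (0.01 * 0.10 * 0.05 = 0.00005 in this toy example)
    # is the sequence an analyst would flag for redesign or extra safeguards.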

For example, one HRA approach relies on the rule of thumb that for every ten near misses there will be an accident (Hughes & Ferret, 2005). This gives the impression that managers simply have to look at patterns and recognise the problem.

This may work in other fields, but it should not be recommended for a nuclear facility. In a nuclear power plant something can go wrong without precedent, suddenly and catastrophically, leaving no time to evaluate and then make recommendations. The risk is too high to be entrusted to an HRA tool that is not as thorough as THERP.

Human Factors Integration

Human factors integration (HFI) is an important step because it is the systematic process for identifying, tracking, and resolving human factors issues so that planners and system engineers can create a system that balances human talents and limitations with technology (Strauch, 2004). In the case of the Three Mile Island incident, a correctly executed HFI would have prevented the problem.

An event tree analysis as well as a task analysis would have been completed, so that even when confronted by a novel problem the operators would still have been able to solve it rather than be overwhelmed by signals and information that did not make sense to them. The designers would also have produced a control room that allowed the operators to see every instrument, with every signal that needed their attention displayed prominently.

The HFI Plan

Babcock and Wilcox, the firm contracted to build the Three Mile Island nuclear power plant, should have conducted a human reliability analysis, and the tool they should have adopted is THERP. Had this been accomplished, they would have produced a task analysis and, at the same time, an event tree analysis.

They were able to train the support group, the operators, and the engineers on what to do in case of major failures, but there was no system in place to deal with novel problems. The reason for that was simple: no HRA had been conducted that could have predicted human error and other forms of mechanical failure.

Using an event tree analysis, the operators would have known what would happen every step of the way. They would not only have been aware of the major components of a complex system but would also have been made familiar with the sequence of events and with how the failure of one component leads to a particular effect.

The use of THERP would have exposed the weaknesses of the system. In the case of the Three Mile Island accident the weak link was the design of the PORV signal. The signal was basic: a light that was on signified that the valve was open, and when the light was off the operators took it to mean that the valve was closed.

The engineers who designed the system did not consider the possibility that the signal light might fail. Nor did they consider the possibility that the operators would simply assume that the valve was closed based on the absence of light from the instrument monitoring the PORV.

Those who designed the system did not consider what would happen in the event of multiple malfunctions and what that would look like inside the control room. The control room was supposed to be the eyes and ears of the engineers with regard to what was happening within the nuclear reactor.

Yet it was poorly designed. A good HRA tool determines not only how effectively the control panels convey data but also whether they transmit data in a way that helps the operators deal with the most critical issue. The most critical issue at that time was not the problem with the heaters or the mechanical failure of some components of TMI-2; it was the lack of coolant inside a super-heated reactor.

Conclusion

The Three Mile Island incident could easily have been avoided if an effective HRA tool had been used before construction of the nuclear plant was given the go-ahead. By using an HRA tool such as THERP, the engineers and operators would have known that a very simple valve could play a crucial role in a meltdown of the reactor.

This means that the signal coming from this valve should have been displayed prominently in the control room. Aside from that, there should have been a more sophisticated means of alerting the operators that the valve had malfunctioned. They should not have been made to rely on a simple mechanism such as a light signal driven by an obscure device situated behind the control panels.

THERP would have given them a heads-up on unprecedented events. It would have predicted the probability of human error and mechanical failure so that systems could be created to deal with potential problems.

For instance, the signal connected to the PORV should not have been a simple light bulb; it should have been a gauge providing an accurate reading, displayed prominently in the control room. Thus, a quick glance at the controls would immediately have alerted the operators that coolant was leaking out of the reactor.
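A hedged sketch of what such an indication might look like (the sensor names, thresholds, and alarm wording below are hypothetical, not a description of any actual plant retrofit): a reading taken from the relief path itself, with an explicit alarm, rather than a lamp that mirrors the command signal.

    # Hypothetical alarm logic for a gauge-based PORV indication: warn whenever
    # the relief path appears to be passing coolant, regardless of the command.
    def porv_alarm(tailpipe_temp_c: float, valve_position_pct: float,
                   temp_limit_c: float = 90.0, position_limit_pct: float = 5.0) -> str:
        # Raise a prominent alarm if either the sensed valve position or the
        # relief-line temperature indicates coolant is escaping.
        if valve_position_pct > position_limit_pct or tailpipe_temp_c > temp_limit_c:
            return "ALARM: PORV appears open, coolant escaping via relief path"
        return "normal"

    # Even with the valve position sensor reading closed, a hot relief line
    # would still trip the alarm and prompt the operators to investigate.
    print(porv_alarm(tailpipe_temp_c=140.0, valve_position_pct=0.0))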

References

Attwood, D. & D. Crowl. (2007). Human Factors Methods for Improving Performance in the Process Industries. New Jersey: John Wiley & Sons, Inc.

Dekker, S. (2005). Ten Questions About Human Error: A New View of Human Factors and System Safety. New York: Routledge.

Derkins, S. (2003). The Meltdown at Three Mile Island. New York: Rosen Publishing.

Hughes, P. & E. Ferret. (2005). Introduction to Health and Safety at Work. Oxford: Butterworth-Heinemann.

Moray, N. & B. Huey. (1988). Human Factors Research and Nuclear Safety. Washington, D.C.: National Academies Press.

U.S. NRC. (2010). "Backgrounder on the Three Mile Island Accident." Web.

Strauch, B. (2004). Investigating Human Error: Incidents, Accidents, and Complex Systems. UK: Ashgate Publishing.

Verma, A., S. Ajit, & D. Karanki. (2010). Reliability and Safety Engineering. London: Springer.

Walker, S. (2004). Three Mile Island: A Nuclear Crisis in Historical Perspective. CA: University of California Press.
