This document was presented at the System Safety Society 1997 System Safety Society Conference where it received one of the Best Paper awards, and was published in HAZARD PREVENTION 33:4, 1997, from which this version is drawn.
By Anna K. Lekberg
The Swedish Nuclear Power Inspectorate (SKI)
Since the Three Mile Island accident in 1979, human factors issues have received increased attention in the nuclear power community, among both industry and regulators. It is the goal of the Man, Technology and Organization (MTO) program at the Swedish Nuclear Power Inspectorate (SKI) to ensure that proper consideration is given to MTO factors in the design, operation and maintenance of nuclear facilities, and that incidents and other operating experiences are reported and analyzed with this perspective in mind.
Different factors are of importance when analyzing incidents and accidents. A study performed at the University of Stockholm concluded that engineers and psychologists differ in their recommendations when it comes to improvements of the technical and human/organizational system. This indicates that it is important that engineers and psychologists cooperate when analyzing an incident or accident.
Human behavior is always involved in the cause of an accident. Advances in the design and surveillance of high-risk systems have changed the task for man. New technology and more complex systems place new and higher demands on the people working in these systems. One of the major changes is that operators have been placed further away from the process they are supposed to control; more advanced technical systems have come between the human and the actual working task. It is therefore important to consider the possibilities and limitations of humans in their interaction with the technical system.
Operators make errors, and will make new errors in trying to regain control during an unexpected system failure. Many of the root causes of an emergency are already present in the system long before the operator makes the active error. The operator's part in the accident evolution can be viewed as "adding the final garnish to a lethal brew whose ingredients have already been long in the cooking" [Reason 1990]. Human factors specialists have given much attention to improving the interface between man and the technical system. This is undeniably important, but it addresses only a relatively small part of the total safety problem.
Today it is important to see accidents and incidents as a result of the interaction between man, technology and organization. Accident and incident analyses will be guided by several different factors: one is the purpose of the investigation; another is the method; and yet another is the perspective from which the analyst is working. The purpose of this article is to highlight these factors, how they interact and how the results vary depending on the perspective. Results from a study also show the importance of the cooperation between different experts, in this case engineers and psychologists, when analyzing accidents in order to arrive at conclusions that can be used for improving safety.
Accidents have always had a substantial impact on safety in the nuclear industry, just as in other industries such as aviation and offshore. More than in most other industries, however, the prevention of severe accidents has become a matter of survival for the nuclear industry, with regard to the confidence of investors as well as to public acceptance. The two most severe accidents occurred at Three Mile Island in 1979 and at Chernobyl in 1986. To prevent such accidents from happening in the future, it is important both for the industry and for its regulators to identify and act upon any warning signals of safety deficiencies. Such signals can come from actual incidents or from improved theoretical assessments (for example, lessons learned from the TMI accident).
The results of the analysis of the TMI accident came to be a turning point in the nuclear industry,
with substantially more attention now being devoted to human factors aspects in safety work. The 1979 Swedish Government Committee on Reactor Safety, set up after the TMI accident, recommended a substantially reinforced and more coordinated program on human factors, both with regard to formal regulatory activities and R&D. Their recommendations were accepted by the Swedish government and parliament, and formed the basis for the development of SKI activities in the human factors area in the 1980s.
Operating experience covers the follow-up of reportable events, trends based on these events, incident analysis, and the periodic safety reviews covering experiences in a 10-year perspective. It was recognized early that, in order to use information from operating experience with regard to MTO aspects, it was necessary to improve the methods used in reporting incidents. Improved forms to report events, or Licensee Event Reports (LERs), were a first step in this direction. Also, a special method for the analysis of incidents has been developed and tested among various specialists. The method, called AEB (Accident Evolution and Barrier function analysis), focuses on the interaction between the human/organizational and the technical systems and assists in identifying the barriers that could have arrested the sequence of events. According to the AEB model, an accident evolution consists of a series of failures and related events. However, this chain of failures and errors may be broken at several points by barrier systems able to stop the unwanted evolution towards an accident. AEB analysis of an incident or an accident consists of two main phases:
1) the modeling of the accident evolution as an interaction between the human/organizational systems and the technical system, and 2) a barrier function analysis.
The barrier function analysis can be divided into two steps: 1) identifying broken barriers, and 2) strengthening broken barriers and suggesting implementations of new ones. In this way the AEB model helps direct the focus onto opportunities for improvement. The framework provided by the model may be used in predictive safety analyses as well as in post-hoc incident analyses. Application of the model will indicate the weaknesses and where safety can be improved.
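The two phases above can be illustrated with a minimal data-model sketch. This is an assumption-laden illustration only: the class and function names are invented for this sketch and do not reflect any actual SKI or AEB tooling.

```python
from dataclasses import dataclass

# Phase 1: model the accident evolution as a chain of failures,
# each belonging to either the human/organizational system or the
# technical system (AEB models their interaction).
@dataclass
class Failure:
    system: str                 # "human/organizational" or "technical"
    description: str
    barrier_held: bool = False  # True if a barrier function arrested it

def broken_barriers(evolution):
    """Phase 2, step 1: identify the points where a barrier function
    could have stopped the evolution but did not."""
    return [f for f in evolution if not f.barrier_held]

# A three-event chain, invented purely for illustration.
chain = [
    Failure("technical", "ambiguous alarm presentation"),
    Failure("human/organizational", "operator misreads system state"),
    Failure("technical", "no interlock blocks the unsafe action"),
]

# Phase 2, step 2: each broken barrier is an opportunity for improvement.
for f in broken_barriers(chain):
    print(f"Strengthen or add a barrier at: {f.description} ({f.system})")
```

The point of the sketch is that every unbroken link in the chain is a candidate site for a new or strengthened barrier, which is exactly how the AEB model directs attention toward improvement rather than blame.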
System design has to be based on models that are able to predict the effects of technical failure and human failure during operation, and to evaluate the ability of the organization to deal with these problems. When a system fails, it is not easy to trace a root cause. Available accident data show that an accident is rarely simple, and that it is seldom the result of a single root cause but rather of several factors contributing to system failure.
When an accident or incident is analyzed, one follows a chain of events to understand what happened and why. In other words, the analyst is trying to describe a specific event scenario to identify the cause of a specific incident or accident. It is important to consider the implicit frame of reference when analyzing an accident: the nature of the causal explanation shapes the accident analysis. Normally the dynamic process ends when a sequence is found which is familiar to the person doing the analysis. In the continued analysis process, the analyst's frame of reference will be taken for granted. In general, only the abnormal will be included in the analysis. The less familiar the context, the more detailed the decomposed connections between causes will be.
It is possible to prevent accidents and incidents through lessons learned. But, fortunately, there are few major accidents. It is therefore important to prevent accidents using information from both incidents and near misses.
Figure 2 - Two different perspectives on classifying human error [Svenson]
Accident analysis has even played an important role in introducing and developing new technologies.
It is of vital importance to decide the purpose of the analysis: whether the purpose is to explain the course of events, to assign responsibility, or to identify possible system improvements to avoid accidents and incidents in the future.
The method chosen can be seen as a tool that gives the analyst structure and a focus on the task. Another important consideration when using a method is being aware of the method's possibilities and limitations. If the incident or accident is a question of the interaction between the technical system and the person working with that system, it is important that the method be able to mirror that interaction. Even when a standard method is used, the result will vary depending on the background of the analyst.
The results from accident and incident analyses can also provide important information that can be of value in developing new methods for accident and incident analyses.
Accidents and incidents in technical systems can be seen from different perspectives. One is the perspective of the operator; another is the perspective of the designer of the system; and yet another is the perspective of the analyst. An analysis from the perspective of the operator involves different types of failures that humans can cause. Analysis from the designer's perspective involves those failures that are built into the technical system. An analysis from the perspective of the analyst means studying how the analyst carries out an accident or incident analysis. Svenson (1992) classified human failure from a technical system perspective and a human factors perspective.
The technical classification seeks to identify output patterns which are associated with different kinds of technical system input errors that have an impact on the technical system. Examples of errors from this perspective are errors of omission and errors of commission. The human factors perspective seeks to identify input patterns to the human system which are associated with different kinds of human factors output errors back to the technical system. Examples of errors from this perspective build on different types of psychological processes and regard errors as failures in executing an action. The same set of errors is seen as input errors in the first perspective and as output errors in the second.
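The dual classification can be made concrete with a toy catalog that labels the same event from both perspectives. The error descriptions and category names below are invented for this sketch (the slip/lapse terminology follows common human-error taxonomies, not necessarily Svenson's notation).

```python
# The same operator error, classified from two perspectives.
# Technical perspective: the error as *input* to the technical system,
# categorized by its effect (omission vs. commission).
# Human factors perspective: the same error as *output* of the human
# system, categorized by the psychological process that failed.
ERROR_CATALOG = {
    "valve left closed after maintenance": {
        "technical": "error of omission",   # a required input never arrived
        "human_factors": "lapse (memory failure in a routine task)",
    },
    "wrong pump started": {
        "technical": "error of commission", # an incorrect input arrived
        "human_factors": "slip (right intention, wrong execution)",
    },
}

def classify(error: str, perspective: str) -> str:
    """Return the label for one error under one perspective."""
    return ERROR_CATALOG[error][perspective]

# The two perspectives attach different labels to the same event.
print(classify("wrong pump started", "technical"))
print(classify("wrong pump started", "human_factors"))
```

The design point is that neither label is "the" cause: each perspective picks out a different place to intervene, which is why the article argues for combining them.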
The different perspectives can be illustrated like this: an engineer finds that a nuclear operator made an error and stops the analysis at the operator level. A human factors analyst, however, goes further back into the technical system to find out why that failure occurred, and may end with an ambiguous technical system state or deficiencies in the organizational system.
A study at the University of Stockholm examined two important hypotheses: 1) that the breakdown of the cause/event flow will normally terminate when a sequence is found including events that match the prototypes familiar to the analyst [Rasmussen 1990]; and 2) that an engineer and a human factors specialist will probably come to different conclusions about an incident or accident depending on background [Svenson 1992].
One part of people's frame of reference and background is education. This study tested how psychologists and engineers made an accident analysis. The main question the study sought to answer was whether psychologists attributed the causes to the technical system while engineers attributed them to the humans.
The 40 participants were instructed to describe the accident as completely as possible by using the AEB method and by finding barriers that could improve the technical and human/organizational system. The accident analyzed occurred while 12 patients with renal failure were undergoing dialysis. Three patients died and several were injured. The analysis of the accident at that time was intended to determine responsibility. The head nurse was found guilty of manslaughter and given a conditional sentence.
After finishing the analysis, the subjects were asked to fill out a questionnaire. With regard to responsibility, the psychologists thought that the manufacturer, the management and the nurse were responsible, while the engineers thought that the manufacturer, the nurse, the management and the personnel supervising the task were responsible. The psychologists did not attribute any blame to the personnel supervising the task because they thought it was more important to improve the dialysis equipment.
In conclusion, this study shows that the background of the analyst is of importance when analyzing an accident. The court case produced a very narrow distribution regarding responsibility, focusing on the head nurse at the time of the accident. This contrasts with the much wider attribution of responsibility across those using the AEB method in the study.
Depending on the purpose, the method used, the perspective of the analyst and the quality of the background information about the accident, different results will be gained. If the purpose is to find someone to blame, the analysis will end at that point with no lessons learned. The method has to consider the interaction between the technical system and the human system; otherwise the conclusions will not illustrate the causes of the incident or accident in a proper way. The perspective of the analyst, whether human factors or technical, will suggest different solutions for improvements. From the technical perspective, the most common recommendations are more training and/or more and/or better procedures to help the operator avoid errors. The human factors perspective is that the technology must be adapted to man and not vice versa, and that the organization must supply the people with proper working conditions.
Emphasis has been put on the man-machine interface, particularly regarding the interface between the operator and the control room design. A further development regarding how to ensure safety is necessary, where human performance is considered in a broader context and focused not only on instrumentation, procedures, etc., but also on cooperation and communication within working groups and on organizational factors. In this connection it should be noted that the quality of the analysis will of course also depend on the data collected.
Incidents and accidents will continue to happen. It is of great importance to analyze them thoroughly. Just to end the analysis with operator error is not good enough. When performing an analysis you have to have a clear purpose, and you have to use a method that takes into account both the human/organizational and the technical systems. Otherwise, you will end up with an incomplete analysis, which does not improve the work environment for the operator working in complex high-risk systems such as nuclear power plants.
Important steps taken after the TMI accident paved the way for the implementation of a systematic program to improve safety in terms of the interaction between man, technology and organization. In the development of the MTO program at SKI, many activities have involved the industry and required a commitment of time and effort on its part. One of these activities is that SKI requires the nuclear power facilities to use human factors specialists. These specialists can be either employed by the organization or available as contractors. They take part in incident analyses in cooperation with engineers and, through integrated safety analyses, come up with recommendations for improvements.
Through these activities a better understanding and acceptance of these issues has emerged. Whereas the industry was skeptical at the initiation of the MTO program, there is now a clear understanding of the importance of the MTO approach and of the fact that human factors specialists can supply important knowledge for the improvement of safety.
References
Dahlgren, K. The Swedish regulatory approach to human factors. Swedish Nuclear Power Inspectorate (SKI), Department of Human Factors, 1995.
Hoegberg, L. "Special safety reviews following accidents and major incidents at nuclear power plants." Swedish Nuclear Power Inspectorate (SKI). Presentation at the International Symposium on Reviewing the Safety of Existing Nuclear Power Plants, Vienna, Austria, October 8-11, 1996.
Lekberg, A. "Accident analysis using the AEB-model, Accident Evolution and Barrier Function analysis: a comparative study of accident analyses performed by psychology and engineering students." Paper presented at the University of Stockholm (in Swedish), 1996.
Rasmussen, J. "Human error and the problem of causality in analysis of accidents." Phil. Trans. R. Soc. Lond. B 327, 449-462, 1990.
Reason, J. Human Error. Cambridge University Press, Cambridge, UK, 1990.
Svenson, O. Safety Barrier Function Analysis in AEB Modeling of Incidents and Accidents: The Human Factors Perspective. PSA Reports to SKI No. 20, University of Stockholm, 1992.
Svenson, O., A. Lekberg and A. Johansson. "Perspective, Expertise and Differences in Accident Analyses: Arguments for a multidisciplinary integrated approach." In press, 1996.
About the Author
Anna Lekberg is employed at the Swedish Nuclear Power Inspectorate where she works in the Department of Man, Technology and Organization. She is responsible for the competence and training program and the decommissioning program within the Division of Reactor Safety. Her professional background is in the behavioral sciences.