- It is much easier to pin the legal responsibility for an accident upon the errors and violations of those in direct control of the aircraft or vessel at the time of the accident. The connection between these individuals and the disastrous consequences is far easier to prove than any possible links between earlier management decisions and the accident.
- This is further compounded by the willingness of professionals such as aircraft Captains and ships' Masters to accept this responsibility. They are accorded substantial authority, power and prestige and, in return, they are expected to "carry the can" when things go wrong. The buck and the blame traditionally stop at them.
- Most people place a large value on personal autonomy, or a sense of free will. We also impute this to others, so that when we learn that someone has committed an error with bad consequences, we assume that this individual actually chose an error-prone rather than a 'sensible' course of action. In other words, we tend to perceive the errors of others as having an intentional element, particularly when their training and status suggest that 'they should have known better'. Such voluntary actions attract blame and recrimination, which in turn are felt to deserve various sanctions.
- Our judgements of human actions are subject to similarity bias. We have a natural tendency to assume that disastrous outcomes are caused by equally monumental blunders. In reality, of course, the magnitude of the disaster is determined more by situational factors than by the extent of the errors. Many monumental disasters have resulted from relatively minor failings in different parts of the system (e.g. the Tenerife runway disaster).
- Finally, it cannot be denied that there is a great deal of emotional satisfaction to be gained from having someone (rather than something) to blame when things go badly wrong. Few of us are able to resist the pleasure of venting our psychic spleens on some convenient scapegoat. And in the case of Organizations, of course, there is considerable financial advantage in being able to detach individual fallibility from corporate responsibility.
- Do we want revenge? Maybe, but how much are we going to get out of the Pilot's body - or bank account? Before we let the Pilot fly our aircraft, we should have covered ourselves with insurance. We might wish that we had either trained the pilot better or withheld some clearances. But it's a little late for that now. Our revenge, if any, is going to come from the insurance company. Punishing the pilot isn't going to help.
- Do we need to protect society from this Pilot? Probably not. As long as we are talking about errors of judgement or technique and not willful violation of flying regulations, society is not in much danger. Maybe society needs protection from a system that will not prevent an individual from eventually achieving PIC status based entirely on longevity or time and space.
- Do we need to change the behavior of this Pilot? If we are still talking of errors of judgement or technique, it is safe to say that the Pilot had absolutely no intention of having that accident in the first place. Now that it has occurred, the Pilot has even less intention of having it again. Applying punishment won't improve that.
- Do we need to make an example of this Pilot to others? This is an interesting question and can be argued a number of different ways. My view is that it depends entirely on whether the other Pilots were planning on doing the same thing our prospective recipient of the punishment did. If the accident was genuinely the result of mistakes and misjudgement, punishment probably has no effect on others. They were not planning on making those mistakes in the first place, and seeing another Pilot punished won't change their minds. On a negative note, the punishment could influence them to avoid the circumstances where there might be an opportunity to make the same mistake. Maybe that's what we want. Be careful though. That's how we develop students who have never been taught the hard things, instructors who won't get off the controls, and pilots who can't operate their aircraft in the corners of the Flight Envelope. On the other hand, let's suppose that our other pilots were planning on doing the same thing that got this pilot in trouble. Here, punishment can be very effective - provided it is applied to the act and not the result. By this I mean that we always punish the pilots who disregard our rules, regardless of whether it results in an accident or not. This is an effective way to manage all our Pilots. If we wait for an accident to occur and then apply punishment, we are behaving inconsistently and lose any benefit that punishment might have on the rest of our Pilots. They realize that we are willing to tolerate their misbehavior as long as they don't have an accident.
- Why should we stop at the Organizational roots? In a deterministic world, everything has a prior cause. In theory, therefore, we could go back to the Big Bang! Seen from this broader historical perspective, an analytical stop-rule located at the organizational root causes is just as arbitrary as one located close to the proximal individual failures.
- The scientific logic to apply here is that, in seeking the reasons for an accident, we should go far enough back to identify causal factors that, if corrected, would enhance the system's resistance to subsequent challenges. The people most concerned and best equipped to do this are within the organization(s) involved, so it makes sense to stop at these organizational boundaries. However, these boundaries may often be indistinct, particularly in Aviation where there are a large number of inter-related sub-systems involved.
- Perhaps the most serious scientific problem has to do with the particular nature of accidents and how they change our perceptions of preceding events. In retrospect, an accident appears to be a point of convergence of a number of causal chains. Looking back down these lines of causation, our perceptions are colored by the certain knowledge that they caused a bad outcome. But if we were to freeze any system in time, without an accident having occurred, we would see very similar imperfections, latent failures and technical problems. No system is ever perfect. The only thing that gives these same kinds of systemic weaknesses causal significance is that in a few intensely investigated events they were implicated in the accident sequence. If all that distinguishes these latent factors is the subsequent occurrence of a bad outcome, should we not limit our attention only to those proximal events that transformed such commonplace shortcomings into an accident sequence? In other words, should we not run with the Moral and Legal tide and simply concentrate on those individual failures having an immediate impact upon the integrity of the system?
The answer here depends crucially upon two factors: first, whether or not latent organizational and managerial factors can be identified and corrected BEFORE an accident occurs, and second, the degree to which these interventions can improve the system's natural resistance to local accident-producing factors.
A recent survey of the human factors literature revealed that the estimated involvement of human error in the breakdown of hazardous technologies had increased fourfold between the 1960s and the 1990s, from a minimum of around 20% to a maximum of more than 80%. During this period it also became apparent that these contributory errors were not restricted to the 'sharp end' - to the Captains, Masters, ships' officers, control room operators, pilots, drivers etc. in direct control of the operation. Nor can we only take account of those human failures that were the proximal causes of the accident. Major accident inquiries (like Three Mile Island, Challenger, King's Cross, etc.) indicate that the human causes of major accidents are distributed very widely, both within the organization as a whole and often over several years prior to the actual event.
The only way to proceed in such a scenario is to ask: what do all of these complex, well defended technologies have in common? The answer is: organizational processes and their associated cultures; a variety of different workplaces, each with its own local conditions; and defenses, barriers and safeguards designed to protect people, assets and the environment from the adverse effects of the local hazards. Each of these aspects is addressed in the Reason Model of accident causation discussed in the earlier post and reproduced below:
*[Figure: The Reason Model of Accident Causation]*
The organizational processes - decisions taken in the higher echelons of the system - seed Organizational Pathogens into the system at large. These resident pathogens take many forms: Managerial oversights, ill-defined policies, lack of foresight or awareness of risks, inadequate budgets, lack of legal control over contractors, poor design, specifications and construction, deficient maintenance management, excessive cost cutting, poor training and selection of personnel, blurred responsibilities, unsuitable tools and equipment, commercial pressures, missing or flawed defenses and the like. The adverse consequences of these pathogens are transported along two principal pathways to the various workplaces, where they act upon the defenses to create latent conditions and upon local workplace conditions to promote active failures.
Subsequently, these active and latent failures act to create an event (a complete or partial trajectory through the defensive layers). Events may arise from a complex interaction between active and latent failures, or from factors present predominantly in one or the other pathway. Both local triggering factors and random variations can assist in creating trajectories of accident opportunity.
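The trajectory idea above can be sketched as a toy simulation: each defensive layer has some chance of a 'hole' (a latent condition or active failure) being open at the moment of challenge, and an accident opportunity arises only when holes in every layer line up. All the layer names and probabilities below are illustrative assumptions, not values from the Reason Model itself:

```python
import random

def trajectory_penetrates(layer_hole_probs, rng):
    """One challenge to the system: the trajectory reaches a loss only
    if every defensive layer happens to have a 'hole' open at once."""
    return all(rng.random() < p for p in layer_hole_probs)

def accident_rate(layer_hole_probs, trials=100_000, seed=42):
    """Estimate, by simulation, how often a challenge penetrates all layers."""
    rng = random.Random(seed)
    hits = sum(trajectory_penetrates(layer_hole_probs, rng)
               for _ in range(trials))
    return hits / trials

# Hypothetical layers: design, maintenance, training, crew action.
# Each number stands in for the chance that a layer's resident
# pathogen leaves it 'open' at the moment of challenge.
layers = [0.10, 0.05, 0.08, 0.02]
print(f"accident opportunity per challenge: {accident_rate(layers):.2e}")
```

The point the sketch makes is the model's: because penetration requires an alignment across all layers, strengthening any single layer (shrinking its hole probability) reduces the opportunity for every event trajectory, which is why correcting latent conditions before an accident pays off system-wide.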
By specifying the organizational and situational factors involved in the causal pathways, it is possible to identify potentially dangerous latent failures before they combine to cause an accident. Hence we can have a measure of control over Human Errors and, indeed, over the Erring Humans that work for you.
But why do we Err? We will discuss that in the subsequent posts.
The Erring Human.