Human error is classified in a number of different ways. In my experience, however, the classification developed by Prof. James Reason is the most suitable for assessing, evaluating and further investigating root causes. After extensive research in high-risk work situations spanning the medical, nuclear, road, shipping and air transportation industries (and many more), Prof. Reason developed what is today known as "The Reason Model of Accident Causation".

According to this model, an organisation has a fair degree of control over the working conditions and the work environment provided to an employee. Employees (who are all human) work within this defined environment; they face hazards, encounter risks and deal with them to accomplish the tasks the organisation requires of them. In performing their duties within the framework the organisation defines, they make errors and commit violations that can result in accidents or incidents.

Any organisation has three defences against these errors and violations: Technology, Training and Regulation. These defences work when, and if, applied in this sequence (we will discuss this in more detail in a later post, but for the present we will just leave this statement here). However, poorly defined or poorly managed organisational processes can create latent conditions that bypass all the defences and allow the accident causation chain of events to run its course. The diagram below gives a brief depiction of this.
Top of the food chain: The Organisational Processes
Since this sits at the top of the human error food chain, we will discuss it in more detail at the end. Let's now move a bit lower in search of how this links to our errors.
A step lower in the food chain of human error causation is the role of supervision in any high-risk operation. I assume readers of this blog already have a basic understanding of what supervision is and why it is necessary. If you do not, please feel free to say so in the comments below and I will explain it for you. Basically, there are four levels of supervisory failure, as depicted in the diagram below.
The next level: Supervisory Lapses
Again, this is still pretty high up in the accident food chain, so we will discuss it in more detail in a later post. For the present, let's move a little lower. There are elements in play even beyond supervisory failures, or, let's say, despite or in the absence of supervisory failure.
Accidents often happen even when perfectly good supervision was available; the factors then in play are depicted in the diagram below.
The third level: Preconditions for Unsafe Acts.
So, as we move lower in the accident causation food chain, we are getting close to the human. Uncomfortably close? Let's leave that discussion, too, for a later post, because there is a fourth and final level, which is our target for today's post: the lowest level in the accident causation food chain, and the level at which, unfortunately, most investigations tend to stop! This is the level of the actual error or violation. There are many different kinds of errors and violations, as you will see from the diagram below. The unsafe acts of humans can be classified as errors or as violations.
So, what is the difference? Errors represent the mental or physical activities of individuals that fail to achieve their intended outcome. Not surprisingly, given that human beings by their very nature make errors, these unsafe acts dominate most accident databases. Violations, on the other hand, refer to the wilful disregard of the rules and regulations that govern the safety of flight. The bane of many organisations, the prediction and prevention of these appalling and purely "preventable" unsafe acts continue to elude managers and researchers alike.
Bottom of the Accident Food Chain: Errors and Violations.
Errors
Skill-based errors. Skill-based behaviour within the context of aviation is best described as “stick-and-rudder” and other basic flight skills that occur without significant conscious thought. As a result, these skill-based actions are particularly vulnerable to failures of attention and/or memory.
In fact, attention failures have been linked to many skill-based errors, such as the breakdown of visual scan patterns, task fixation, the inadvertent activation of controls, and the mis-ordering of steps in a procedure, among others. A classic example is an aircraft crew that becomes so fixated on troubleshooting a burned-out warning light that they do not notice their fatal descent into the terrain. Perhaps a bit closer to home, consider the hapless soul who locks himself out of the car or misses his exit because he was distracted, in a hurry, or daydreaming. These are both examples of attention failures that commonly occur during highly automatized behaviour. Unfortunately, while at home or driving around town these attention/memory failures may be merely frustrating, in the air they can become catastrophic.
In contrast to attention failures, memory failures often appear as omitted items in a checklist, place losing, or forgotten intentions. For example, most of us have experienced going to the refrigerator only to forget what we went for. Likewise, it is not difficult to imagine that when under stress during in-flight emergencies, critical steps in emergency procedures can be missed. However, even when not particularly stressed, individuals have forgotten to set the flaps on approach or lower the landing gear – at a minimum, an embarrassing gaffe.
The third, and final, type of skill-based error identified in many accident investigations involves technique errors. Regardless of one's training, experience, and educational background, the manner in which one carries out a specific sequence of events may vary greatly. That is, two pilots with identical training, flight grades, and experience may differ significantly in the manner in which they manoeuvre their aircraft. While one pilot may fly smoothly with the grace of a soaring eagle, another may fly with the darting, rough transitions of a sparrow. Nevertheless, while both may be safe and equally adept at flying, the techniques they employ could set them up for specific failure modes. In fact, such techniques are as much a function of innate ability and aptitude as an overt expression of one's personality, making efforts at the prevention and mitigation of technique errors difficult, at best.
Decision errors. The second error form, decision errors, represents intentional behaviour that proceeds as intended, yet the plan proves inadequate or inappropriate for the situation. Often referred to as "honest mistakes," these unsafe acts represent the actions or inactions of individuals whose "hearts are in the right place," but who either did not have the appropriate knowledge or simply chose poorly.
Perhaps the most heavily investigated of all error forms, decision errors can be grouped into three general categories: procedural errors, poor choices, and problem-solving errors. Procedural decision errors (Orasanu, 1993), or rule-based mistakes, as described by Rasmussen (1982), occur during highly structured tasks of the sort: if X, then do Y. Aviation, particularly within the military and commercial sectors, is by its very nature highly structured, and consequently much of pilot decision making is procedural. There are very explicit procedures to be performed at virtually all phases of flight.
Still, errors can, and often do, occur when a situation is either not recognized or misdiagnosed, and the wrong procedure is applied. This is particularly true when pilots are placed in highly time-critical emergencies like an engine malfunction on takeoff.
However, even in aviation, not all situations have corresponding procedures to deal with them. Many situations therefore require a choice to be made among multiple response options. Consider the pilot flying home after a long week away from the family who unexpectedly confronts a line of thunderstorms directly in his path. He can choose to fly around the weather, divert to another field until the weather passes, or penetrate the weather hoping to transition through it quickly. Confronted with situations such as this, choice decision errors, or knowledge-based mistakes as they are otherwise known, may occur. This is particularly true when insufficient experience, lack of time, or other outside pressures preclude correct decisions. Put simply, sometimes we choose well, and sometimes we don't.
Finally, there are occasions when a problem is not well understood, and formal procedures and response options are not available. It is during these ill-defined situations that the invention of a novel solution is required. In a sense, individuals find themselves where no one has been before and, in many ways, must literally fly by the seat of their pants. Individuals placed in this situation must resort to slow and effortful reasoning processes, where time is a luxury rarely afforded. Not surprisingly, while this type of decision making is less frequent than the other forms, the relative proportion of problem-solving errors committed is markedly higher.
Perceptual errors. Not unexpectedly, when one’s perception of the world differs from reality, errors can, and often do, occur. Typically, perceptual errors occur when sensory input is degraded or “unusual,” as is the case with visual illusions and spatial disorientation or when aircrew simply misjudge the aircraft’s altitude, attitude, or airspeed. Visual illusions, for example, occur when the brain tries to “fill in the gaps” with what it feels belongs in a visually impoverished environment, like that seen at night or when flying in adverse weather. Likewise, spatial disorientation occurs when the vestibular system cannot resolve one’s orientation in space and therefore makes a “best guess” — typically when visual (horizon) cues are absent at night or when flying in adverse weather. In either event, the unsuspecting individual often is left to make a decision that is based on faulty information and the potential for committing an error is elevated. It is important to note, however, that it is not the illusion or disorientation that is classified as a perceptual error. Rather, it is the human’s erroneous response to the illusion or disorientation. For example, many unsuspecting pilots have experienced “black-hole” approaches, only to fly a perfectly good aircraft into the terrain or water. This continues to occur, even though it is well known that flying at night over dark, featureless terrain (e.g., a lake or field devoid of trees), will produce the illusion that the aircraft is actually higher than it is. As a result, pilots are taught to rely on their primary instruments, rather than the outside world, particularly during the approach phase of flight. Even so, some pilots fail to monitor their instruments when flying at night. Tragically, these aircrew and others who have been fooled by illusions and other disorientating flight regimes may end up involved in a fatal aircraft accident.
The table below sums up the above discussion by providing some examples. It also includes, against each type of error, the classification that you will see in SAFA's safety reports and in the company's safety database.
Some examples of errors and violations, with the abbreviations used for them in SAFA's safety database.
Violations.
By definition, errors occur within the rules and regulations espoused by an organisation, and they typically dominate most accident databases. In contrast, violations represent a wilful disregard for the rules and regulations that govern safe flight and, fortunately, occur much less frequently, since they often involve fatalities (Shappell et al., 1999).
While there are many ways to distinguish between types of violations, two distinct forms, identified by their etiology, will help the safety professional when identifying accident causal factors. The first, routine violations, tend to be habitual by nature and are often tolerated by the governing authority (Reason, 1990). Consider, for example, the individual who consistently drives 5-10 km/h faster than the law allows, or someone who routinely flies in marginal weather when authorised for visual meteorological conditions only. While both certainly act against the governing regulations, many others do the same thing. Furthermore, individuals who drive 60 km/h in a 50 km/h zone almost always drive 60 in a 50 km/h zone. That is, they "routinely" violate the speed limit. The same can typically be said of the pilot who routinely flies into marginal weather.
What makes matters worse is that these violations (commonly referred to as "bending" the rules) are often tolerated and, in effect, sanctioned by supervisory authority (i.e., you're not likely to get a traffic citation until you exceed the posted speed limit by more than 10 km/h). If, however, the local authorities started handing out traffic citations for exceeding the speed limit on the highway by 10 km/h or less (as is often done on military installations), it is less likely that individuals would violate the rules. Therefore, by definition, if a routine violation is identified, one must look further up the supervisory chain to identify those in authority who are not enforcing the rules.
On the other hand, unlike routine violations, exceptional violations appear as isolated departures from authority, neither indicative of an individual's typical behaviour pattern nor condoned by management (Reason, 1990). For example, an isolated instance of driving 105 km/h in a 50 km/h zone is an exceptional violation. Likewise, flying under a bridge or engaging in other prohibited manoeuvres, like low-level canyon running, would constitute an exceptional violation. It is important to note, however, that while most exceptional violations are appalling, they are not considered "exceptional" because of their extreme nature.

Rather, they are considered exceptional because they are neither typical of the individual nor condoned by authority. Still, what makes exceptional violations particularly difficult for any organisation to deal with is that they are not indicative of an individual's behavioural repertoire and, as such, are particularly difficult to predict. In fact, when individuals are confronted with evidence of their dreadful behaviour and asked to explain it, they are often left with little explanation. Indeed, those who survived such excursions from the norm clearly knew that, if caught, dire consequences would follow. Still, defying all logic, many otherwise model citizens have been down this potentially tragic road.
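For readers who like to see the structure laid out explicitly, the unsafe-acts taxonomy we have walked through above can be sketched as a small lookup. This is purely my illustration: the labels are mine, and they are not SAFA's actual database codes.

```python
# A minimal sketch of the unsafe-acts taxonomy discussed above.
# The labels are illustrative only and are NOT SAFA's database codes.

TAXONOMY = {
    "error": {
        "skill-based": ["attention failure", "memory failure", "technique error"],
        "decision": ["procedural (rule-based)", "choice (knowledge-based)",
                     "problem-solving"],
        "perceptual": ["erroneous response to an illusion or disorientation"],
    },
    "violation": {
        "routine": "habitual by nature, often tolerated by the governing authority",
        "exceptional": "isolated departure, neither typical nor condoned",
    },
}

def act_type(subtype: str) -> str:
    """Return 'error' or 'violation' for a given unsafe-act subtype."""
    for act, subtypes in TAXONOMY.items():
        if subtype in subtypes:
            return act
    raise KeyError(f"unknown unsafe-act subtype: {subtype}")

print(act_type("skill-based"))   # error
print(act_type("exceptional"))   # violation
```

The point of the sketch is simply that classification happens at two levels: first error versus violation, then the subtype; an investigation that stops at "human error" has not even reached the first level.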
This, ladies and gentlemen, is all for today. In the next post, we shall move one level up the food chain and discuss the preconditions for unsafe acts.
Before you think you have learned all there is to learn about human error, remember: we are at the bottom of the food chain that feeds accidents. Our job is not done till we reach the lion of errors, the Organisation. Our path is long and will be negotiated through some case studies and some discussions like the one above. But for those who stay with me, I assure you, your visits to this blog will be worth your while!
I will look forward to your comments and feedback on this post before we proceed to the next level in our re-discovery of Human Error.
If you have not yet subscribed to this blog, kindly click on the "subscribe" link below to get email updates whenever a new post is added to the blog.
Until next week,
The Erring Human