Friday, December 30, 2011

Preconditions for Unsafe Acts

Let us now climb higher up the accident causation "food chain" to the Preconditions for Unsafe Acts - the conditions that make any human more prone to error.

Preconditions for Unsafe Acts. Arguably, the unsafe acts of pilots can be directly linked to nearly 80% of all aviation accidents. However, simply focusing on unsafe acts is like focusing on a fever without understanding the underlying disease causing it. Thus, investigators must dig deeper into why the unsafe acts took place. As a first step, two major subdivisions of unsafe aircrew conditions were developed: substandard conditions of operators and the substandard practices they commit.

 
Substandard Conditions of Operators

- Adverse Mental States: Channelised attention; Complacency; Distraction; Mental fatigue; Get-home-itis; Haste; Loss of situational awareness; Misplaced motivation; Task saturation
- Adverse Physiological States: Impaired physiological state; Medical illness; Physiological incapacitation; Physical fatigue
- Physical/Mental Limitations: Insufficient reaction time; Visual limitation; Incompatible intelligence/aptitude; Incompatible physical capability

Substandard Practices of Operators

- Crew Resource Mismanagement: Failed to back up; Failed to coordinate/communicate; Failed to conduct adequate brief; Failed to use all available resources; Failure of leadership; Misinterpretation of traffic calls
- Personal Readiness: Excessive physical training; Self-medication; Violation of crew rest requirements; Violation of bottle-to-throttle requirements
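For readers who like to see structure as code, here is a minimal sketch of the taxonomy above as a nested data structure. The category names come straight from the table; the Python layout, the classify helper, and the example lookup are purely illustrative and not part of any official HFACS tooling.

```python
# A minimal sketch of the "Preconditions for Unsafe Acts" taxonomy.
# Category names follow the table above; everything else is illustrative.

PRECONDITIONS = {
    "Substandard Conditions of Operators": {
        "Adverse Mental States": [
            "Channelised attention", "Complacency", "Distraction",
            "Mental fatigue", "Get-home-itis", "Haste",
            "Loss of situational awareness", "Misplaced motivation",
            "Task saturation",
        ],
        "Adverse Physiological States": [
            "Impaired physiological state", "Medical illness",
            "Physiological incapacitation", "Physical fatigue",
        ],
        "Physical/Mental Limitations": [
            "Insufficient reaction time", "Visual limitation",
            "Incompatible intelligence/aptitude",
            "Incompatible physical capability",
        ],
    },
    "Substandard Practices of Operators": {
        "Crew Resource Mismanagement": [
            "Failed to back up", "Failed to coordinate/communicate",
            "Failed to conduct adequate brief",
            "Failed to use all available resources",
            "Failure of leadership", "Misinterpretation of traffic calls",
        ],
        "Personal Readiness": [
            "Excessive physical training", "Self-medication",
            "Violation of crew rest requirements",
            "Violation of bottle-to-throttle requirements",
        ],
    },
}

def classify(example: str):
    """Return (division, category) for a given precondition example."""
    for division, categories in PRECONDITIONS.items():
        for category, examples in categories.items():
            if example in examples:
                return division, category
    return None

print(classify("Get-home-itis"))
# -> ('Substandard Conditions of Operators', 'Adverse Mental States')
```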





Substandard Conditions of Operators

Adverse mental states. Being prepared mentally is critical in nearly every endeavour, but perhaps even more so in aviation. As such, the category of Adverse Mental States was created to account for those mental conditions that affect performance. Principal among these are the loss of situational awareness, task fixation, distraction, and mental fatigue due to sleep loss or other stressors. Also included in this category are personality traits and pernicious attitudes such as overconfidence, complacency, and misplaced motivation.

Predictably, if an individual is mentally tired for whatever reason, the likelihood increases that an error will occur. In a similar fashion, overconfidence and other pernicious attitudes such as arrogance and impulsivity will influence the likelihood that a violation will be committed. Clearly then, any framework of human error must account for pre-existing adverse mental states in the causal chain of events.

Adverse physiological states. The second category, adverse physiological states, refers to those medical or physiological conditions that preclude safe operations. Particularly important to aviation are such conditions as visual illusions and spatial disorientation as described earlier, as well as physical fatigue, and the myriad of pharmacological and medical abnormalities known to affect performance.

The effects of visual illusions and spatial disorientation are well known to most aviators. Less well known, however, and often overlooked, are the effects on cockpit performance of simply being ill. Nearly all of us have gone to work ill, dosed with over-the-counter medications, and generally performed well. Consider, however, the pilot suffering from the common head cold. Unfortunately, most aviators view a head cold as only a minor inconvenience that can be easily remedied using over-the-counter antihistamines, acetaminophen, and other non-prescription pharmaceuticals. In fact, when confronted with a stuffy nose, aviators are typically concerned only with the effects of a painful sinus block as cabin altitude changes. Yet it is not the overt symptoms that local flight surgeons worry about. Rather, it is the accompanying inner-ear infection and the increased likelihood of spatial disorientation when entering instrument meteorological conditions that is alarming - not to mention the side-effects of antihistamines, fatigue, and sleep loss on pilot decision-making. Therefore, it is incumbent upon any safety professional to account for these sometimes subtle medical conditions within the causal chain of events.

Physical/Mental Limitations. The third, and final, substandard condition involves individual physical/mental limitations. Specifically, this category refers to those instances when mission requirements exceed the capabilities of the individual at the controls. For example, the human visual system is severely limited at night; yet, as when driving a car at night, people do not necessarily slow down or take additional precautions. In aviation, while slowing down isn’t always an option, paying additional attention to basic flight instruments and increasing one’s vigilance will often increase the safety margin. Unfortunately, when precautions are not taken, the result can be catastrophic: pilots often fail to see other aircraft, obstacles, or power lines due to the size or contrast of the object in the visual field.

Similarly, there are occasions when the time required to complete a task or manoeuvre exceeds an individual’s capacity. Individuals vary widely in their ability to process and respond to information. Nevertheless, good pilots are typically noted for their ability to respond quickly and accurately. It is well documented, however, that if individuals are required to respond quickly (i.e., less time is available to consider all the possibilities or choices thoroughly), the probability of making an error goes up markedly.

Consequently, it should be no surprise that when faced with the need for rapid processing and reaction times, as is the case in most aviation emergencies, all forms of error would be exacerbated. In addition to the basic sensory and information processing limitations described above, there are at least two additional instances of physical/mental limitations that need to be addressed, although they are often overlooked by most safety professionals. These limitations involve individuals who simply are not compatible with aviation, because they are either unsuited physically or do not possess the aptitude to fly. For example, some individuals simply don’t have the physical strength to operate in the potentially high-G environment of aviation, or for anthropometric reasons, simply have difficulty reaching the controls. In other words, cockpits have traditionally not been designed with all shapes, sizes, and physical abilities in mind. Likewise, not everyone has the mental ability or aptitude for flying aircraft. Just as not all of us can be concert pianists or NFL linebackers, not everyone has the innate ability to pilot an aircraft – a vocation that requires the unique ability to make decisions quickly and respond accurately in life-threatening situations. The difficult task for the safety professional is identifying whether aptitude might have contributed to the accident causal sequence.

Substandard Practices of Operators

Clearly then, numerous substandard conditions of operators can, and do, lead to the commission of unsafe acts. Nevertheless, there are a number of things that we do to ourselves that set up these substandard conditions. Generally speaking, the substandard practices of operators can be summed up in two categories: crew resource mismanagement and personal readiness.

Crew Resource Mismanagement. Good communication skills and team coordination have been the mantra of industrial/organizational and personnel psychology for decades. Not surprisingly, then, crew resource management has been a cornerstone of aviation for the last few decades (Helmreich & Foushee, 1993). As a result, the category of crew resource mismanagement was created to account for occurrences of poor coordination among personnel. Within the context of aviation, this includes coordination both within and between aircraft, with air traffic control facilities and maintenance control, and with facility and other support personnel as necessary. But aircrew coordination does not stop with the aircrew in flight: it also includes coordination before and after the flight, in the brief and debrief of the aircrew. It is not difficult to envision a scenario where a lack of crew coordination leads to confusion and poor decision making in the cockpit, resulting in an accident. In fact, aviation accident databases are replete with instances of poor coordination among aircrew. One of the more tragic examples was the crash of a civilian airliner at night in the Florida Everglades in 1972, while the crew was busily trying to troubleshoot what amounted to a burnt-out indicator light. Unfortunately, no one in the cockpit was monitoring the aircraft’s altitude after the altitude hold was inadvertently disconnected. Ideally, the crew would have coordinated the troubleshooting task, ensuring that at least one crew member was monitoring basic flight instruments and “flying” the aircraft. Tragically, this was not the case: the aircraft entered a slow, unrecognized descent into the Everglades, resulting in numerous fatalities.

Personal Readiness. In aviation, or for that matter in any occupational setting, individuals are expected to show up for work ready to perform at optimal levels. Nevertheless, in aviation as in other professions, personal readiness failures occur when individuals fail to prepare physically or mentally for duty. For instance, violating crew rest requirements or bottle-to-throttle rules, and self-medicating, will all affect performance on the job and are particularly detrimental in the aircraft. It is not hard to imagine that, when individuals violate crew rest requirements, they run the risk of mental fatigue and other adverse mental states, which ultimately lead to errors and accidents. Note, however, that violations affecting personal readiness are not classified as unsafe acts (violations), since they typically do not happen in the cockpit, nor are they necessarily active failures with direct and immediate consequences.

Still, not all personal readiness failures occur as a result of violations of governing rules or regulations. For example, running 10 miles before piloting an aircraft may not be against any existing regulations, yet it may impair the physical and mental capabilities of the individual enough to degrade performance and elicit unsafe acts. Likewise, the traditional “candy bar and coke” lunch of the modern businessman may sound good but may not be sufficient to sustain performance in the rigorous environment of aviation. While there may be no rules governing such behaviour, pilots must use good judgment when deciding whether they are “fit” to fly an aircraft.

Next week we will progress to the next level of the accident causation food chain - Supervision.

Until then,

The Erring Human.

Monday, December 19, 2011

Andersen's Temple of Doom

 “We just received a message from Saddam Hussein. The good news is that he’s willing to have his nuclear, biological and chemical weapons counted. The bad news is he wants Arthur Andersen to do it.”                                 
  -George W. Bush, 2002

Salient points that emerge from the case study are:

1. Policies. Andersen tried to present a defense that the destruction of documents was in accordance with its “document retention policy”. However, the courts were not impressed: while policies are common in business, and it is not illegal for a manager to instruct his employees to comply with a valid document retention policy under ordinary circumstances, policies must be reasonable and evenly applied. In this case, the policy was seen to have been selectively applied only to Enron documents!

2. Role of Supervision. The decision to destroy documents started with Temple, but its implementation was company-wide. No one questioned, much less opposed, Temple’s instructions. Over a dozen of Andersen’s most senior global managers were party to discussions in which pursuit of the mostly ignored document policy was urged, and dozens more of the firm’s lower-level employees carried out the work of purging the record, without objection. The document destruction was not limited to Andersen’s Houston office; Enron records were also destroyed in Chicago, Portland, and London.

3. Corporate Culture. Andersen’s in-house lawyers were expected to “rubber-stamp” all transactions, regardless of ethical or legal propriety. Andersen seemingly expected its employees, including in-house counsel, to protect the “firm” and its clients at all costs, legal or otherwise. Andersen had an “up or out” environment, in which employees either moved up the ranks or were moved out of the firm. By the late 1990s, the surest - and possibly only - way for Andersen employees to move up the ranks was “to keep both their bosses and the people at Enron happy”, and the surest way to do that was to approve every transaction. By contrast, the sure way for Andersen employees to move out of the firm was to dissent from an Enron transaction.

The experience of Andersen partner Carl Bass exemplifies the “yes-man” culture at Andersen. Bass was a senior partner in Andersen’s Houston office. He served on the prestigious “Professional Standards Group (PSG)”, an internal team of accounting experts that reviewed and approved troublesome “accounting issues” confronting local offices. For decades, the PSG’s word was accepted as law at Andersen.

Enron was considered one of Andersen’s highest-risk clients. In February 2001, Bass, who had been assigned “to monitor . . . high-risk audit[s],” strongly objected to Enron’s accounting. Bass’s objection was overruled by local partners in the Houston office; Andersen was the only Big Five accounting firm that allowed local partners to overrule the PSG. Thereafter, Bass continued to object to Enron’s accounting and, not surprisingly, tensions grew between Bass and Enron. Enron “considered him a roadblock to their rapid-fire deal-making.” Rather than stand up for Bass - a member of the PSG - Andersen, in an unprecedented move that was protested by most of the members of the PSG, demoted Bass by removing him from all oversight of the Enron account. Bass was demoted for being too rules-oriented. The demotion was no small matter, as it was approved by Andersen’s CEO, Joe Berardino.

Bass paid the price for saying “no” to a rogue client. At least two other Andersen accountants – Jennifer Stevenson and Pattie Grutzmacher – were also removed from the Enron engagement for challenging Enron’s use of SPEs.  Undoubtedly, these demotions sent a clear message to all Andersen employees, including Temple.

In this environment, how could one expect Nancy Temple, a relatively junior in-house lawyer who had recently been assigned to the Enron account, to say “no” to Enron or senior Andersen partners when she had recently witnessed the demotion of a senior partner for the very same act? Thus, Andersen’s culture presented Temple with an excruciating dilemma: protect Andersen by instructing its employees to destroy Enron’s documents or destroy her career. Unfortunately, she chose the former and, ironically, destroyed Andersen.

4. The Temple of Doom. In conclusion, Andersen’s “Temple of Doom” was its corporate culture, a cult-like culture in which employees were not free to think or act independently. It was this culture - and not greedy partners or unethical lawyers - that doomed Andersen to a needless death.

So, what do we learn from this case study?

This case study is a reaffirmation of the earlier statement that organizations create the climate in which humans work and take decisions. The climate, or culture, created by an organization has a direct bearing on the type of decisions its employees take, and hence the number of errors they make!

The role of supervision is also clearly brought out here. Any degree of supervision over Temple could have prevented this disaster. However, those responsible for supervising Temple went along with her…but we will talk more about this when we discuss the role of supervision in detail.

So, next week we will move on to the higher elements in the error causation food chain.
Until next week,

The Erring Human.

Sunday, December 11, 2011

Arthur Andersen and the Temple of Doom



Well, well, well...as of the date of writing this, the last post got 440 page views and not one comment! Either everyone understood everything, or no one understood anything! I only hope it is the former because, living in a vacuum without any feedback in the form of comments, I have no way of knowing whether the message is reaching the right audience. While I am ecstatic at the number of page views the post got, I feel sad at not having any comments! So guys, please do leave a comment so I know if there are any doubts, disagreements or misunderstandings, and I can then evaluate how fast or slow to proceed!

Today, we will do one more case study. This case study will prepare us to understand the relationship between Human Error and the higher elements in the error causation food chain. The story is about the demise of one of the world's largest auditing firms, Arthur Andersen. The following short videos convey an interesting message and build a background to the case.

Downfall of Arthur Andersen (3 minutes)

Enron - Arthur Andersen (5 minutes)

The story of Nancy Temple (Temple) and Arthur Andersen (Andersen) is infamous in legal ethics. Temple was the in-house lawyer who advised Andersen’s employees to shred documents on the eve of the Securities and Exchange Commission’s (SEC) investigation of Enron Corporation (Enron). Temple’s advice triggered a string of events that culminated in the needless demise of America’s fifth-largest accounting firm.

The accounting firm of Arthur Andersen was founded in 1913. Until 2002, Andersen was one of the world’s largest accounting and consulting firms. It was a $9 billion “Big Five” accounting firm with hundreds of partners, more than 28,000 U.S. employees, and 85,000 employees globally.

Enron started as a natural gas pipeline operator but transformed itself into an energy trading and investment conglomerate in the 1990s. During this time, Andersen audited Enron’s publicly filed statements while simultaneously providing internal auditing and consulting services to Enron. Enron was Andersen’s largest client, accounting for $58 million of Andersen’s revenue in 2000. There was a revolving door between Andersen and Enron, as dozens of Enron’s financial executives and accountants were former Andersen employees.

On August 14, 2001, Jeffrey Skilling, Enron’s CEO, resigned. A few days later, Sherron Watkins, a senior accountant at Enron (and a former Andersen auditor), blew the whistle by informing Kenneth Lay, Enron’s Chairman, and two senior Andersen accountants that Enron was ready to implode in a wave of accounting scandals. At this point, Andersen created an internal crisis-response group, which included Temple, an in-house lawyer in Andersen’s Chicago office. 

On October 9, it was recognized that an SEC investigation of Enron and Andersen was “highly probable”. The very next day, Michael Odom (Odom), the senior Andersen partner on the Enron account, sent an email message to Andersen personnel urging them to comply with Andersen’s document retention policy, noting “if it’s destroyed in the course of normal policy and litigation is filed the next day, that’s great . . . we’ve followed our own policy and whatever there was that might have been of interest to somebody is gone and irretrievable.”

On October 12, Temple entered the Enron matter into Andersen’s internal tracking system, identifying it as a government regulatory investigation. Nonetheless, on that very same day, Temple sent an email to Odom suggesting that it might be useful to “consider reminding the engagement team of our documentation and retention policy.” Odom forwarded Temple’s email to David Duncan (Duncan), Andersen’s audit partner on the Enron account. As he later explained, Duncan felt “justified” destroying documents based on Temple’s email.

Duncan later entered into a plea agreement with the government, under which he agreed to plead guilty to one count of obstruction of justice. At the Andersen trial, Duncan testified on direct examination: “I obstructed justice. I instructed people on the (Enron audit) team to follow the document-retention policy, which I knew would result in the destruction of documents. Obviously, the thought of litigation, whether with the SEC or some other kind, was on our minds when we destroyed the documents.”

On October 16, the SEC sent a letter to Enron informing it that the SEC had commenced an informal investigation and that an additional accounting letter would follow. Andersen received a copy of the SEC letter on October 19. The following morning, a crisis-group conference was called, and Temple reminded everyone to follow the documentation and retention policy. On October 22, Enron publicly acknowledged that the SEC had started an informal investigation. On October 30, the SEC sent Enron a letter informing it that a formal investigation had begun and requesting accounting documents. Andersen, however, continued to destroy documents; in addition, more than 30,000 emails and computer files were deleted. In the end, documents supporting the final audit were retained but, in accordance with Andersen’s document retention policy, drafts, notes, and other non-supporting documents were destroyed. The shredding continued until the SEC issued a subpoena for records. Andersen received a copy of the subpoena on November 8, after which it advised its personnel to cease shredding documents.

On March 7, 2002, Andersen was indicted for obstructing an official proceeding of the SEC in violation of 18 U.S.C. section 1512(b)(2). In effect, Andersen was charged with “witness tampering”.

For all practical purposes, the indictment destroyed Andersen in a matter of weeks, as its clients fled the firm and the firm was forced to slash its workforce and sell off its component services in response. The departing clients included long-term, Fortune 500 clients like Colgate-Palmolive and Merck. Moreover, Andersen’s overseas offices quickly moved to sever ties with their U.S. parent, with entire country groups – e.g., Spain and Chile – leaving Andersen to join other large accounting firms.

On October 16, 2002, the judge imposed the maximum sentence on Andersen: a $500,000 fine and five years’ probation.

It is now clear that Andersen was facing substantial civil and regulatory liability in the Enron matter. But such exposure did not destroy Andersen. Rather, Andersen was destroyed by the criminal indictment and conviction, which were based on Andersen’s destruction of Enron-related documents between October 16 and November 9, 2001.

So, the question to you is: what do you think caused the downfall of this great firm? Is this merely a matter of “Human Error” by Nancy Temple, or is there a larger issue behind it?

I look forward to your comments on this question before we proceed to study Nancy Temple's Human Error in greater detail and decide what really was Arthur Andersen's "Temple of Doom".


Until next week,

The Erring Human. 

Saturday, December 3, 2011

Case Study: Ueberlingen Midair Collision

Before we move any further, let's do a case study to drive home the point about Errors and Violations.

On the night of July 1, 2002, a Boeing 757 collided with a Tupolev-154 at 35,000 feet, resulting in 71 fatalities. The accident was immediately blamed on two individuals: first, the pilot of the Tupolev aircraft, and second, the controller on duty. Let us re-examine the event, highlighting the fundamental human and system errors that occurred that night - errors that contributed to one of the worst midair collisions in recent history.

Kindly visit the following links to view video reconstructions of the events. The first link is a 10-minute video that those of you who are short of time can watch to get the essentials of the accident. Those with more time at hand may prefer the second video, which discusses the case in much more detail over a 45-minute period.

The following narrative draws heavily from the research paper presented by Dr. Ashley Nunes and Dr. Tom Laursen of the University of Illinois Aviation Human Factors Division, Savoy, IL, in coordination with Skyguide, Air Traffic Control Operations, Zurich Area Control Center, Switzerland, at the 48th Annual Meeting of the Human Factors and Ergonomics Society, September 20-24, 2004, New Orleans, LA, USA.

Known Sequence of Events

The Boeing 757 (registered to DHL) was en route from Bergamo (Italy) to Brussels on a heading of 004 degrees at FL 260. The Tupolev-154 (registered to Bashkirian Airlines) was flying from Munich to Barcelona on a heading of 254 degrees at FL 360, correcting its heading twice within the last minute to end up on a heading of 274 degrees. Both aircraft were equipped with the Traffic Alert and Collision Avoidance System (TCAS), and their trajectories put them on a converging course at a 90-degree angle in the airspace above Lake Constance, Germany.

Under a contractual agreement between the German and Swiss governments, this airspace was under the authority of the Zurich Area Control Center (ACC). After making contact with the B757, the Swiss controller issued two clearances to it: first a climb to FL 320 and then, at 21:26:36, a climb to FL 360. At 21:30:11 the T-154 called in. After that, the Swiss controller did not initiate any contact with either aircraft until just seconds before the TCAS aboard each aircraft gave the pilots a traffic advisory. Following this, the controller instructed the T-154 to descend from FL 360 to FL 350 to avoid a collision with the B757. However, the TCAS units on board the T-154 and the B757 instructed the pilots to climb and descend respectively. Faced with contradictory instructions, the T-154 pilot opted to obey the controller and began a descent to FL 350, where the aircraft collided with the B757, which had followed its own TCAS advisory to descend. All 71 people aboard were killed.

(Figure: trajectories of the B757 and T-154.)
At first glance, knowledge of the timeline of events would suggest that two individuals were solely to blame for the accident. First, the Russian pilot, who disobeyed his TCAS and followed controller instructions to descend instead of climbing. Second, and more importantly, the controller, who was fully aware of the presence of both aircraft in his sector but waited more than four minutes before issuing a descent clearance and a traffic information report to the Russian pilot. The controller’s most important task is to ensure safety in the sector, and the controller failed in that task - or did he?

Identification of Contributing Factors

Contributing Factor 1. Single Man Operations. The presence of only one controller working the radar screen represents one of the underlying causes of the accident, namely the lack of supervision or assistance in a safety-critical situation. This Single Man Operation (SMOP) was a controversial procedure implemented in 2001, despite numerous protests from the controllers' union.

Contributing Factor 2. Downgraded Radar. Procedures in force required that when SMOP was in effect, a conflict detection system be on and fully functional. The Zurich ACC’s system, known as the Short Term Conflict Alert (STCA), gave the controller a two-minute warning by visually indicating the presence of a conflict. On that night, maintenance work was being done on the main radar system, which placed radar services in their fall-back mode. As a result, separation minimums between aircraft were increased from 5 miles to 7 miles (corresponding to approximately one minute of flying time). The fall-back radar mode also meant that the STCA was not available. Unit procedures specifically mandated that the STCA be available whenever SMOP was taking place - but it was not.
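To make the STCA's job concrete, here is a highly simplified sketch of short-term conflict detection: project each radar track straight ahead over the two-minute warning horizon and flag any pair predicted to breach the separation minimum. The two-minute look-ahead and the 7-mile fall-back minimum come from the description above; the 1,000 ft vertical minimum, the class and function names, and the sample geometry are illustrative assumptions - real STCA logic is considerably more sophisticated.

```python
import math

# Toy short-term conflict alert (STCA): linearly project each track ahead
# and flag predicted losses of separation. The 120 s look-ahead and 7 NM
# lateral minimum match the text above; the 1,000 ft vertical minimum and
# the sample numbers below are illustrative assumptions.

LOOKAHEAD_S = 120          # two-minute warning horizon
LATERAL_MIN_NM = 7.0       # fall-back radar separation minimum
VERTICAL_MIN_FT = 1000.0   # assumed vertical minimum

class Track:
    def __init__(self, x_nm, y_nm, alt_ft, heading_deg, speed_kt, vs_fpm=0.0):
        self.x, self.y, self.alt = x_nm, y_nm, alt_ft
        self.heading, self.speed, self.vs = heading_deg, speed_kt, vs_fpm

    def project(self, dt_s):
        """Straight-line position and altitude dt_s seconds ahead."""
        dist = self.speed * dt_s / 3600.0        # NM flown in dt_s
        rad = math.radians(self.heading)         # 0 degrees = north
        return (self.x + dist * math.sin(rad),
                self.y + dist * math.cos(rad),
                self.alt + self.vs * dt_s / 60.0)

def conflict_predicted(a, b, lookahead_s=LOOKAHEAD_S, step_s=5):
    """True if a and b are predicted to lose separation within look-ahead."""
    for t in range(0, lookahead_s + 1, step_s):
        ax, ay, az = a.project(t)
        bx, by, bz = b.project(t)
        if (math.hypot(ax - bx, ay - by) < LATERAL_MIN_NM
                and abs(az - bz) < VERTICAL_MIN_FT):
            return True
    return False

# Two level tracks converging at roughly 90 degrees at the same flight
# level, loosely modelled on the geometry described in this case.
b757 = Track(x_nm=0.0, y_nm=-18.0, alt_ft=36000, heading_deg=4, speed_kt=470)
t154 = Track(x_nm=18.0, y_nm=0.0, alt_ft=36000, heading_deg=274, speed_kt=470)
print(conflict_predicted(b757, t154))   # True -> alert the controller
```

Had a system like this been running that night, the controller would have received roughly a two-minute warning of the developing conflict.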

Contributing Factor 3. Dual Frequency Responsibility. The controller had to monitor two display consoles that were separated by over a meter, forcing him to divide his attention for a sustained period of time.

Contributing Factor 4. Phone System. The automated phone system used in the Zurich ACC enabled controllers to communicate with one another at the touch of a button. In addition to inter-facility coordination, the controller could also communicate with ATC facilities in Germany to coordinate local approaches, such as that to the FHA airport. On the night of the accident, the main telephone system was also out for maintenance, and the back-up system had a software failure that no one in the company had noticed - not even during tests run three months before the accident. As a result, when the controller tried to contact the FHA tower to inform them that the second aircraft was requesting a different approach, he could not get through. Given that the phone system had worked perfectly since its implementation more than four years earlier, the controller had a high degree of trust in it; he did not think the system had failed, believing instead that he had dialed the wrong number. He continued his attempts to reach the FHA tower while neglecting to maintain his usual scanning pattern on the other radar console, which depicted the B757 and T-154 converging at the same altitude. The severity of the malfunctioning phone system cannot be overstated. Two minutes before the collision occurred, controllers working the Upper Area Sector at Karlsruhe, Germany, noticed the situation unfolding - their own STCA had gone off - and tried to contact the Swiss controller to warn him. Despite numerous attempts, they could not get through because of the malfunction in the phone system. The controller’s communication with the outside world was essentially cut off. The next line of defense at this point was TCAS.

Contributing Factor 5. TCAS. TCAS is designed to provide not only traffic advisories but also resolution advisories to avoid a midair collision, and it was in fact this system that alerted the pilots of both aircraft to the pending conflict a full seven seconds before the controller, who was busy vectoring another aircraft in for landing using a separate radar screen, issued his instruction. After the pilots were alerted to the conflict, TCAS instructed the DHL pilot to descend and the T-154 pilot to climb. However, the T-154 had already been instructed by the controller to descend.

This sequence requires that two technical issues be considered. First, TCAS does not provide the controller with any information regarding resolution advisories: only the pilots know these advisories. Therefore, the controller had no way of knowing that the system had instructed the T-154 to climb, resulting in an ‘honest’ decision error on his part. Second, and more importantly, TCAS does not account for situations in which one of the aircraft does not follow its instructions. In the present case, the T-154 disobeyed its own TCAS instruction to climb (the pilot opting to follow controller instructions) and descended to FL 350.

The result in the B757 cockpit was an instruction to increase its rate of descent. Had the T-154 remained level at its original altitude of FL 360 instead of descending, safe separation would have been maintained.

The inability of TCAS to make the controller aware of the resolution advisories issued to the pilots, and its failure to account for a pilot executing an alternative action, represent major limitations of the system - limitations that played a role in this event.
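A toy model makes the coordination blind spot easy to see. Real TCAS II units coordinate over a data link so that the two aircraft receive complementary advisories (one "climb", one "descend"); the sketch below compares, under purely illustrative numbers, the vertical separation obtained when both pilots comply against what happens when one aircraft follows ATC and descends instead. The climb/descent rate and the time-to-conflict are assumptions, not data from the accident.

```python
# Toy model of coordinated TCAS resolution advisories (RAs).
# All numbers are illustrative; real TCAS logic is far more complex.

RA_VS_FPM = 1500.0   # assumed commanded climb/descent rate (ft/min)

def miss_distance_ft(vs_a_fpm, vs_b_fpm, time_to_conflict_s,
                     alt_a_ft=36000.0, alt_b_ft=36000.0):
    """Vertical separation at the predicted conflict point."""
    alt_a = alt_a_ft + vs_a_fpm * time_to_conflict_s / 60.0
    alt_b = alt_b_ft + vs_b_fpm * time_to_conflict_s / 60.0
    return abs(alt_a - alt_b)

t = 35.0  # assumed seconds from RA to the crossing point

# Coordinated RAs: B757 commanded to descend, T-154 commanded to climb.
print(miss_distance_ft(-RA_VS_FPM, +RA_VS_FPM, t))   # 1750.0 ft -> safe

# Actual event: the T-154 follows ATC and descends instead of climbing,
# so both aircraft descend toward the same level. The TCAS of that era
# could only strengthen the B757's RA ("increase descent"), which cannot
# help if the other aircraft keeps descending at a comparable rate.
print(miss_distance_ft(-RA_VS_FPM, -RA_VS_FPM, t))   # 0.0 ft -> collision course
```

Under these assumptions, coordinated compliance yields roughly 1,750 ft of vertical separation, while unilateral non-compliance erases it entirely; the RA-reversal logic that addresses exactly this case was introduced only later, in TCAS II version 7.1.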

Contributing Factor 6. Corporate Culture. Whereas the B757 pilot followed the TCAS advisory to descend, the T-154 pilot opted out of following his advisory to climb and followed the controller’s instruction to descend. This raises the question of why the pilots of two separate aircraft would respond to the system in such different ways. When presented with conflicting information from ATC and TCAS, European pilots were advised to follow TCAS, whereas Russian pilots were trained to take both into account before rendering a decision - and in most instances the latter group chose to follow ATC. This may help explain why the B757 pilot (who was British) and the T-154 pilot acted in the manner observed.

Today, of course, all pilots are trained to follow TCAS without any need for approval from, or prior information to, the controller. However, this was not the case back in 2002.

Conclusion. 

As can be seen, what appeared to be a simple case of Human Error on the part of the controller and the pilot turns out, on deeper analysis, to be a combination of organisational and systemic failures. Let's try to understand this more clearly.

Single Man Operation (SMOP). The Zurich ACC had implemented SMOP procedures despite objections from the unions, and applied them even at night, eliminating the safety layer of supervision and assistance from the system. This put the controller under stress and set the stage for a Human Failure. We can classify this as a case of Routine Violation by the organisation.

Short Term Conflict Alert (STCA).  The procedures required STCA to be available when SMOP was in force, but it was not. This can be classified as an Exceptional Violation by the organisation.

The Phone System. When the controller was unable to contact FHA over the phone, he did not think the system had failed, believing instead that he had dialed the wrong number. He continued his attempts to reach the FHA tower while neglecting to maintain his usual scanning pattern on the other radar console, which depicted the B757 and T-154 converging at the same altitude. There are two errors here. The first is a case of applying an incorrect solution to a given problem, or a Problem Solution Error; the second is the breakdown of his scanning pattern, classified as a Technique Error. However, it can clearly be seen that both were precipitated by an overworked controller facing an automation surprise, resulting in cognitive tunneling - a totally avoidable situation!

The Russian Pilots. There is some debate over the actions of the Russian pilots in following the controller instead of TCAS, and some analysts tend to classify that as an Exceptional Violation. However, it must be understood that they acted purely in accordance with their training and the corporate culture prevailing in their company.

This also brings us to another very important cultural point. Individuals born and brought up in the technology-savvy Western world are more comfortable with technology and more likely to trust a machine's input; those who grew up in developing countries are often less comfortable with technology and more likely to believe me, The Erring Human, over the machine - hence the Russian pilots' comfort in following the controller over TCAS ("...he is guiding us down!").

So, as we delve deeper into the details of an error or a violation, we can clearly see the linkages it has to the higher levels in the food chain described earlier. If the concept is clear thus far, we are ready to move on to study the next level in the accident causation food chain.

I look forward to your comments and questions before moving further in the subject.

Until next week,

The Erring Human.