Saturday, January 21, 2012

Organisational Influences

As noted previously, fallible decisions of upper-level management directly affect supervisory practices, as well as the conditions and actions of operators. Unfortunately, these organizational errors often go unnoticed, due in large part to the lack of a clear framework from which to investigate them. Generally speaking, the most elusive of latent failures revolve around issues related to resource management, organizational climate, and operational processes, as detailed in the figure below.

Selected examples of Organisational Influences (Note: This is not a complete listing.)

Resource/Acquisition Management
- Human resources: selection, staffing/manning, training
- Monetary/budget resources: excessive cost cutting, lack of funding
- Equipment/facility resources: poor design, purchasing of unsuitable equipment

Organisational Climate/Culture
- Structure: chain of command, delegation of authority, communication, formal accountability for actions
- Policies: hiring and firing, promotions, drugs and alcohol
- Culture: norms and rules, values and beliefs, organisational justice

Organisational Processes
- Operations: operational tempo, time pressure, production quotas, incentives, appraisal, schedules, deficient planning
- Procedures: standards, clearly defined objectives, documentation, instructions
- Oversight: risk management, safety programs

Resource Management. This category encompasses the realm of corporate-level decision making regarding the allocation and maintenance of organizational assets such as human resources (personnel), monetary assets, and equipment/facilities. Generally, corporate decisions about how such resources should be managed centre around two distinct objectives – the goal of safety and the goal of on-time, cost-effective operations. In times of prosperity, both objectives can be balanced and satisfied in full. However, as mentioned earlier, there may also be times of fiscal austerity that demand some give and take between the two. Unfortunately, history tells us that safety is often the loser in such battles and, as many can attest, safety and training are often the first to be cut in organizations experiencing financial difficulties. If cutbacks in such areas are too severe, flight proficiency may suffer, and the best pilots may leave the organization for greener pastures. Excessive cost-cutting can also reduce funding for new equipment, or lead to the purchase of equipment that is suboptimal and inadequately designed for the type of operations flown by the company. Other trickle-down effects include poorly maintained equipment and work spaces, and the failure to correct known design flaws in existing equipment. The result is a scenario in which unseasoned, less-skilled pilots fly old, poorly maintained aircraft under the least desirable conditions and schedules. The ramifications for aviation safety are not hard to imagine.

Climate. Organizational Climate refers to a broad class of organizational variables that influence worker performance. Formally, it was defined as the “situationally based consistencies in the organization’s treatment of individuals” (Jones, 1988). In general, however, organizational climate can be viewed as the working atmosphere within the organization. One tell-tale sign of an organization’s climate is its structure, as reflected in the chain-of-command, delegation of authority and responsibility, communication channels, and formal accountability for actions.

Just like in the cockpit, communication and coordination are vital within an organization. If management and staff within an organization are not communicating, or if no one knows who is in charge, organizational safety clearly suffers and accidents do happen (Muchinsky, 1997). An organization’s policies and culture are also good indicators of its climate. Policies are official guidelines that direct management’s decisions about such things as hiring and firing, promotion, retention, raises, sick leave, drugs and alcohol, overtime, accident investigations, and the use of safety equipment. Culture, on the other hand, refers to the unofficial or unspoken rules, values, attitudes, beliefs, and customs of an organization. Culture is “the way things really get done around here.”

When policies are ill-defined, adversarial, or conflicting, or when they are supplanted by unofficial rules and values, confusion abounds within the organization. Indeed, some corporate managers are quick to pay lip service to official safety policies in a public forum, only to overlook those same policies when operating behind the scenes. But order and harmony cannot be produced from such chaos and disharmony, and safety is bound to suffer under such conditions.

Organisational Process. This category refers to corporate decisions and rules that govern the everyday activities within an organization, including the establishment and use of standardized operating procedures and formal methods for maintaining checks and balances (oversight) between the workforce and management. Factors such as operational tempo, time pressure, incentive systems, and work schedules can all adversely affect safety. As stated earlier, there may be instances when those within the upper echelon of an organization decide to increase the operational tempo to a point that overextends a supervisor’s staffing capabilities. A supervisor may then resort to inadequate scheduling procedures that jeopardize crew rest and produce suboptimal crew pairings, putting aircrew at increased risk of a mishap. Organizations should have official procedures in place to address such contingencies, as well as oversight programs to monitor the attendant risks. Regrettably, not all organizations have such procedures, nor do they actively monitor aircrew errors and human-factors problems through anonymous reporting systems and safety audits. As a result, supervisors and managers are often unaware of the problems before an accident occurs. Indeed, it has been said that “an accident is one incident too many” (Reinhart, 1996). It is incumbent upon any organization to fervently seek out the “holes in the cheese” and plug them up before they create a window of opportunity for catastrophe to strike.
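Before moving on, it may help to see the taxonomy in a more concrete form. The short sketch below is purely illustrative (written in Python; the dictionary and the categorise helper are my own naming, not part of HFACS or any official tool): it simply records the listing from the figure above as a nested data structure and looks up the top-level category that a given factor falls under.

# Illustrative sketch only: the Organisational Influences listing above,
# held as a nested dict so that investigation findings can be tagged
# consistently. Names follow the figure; the helper function is hypothetical.

ORGANISATIONAL_INFLUENCES = {
    "Resource/Acquisition Management": {
        "Human resources": ["selection", "staffing/manning", "training"],
        "Monetary/budget resources": ["excessive cost cutting", "lack of funding"],
        "Equipment/facility resources": ["poor design", "purchasing of unsuitable equipment"],
    },
    "Organisational Climate/Culture": {
        "Structure": ["chain of command", "delegation of authority",
                      "communication", "formal accountability for actions"],
        "Policies": ["hiring and firing", "promotions", "drugs and alcohol"],
        "Culture": ["norms and rules", "values and beliefs", "organisational justice"],
    },
    "Organisational Processes": {
        "Operations": ["operational tempo", "time pressure", "production quotas",
                       "incentives", "appraisal", "schedules", "deficient planning"],
        "Procedures": ["standards", "clearly defined objectives",
                       "documentation", "instructions"],
        "Oversight": ["risk management", "safety programs"],
    },
}

def categorise(factor):
    """Return the top-level category a factor or factor group belongs to, if any."""
    wanted = factor.lower()
    for category, groups in ORGANISATIONAL_INFLUENCES.items():
        for group, examples in groups.items():
            if wanted == group.lower() or wanted in (e.lower() for e in examples):
                return category
    return None

print(categorise("excessive cost cutting"))  # Resource/Acquisition Management
print(categorise("Oversight"))               # Organisational Processes

Keeping the taxonomy as plain data like this makes it easy for a safety office to tag findings consistently and to extend the listing as new latent failures come to light.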

So much for today. In the next post we will study yet another take on Organisational Influences before doing a case study to drive home the point.

Until then,

The Erring Human.

Saturday, January 14, 2012

SUPERvision

Unsafe Supervision

In addition to those causal factors associated with the pilot/operator, Reason (1990) traced the causal chain of events back up the supervisory chain of command. Four categories of unsafe supervision can thus be identified: inadequate supervision; planned inappropriate operations; failure to correct a known problem; and supervisory violations. Each is described briefly below.



Selected examples of Unsafe Supervision (Note: This is not a complete listing.)

Inadequate Supervision
- Failed to provide guidance
- Failed to provide operational doctrine
- Failed to provide oversight
- Failed to provide training
- Failed to track qualifications
- Failed to track performance

Planned Inappropriate Operations
- Failed to provide correct data
- Failed to provide adequate brief time
- Improper manning
- Mission not in accordance with rules/regulations
- Provided inadequate opportunity for crew rest

Failed to Correct a Known Problem
- Failed to correct a document in error
- Failed to identify an at-risk aviator
- Failed to initiate corrective action
- Failed to report unsafe tendencies

Supervisory Violations
- Authorised unnecessary hazard
- Failed to enforce rules and regulations
- Authorised unqualified crew for flight

Inadequate Supervision. The role of any supervisor is to provide the opportunity to succeed. To do this, the supervisor, no matter at what level of operation, must provide guidance, training opportunities, leadership, and motivation, as well as the proper role model to be emulated. Unfortunately, this is not always the case. For example, it is not difficult to conceive of a situation where adequate crew resource management training was either not provided, or the opportunity to attend such training was not afforded to a particular aircrew member. Conceivably, aircrew coordination skills would be compromised and if the aircraft were put into an adverse situation (an emergency for instance), the risk of an error being committed would be exacerbated and the potential for an accident would increase markedly.

In a similar vein, sound professional guidance and oversight are essential ingredients of any successful organization. While empowering individuals to make decisions and function independently is certainly essential, this does not divorce the supervisor from accountability. The lack of guidance and oversight has proven to be the breeding ground for many of the violations that have crept into the cockpit. As such, any thorough investigation of accident causal factors must consider the role supervision plays (i.e., whether the supervision was inappropriate or did not occur at all) in the genesis of human error.

Planned Inappropriate Operations. Occasionally, the operational tempo and/or the scheduling of aircrew is such that individuals are put at unacceptable risk, crew rest is jeopardized, and ultimately performance is adversely affected. Such operations, though arguably unavoidable during emergencies, are unacceptable during normal operations. Therefore, the second category of unsafe supervision, planned inappropriate operations, was created to account for these failures.

Take, for example, the issue of improper crew pairing. It is well known that when very senior, dictatorial captains are paired with very junior, weak co-pilots, communication and coordination problems are likely to occur. Commonly referred to as the trans-cockpit authority gradient, such conditions likely contributed to the tragic crash of a commercial airliner into the Potomac River outside of Washington, DC, in January 1982 (NTSB, 1982). In that accident, the captain of the aircraft repeatedly rebuffed the first officer when the latter indicated that the engine instruments did not appear normal. Undaunted, the captain continued the fatal take-off in icing conditions with less than adequate take-off thrust. The aircraft stalled and plummeted into the icy river, killing the crew and many of the passengers.

Clearly, the captain and crew were held accountable. They died in the accident and cannot shed light on causation; but what was the role of the supervisory chain? Perhaps crew pairing was equally responsible. Although not specifically addressed in the report, such issues are clearly worth exploring in many accidents. In fact, in that particular accident, several other training and manning issues were identified.

Failure to Correct a Known Problem. The third category of unsafe supervision, Failed to Correct a Known Problem, refers to those instances when deficiencies among individuals, equipment, training or other related safety areas are “known” to the supervisor, yet are allowed to continue unabated. For example, it is not uncommon for accident investigators to interview a pilot’s friends, colleagues, and supervisors after a fatal crash, only to find out that they “knew it would happen to him some day.” If the supervisor knew that a pilot was incapable of flying safely, yet allowed the flight anyway, he clearly did the pilot no favours. The failure to correct the behaviour, either through remedial training or, if necessary, removal from flight status, essentially signed the pilot’s death warrant, not to mention that of others who may have been on board.

Likewise, the failure to consistently correct or discipline inappropriate behaviour certainly fosters an unsafe atmosphere and promotes the violation of rules. Aviation history is rich with reports of aviators who tell hair-raising stories of their exploits and barnstorming low-level flights (the infamous “been there, done that”). While entertaining to some, such stories often promulgate a perception of tolerance and “one-upmanship” until, one day, someone ties the low-altitude flight record: ground level! Indeed, the failure to report these unsafe tendencies and initiate corrective actions is yet another example of the failure to correct known problems.

Supervisory Violations. Supervisory violations, on the other hand, are reserved for those instances when existing rules and regulations are wilfully disregarded by supervisors. Although arguably rare, supervisors have occasionally been known to violate the rules and doctrine when managing their assets. For instance, there have been occasions when individuals were permitted to operate an aircraft without current qualifications or license. Likewise, it can be argued that failing to enforce existing rules and regulations, or flouting authority, are also violations at the supervisory level. While rare and sometimes difficult to uncover, such practices are a flagrant violation of the rules and invariably set the stage for the tragic sequence of events that predictably follows.
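To make the four categories a little more concrete, here is a second minimal sketch, again in Python and again purely illustrative: the Finding record and the example findings are invented for demonstration, not drawn from any actual investigation. It shows how an investigator might tag findings against the categories above and tally them, much as a safety office might when trending reports.

# Illustrative sketch only: the four Unsafe Supervision categories as an Enum,
# with a hypothetical Finding record. The example findings are invented.

from collections import Counter
from dataclasses import dataclass
from enum import Enum

class UnsafeSupervision(Enum):
    INADEQUATE_SUPERVISION = "Inadequate Supervision"
    PLANNED_INAPPROPRIATE_OPERATIONS = "Planned Inappropriate Operations"
    FAILED_TO_CORRECT_KNOWN_PROBLEM = "Failed to Correct a Known Problem"
    SUPERVISORY_VIOLATIONS = "Supervisory Violations"

@dataclass
class Finding:
    description: str
    category: UnsafeSupervision

findings = [
    Finding("Very junior first officer paired with a domineering captain",
            UnsafeSupervision.PLANNED_INAPPROPRIATE_OPERATIONS),
    Finding("No remedial action after unsafe tendencies were reported",
            UnsafeSupervision.FAILED_TO_CORRECT_KNOWN_PROBLEM),
    Finding("Crew authorised to fly without current qualifications",
            UnsafeSupervision.SUPERVISORY_VIOLATIONS),
]

# Tally findings per category, to see where problems cluster.
for category, count in Counter(f.category for f in findings).items():
    print(f"{category.value}: {count}")

Nothing here is specific to aviation; the same simple tagging works for any organisation that wants to see where its supervisory “holes in the cheese” are clustering.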

So much for the role of supervision in controlling human error. Can you now relate how poor supervision contributed to the doom of Arthur Andersen? There are many case studies that highlight the role of supervision in preventing human error, and we will do some later in this journey. For the present, if there are no questions, we will move on to the lion of the accident-causation food chain: the Organisation.

Until next week,

The Erring Human.