Human Factors Analysis and Classification System

Introduction

The HFACS (Human Factors Analysis and Classification System) describes how accidents are brought about by human error and provides tools to support investigation, training, and accident prevention. HFACS was developed by two doctors, Scott Shappell and Douglas Wiegmann, at the “Civil Aviation Medical Institute and University of Illinois at Urbana-Champaign, USA, in response to a trend that showed some form of human error was a primary causal factor in 80% of all flight accidents in the Navy and Marine Corps” (Reason 12).

“HFACS is based on the ‘Swiss cheese’ model of human error which looks at four levels of human failure, including unsafe acts, preconditions for unsafe acts, unsafe supervision, and organizational influences” (Reason 12). It is an all-inclusive human error framework that places Reason’s ideas in an applied setting, defining nineteen causal categories across four levels of human failure. Wiegmann and Shappell (26) have identified that:

HFACS can be reliably used to identify human factor trends associated with military and general aviation accidents. It was originally used by the US Air Force to investigate and analyze human factor aspects of aviation. Over the years, its application has spread to civil and general aviation as well. Although it is a highly effective tool, the model is not as widespread as desirable. This framework (HFACS) is heavily based upon James Reason’s Swiss cheese model (Wiegmann and Shappell 26).

This paper therefore focuses on the Human Factors Analysis and Classification System. The paper discusses an accident involving the vessel Attilio Ievoli, explains the accident scenario, shows how accident causation theory has changed over time, explains the domino theory and the Swiss cheese model, presents the four levels of analysis in HFACS, and finally generates an HFACS classification for the Attilio Ievoli. The paper concludes by discussing the results of the HFACS classification and proposing safety management measures for such accidents.

Vessel identification and the accident scenario (Attilio Ievoli)

The Attilio Ievoli was an Italian chemical tanker (IMO Type II): a 115.5 m long, 4,450 gross tonnage, double-hulled steel vessel with a service speed of 14 knots. The accident occurred at 1632 (UTC+1) on 3 June 2004 at Lymington Banks in the West Solent (50°43.5′N 001°30.7′W). There were 16 people on board at the time, and fortunately no injuries were reported. The damage found was a “1 meter square indentation, approximately 4 meters inboard from the forward end of the port side bilge keel. Extensive scoring of the bottom paintwork was also realized” (Stewart 44).

The master of the tanker ignored the instructions given to him by his company, which specified the use of the east Solent route. Wishing to save time by taking a shortcut, he put the vessel on autopilot; it then drifted off its track in the West Solent and eventually grounded, causing the damage described above. Poor communication, disregard of instructions, and misallocation of tasks were the main causes of this accident.

Accident causation and how it has changed over time (domino theory and Swiss cheese model)

The domino theory of accident causation is one of the best-known theories in the world today. It explains that accidents are the consequence of a series of sequential occurrences, much like dominoes falling in a line: when one domino falls, it knocks over the next in line, and so on. The theory further holds that if a key factor in the chain (an unsafe act) is removed, the sequence is broken and the accident is prevented.

Unsafe acts and conditions include “unsafe performance of persons i.e. standing under suspended loads, horseplay, removal of safeguards and mechanical or physical hazards such as unguarded gears or insufficient light” (Heinrich 12). Heinrich envisioned five dominoes labeled with the stages of accident causation: “social environment and ancestry, fault of person, unsafe actions, accident, and injury” (Heinrich 14). He explains each domino explicitly and gives advice on minimizing or eliminating its presence in the chain.

The first accident causation factor (social environment and ancestry) concerns employee personality. Undesirable traits such as recklessness, avarice, and stubbornness can be inherited or acquired through an individual’s social environment, and they contribute to the errors an individual makes.

The fault of the person, the second accident causation factor, also deals with employee personality. These aspects are inborn and encompass “bad temper, inconsiderateness, ignorance and recklessness. Natural or environmental flaws in the worker’s life cause the secondary defects, which are contributors to unsafe acts or the existence of unsafe conditions” (Heinrich 14).

The third factor, unsafe acts or conditions, represents direct accident causation. Examples include missing guard rails and a lack of protective measures. Heinrich regarded these as central to accident causation as well as to accident prevention. They arise from “improper attitude, lack of knowledge or skill, physical unsuitability, and improper mechanical or physical environment” (Heinrich 14).

The fourth factor is the accident itself, which Heinrich (16) describes as “the occurrence of a preventable injury due to the natural culmination of a series of events-circumstances which invariably occur in a fixed and logical order”. He defines accidents as events such as falls of persons or the striking of persons by flying objects. The fifth and final domino, injury, is the outcome produced by the preceding chain.
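
To make the causal chain concrete, the following minimal Python sketch (illustrative only; the names and structure are this paper’s own, not drawn from any safety library or from Heinrich’s text) models the five dominoes and shows that removing the central “unsafe act” domino stops the fall before it reaches the accident:

    # Heinrich's five dominoes, in causal order.
    DOMINOES = [
        "social environment and ancestry",
        "fault of person",
        "unsafe act or condition",
        "accident",
        "injury",
    ]

    def run_chain(removed=None):
        """Knock the dominoes over in order; stop if one has been removed."""
        fallen = []
        for domino in DOMINOES:
            if domino == removed:
                break  # the chain is interrupted; nothing downstream falls
            fallen.append(domino)
        return fallen

    # The uninterrupted chain ends in injury.
    assert run_chain()[-1] == "injury"

    # Removing the unsafe act prevents both the accident and the injury,
    # which is exactly Heinrich's prescription for prevention.
    assert "accident" not in run_chain(removed="unsafe act or condition")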

The domino theory of accident causation has changed over time. Like the Swiss cheese model, it has been modernized to reflect improvements in accident safety and the changing social climate.

More recently the theory has been revised; according to Wreathall:

The dominoes have been re-labeled and updated (with a new emphasis on management, and incident as property loss), but the basic structure and premises of the theory are still in place. The revised model re-labels the dominoes as management/loss of control, origins/basic causes, symptoms/immediate causes, contact/incident, and loss of people and property (Wreathall 67).

The Swiss cheese model, on the other hand, is a representation of errors proposed by Reason. Reason explained that “every step of any process carries with it the potentiality of error with varying degrees” (Reason 62). He goes on to say that “the ideal system is analogous to a stack of slices of Swiss cheese. Consider the holes to be opportunities for a process to fail, and each of the slices as ‘defensive layers’ in the process” (Reason 62). A failure can pass through an opening in one layer, but because the openings in the next layer are placed differently, the failure should be stopped there. “Every layer is considered a defense against possible error or failure having an impact on the outcome” (Reason 62).

For a disastrous error or failure to take place, the openings must align at every step in the process, rendering all the defenses useless. If the layers are arranged so that all the openings line up, the system is intrinsically flawed: errors are allowed through from the start and ultimately affect the outcome negatively. “Each slice of cheese is an opportunity to stop an error. The more defenses you put up, the better. Also the fewer the holes and the smaller the holes, the more likely you are to catch/stop errors that may occur” (Reason 64).
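
The model’s logic reduces to simple arithmetic if one assumes, as an idealization not stated in Reason’s text, that the layers fail independently: the probability of a catastrophic outcome is then the product of the per-layer probabilities that a hole lines up. A small Python sketch:

    def p_catastrophe(hole_probabilities):
        """Probability that a failure slips through every defensive layer,
        assuming the layers fail independently (an idealization)."""
        p = 1.0
        for hole in hole_probabilities:
            p *= hole
        return p

    # Four layers, each letting 10% of errors through: only about 1 error
    # in 10,000 reaches the outcome.
    print(p_catastrophe([0.1, 0.1, 0.1, 0.1]))      # ≈ 0.0001

    # Shrinking every hole to 5% cuts the catastrophe rate sixteen-fold,
    # illustrating why fewer and smaller holes matter.
    print(p_catastrophe([0.05, 0.05, 0.05, 0.05]))  # ≈ 6.25e-06

This also shows why aligned holes are so dangerous: if the same flaw appears in every layer, the layers are no longer independent, and the overall risk collapses toward the probability of that single shared flaw.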

Today the Swiss cheese model broadens the role of human beings from a limited view of them as single entities to participants in a very complicated system, such as an organization; it shows that they do not act alone but interact with these complex systems. The model also makes human failures more transparent, showing that they do not occur only at the closing stages of the system (i.e. as active, unsafe failures). “In so doing, it is perpetuating the understanding that humans are fallible in the system, from which a reasonable conclusion may be that removing them from any layer of the system as far as practicable should render safer systems” (Reason 72).

The Swiss cheese model has changed over time, but organizational complexity remains on the human side of the equation. This may be because other possible barriers are developed not within the organizational piece but within the defense-in-depth layers. The model thus continues to place its emphasis on human error within systems.

HFACS’s four levels

Based on Reason’s conception, the Human Factors Analysis and Classification System’s levels of failure are: “1) Unsafe Acts, 2) Preconditions for Unsafe Acts, 3) Unsafe Supervision, and 4) Organizational Influences. A brief description of the major components and causal categories follows, beginning with the level most closely tied to the accident, i.e. unsafe acts” (Helmreich 54).

Unsafe acts fall into two classes: errors and violations. “Not surprising, given the fact that human beings by their very nature make errors, these unsafe acts dominate most accident databases. Violations, on the other hand, refer to the willful disregard for the rules and regulations that govern safety” (Helmreich 57). The prediction and prevention of these appalling and purely avoidable unsafe acts remains the bane of many organizations, continuing to elude administrators and investigators alike. Still, merely distinguishing errors from violations does not provide the level of granularity needed for most investigations.

The varieties of unsafe acts committed by people include skill-based errors, i.e. “failures of attention and/or memory. In fact, attention failures have been linked to many skill-based errors such as the breakdown in visual scan patterns, task fixation, inadvertent activation of controls, and the mis-ordering of steps in a procedure among others” (Reason 67). Decision errors (use of an inappropriate procedure, improper maneuvering, and poor decision-making) are also categorized under unsafe acts, as are perceptual errors, which entail misjudgment, spatial disorientation, and false impressions affecting visual perception. Violations, in turn, entail failing to follow briefs, violating safety rules, failing to prepare correctly for the task, exceeding limits, and operating without adequate qualification.

Arguably, preconditions for unsafe acts are connected to more than 85% of all accidents. “However, simply focusing on unsafe acts is like focusing on a fever without understanding the underlying disease causing it. Thus, investigators must dig deeper into why the unsafe acts took place” (Helmreich 57). In this regard, two divisions of preconditions were created: “substandard conditions of operators and the substandard practices they commit. These were further subdivided into adverse mental states, adverse physiological states, physical/mental limitations, crew resource mismanagement, and personal readiness” (Helmreich 54).

Reason stated that, in addition to the factors associated with operators, unsafe supervision is a major shortcoming that triggers a sequence of events. “The categories of unsafe supervision are inadequate supervision, planned inappropriate operations, failure to correct a known problem, and supervisory violations” (Reason 67).

Inadequate supervision covers failure to provide supervision, instruction, and doctrine, and failure to track performance. Failure to correct a known problem encompasses the inability to identify a risk, failure to initiate corrective measures, and failure to report unsafe tendencies. Planned inappropriate operations entail failure to provide “correct data, inadequate brief timing, improper manning, the mission not being in accordance with rules/regulations and provision of inadequate opportunity for the crew. Finally supervisory violations include: authorized unnecessary hazards, failure to enforce rules and regulations and authorized unqualified crew” (Reason 68).

As mentioned before, fallible decision-making by management has significant effects on supervisory practices, and in turn on the conditions and acts of operators. Regrettably, these errors related to organizational influences go unnoticed most of the time, owing in “large part to the lack of a clear framework from which to investigate them. Generally speaking, the most elusive of latent failures revolve around issues related to resource management, organizational climate, and operational processes” (Reason 67).

Organizational errors, as shown by Reason, are broken down into:

Resource management, organizational climate, and organizational processes. Resource management entails human resources (selection, staffing, and training), monetary/budget resources (excessive cost-cutting and lack of funding), and equipment/facility resources (poor design and purchase of unsuitable equipment). Organizational climate covers structure (chain of command, delegation of authority, and communication), policies (hiring and firing, promotion, and drugs or alcohol), and culture (norms and rules, values and beliefs, and organizational justice). Last on the list are organizational processes, which include operations (tempo, incentives, schedules, and deficient planning), procedures (standards, objectives, documentation, and instructions), and oversight (risk management and safety programs) (Reason 72).
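
For reference, the four levels and the causal categories named above can be laid out as a simple nested mapping. The following Python sketch mirrors this paper’s summary of the framework; it is an informal aid, not an official encoding of HFACS:

    # The HFACS levels and causal categories as summarized above.
    # Informal reference sketch only, not an official encoding.
    HFACS = {
        "unsafe acts": {
            "errors": ["skill-based", "decision", "perceptual"],
            "violations": ["failing to follow briefs", "violating safety rules",
                           "inadequate preparation", "exceeding limits",
                           "inadequate qualification"],
        },
        "preconditions for unsafe acts": {
            "substandard conditions of operators": ["adverse mental states",
                                                    "adverse physiological states",
                                                    "physical/mental limitations"],
            "substandard practices of operators": ["crew resource mismanagement",
                                                   "personal readiness"],
        },
        "unsafe supervision": {
            "categories": ["inadequate supervision",
                           "planned inappropriate operations",
                           "failure to correct a known problem",
                           "supervisory violations"],
        },
        "organizational influences": {
            "categories": ["resource management",
                           "organizational climate",
                           "organizational processes"],
        },
    }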

HFACS classification for the Attilio Ievoli

In analyzing this accident, we can say that none of the officers on board was fatigued, thereby ruling out fatigue as a factor. Human factors, on the other hand, were the major contributors to this accident. As Stewart (44) shows:

The Master’s decision to take the vessel through relatively hazardous waters, without a pilot, and under automatic steering must be questioned. The risks inherent in this scenario were compounded by poor team management, resulting in an inappropriate division of tasks, and a lack of accurate positional awareness. The 2/O knew that the vessel was not following an appropriate course but failed to communicate this to the master. The poor standard of teamwork accepted by the master probably contributed to this failure. Language difficulties probably did not play a part, but cultural differences and communications practice may well have contributed (Stewart, 44).

Another notable failing lay in the bridge team’s management. It was evident that the crew were not clear about each person’s responsibilities, as there had been no briefing before sailing. “Further, the second officer was unable to concentrate on monitoring the vessel’s position because he was used for more menial tasks such as taking down the pilot flag. The cadet should have been employed for this task” (Stewart 23). Lack of proper communication and cultural differences were further issues. The master of the vessel was, in addition, distracted by telephone calls, leaving no one attending to safety measures since no one else appeared responsible.

“The VTS warning of vessels entering and leaving from/to the west had no authority to organize the movement of vessels in the west Solent since this area lies outside the western limit of the port of Southampton” (Stewart 44). This, again, was a causal factor.

In general, there was no briefing before sailing, no definite roles were assigned to the members of the bridge team, the vessel’s position was never fixed, the port radar was not accessible to the second officer, and the echo sounder was not operated efficiently.
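
Purely as an illustration, these failures can be tagged with the HFACS level each falls under. The assignments below are this paper’s reading of the narrative, not an official MAIB coding:

    # Hypothetical HFACS tagging of the Attilio Ievoli causal factors;
    # the level assignments reflect this paper's reading of the events.
    factors = [
        ("master took the West Solent route against company instructions",
         "unsafe acts / violation"),
        ("no pre-sailing briefing and no definite bridge-team roles",
         "unsafe supervision / inadequate supervision"),
        ("2/O diverted to menial tasks instead of monitoring position",
         "unsafe supervision / planned inappropriate operations"),
        ("vessel's position never fixed; echo sounder used inefficiently",
         "unsafe acts / skill-based error"),
        ("off-course warning never confirmed as received by the master",
         "preconditions for unsafe acts / crew resource mismanagement"),
        ("company routing instructions not enforced or monitored",
         "organizational influences / organizational process"),
    ]

    for factor, level in factors:
        print(f"{level:55} <- {factor}")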

Results of the HFACS classification and lessons for navigation and safety management from this accident

“Good seamanship and navigation practice is dictated by the need for extreme caution and thorough passage planning when approaching the coast” (Stewart 23). The lessons for navigation from this accident concern poor team management made worse by cultural differences, errors in initiating safety functions, non-compliance with instructions, and irresponsibility among crew members.

HFACS for the Attilio Ievoli

The following characteristics of human error led to the Attilio Ievoli accident; teamwork, in particular, was uncoordinated. The first error contributing to the accident was the master’s choice of taking the ship via the West Solent, a decision that did not comply with the company’s specifications. The master was aware that this passage had complex tides and hazardous waters, and he did not have a pilot on board.

The master’s initial error of judgment was then compounded by the bridge team’s inefficient teamwork. The pilot of the vessel departed at 1600; after he left, the team did not seem to know their particular roles. Consequently, the performance of the crew’s various tasks was so uncoordinated that there was little or no organization and supervision. The eventual consequence of this lack of accountability was evident at the time of the accident, when no one on the bridge had an accurate appreciation of the vessel’s position.

In terms of teamwork best practice, it is possible to identify three areas where action should have been taken to militate against this state of affairs. First, the master should have prepared a plan for the management of responsibilities during the first few sea watches. This plan should have been briefed to the officers undertaking those watches before sailing, and should have included the master’s instructions for each individual while the pilot was on board, in the period between the pilot’s disembarkation and the watch handover at 1600, and in the period after 1600 (Stewart 44).

Secondly, a formal handover of the watch should have taken place at 1600, together with the establishment of a shared, positive appreciation of the vessel’s situation at that time. The handover should have been used to reiterate each team member’s responsibilities for the watch to come and to formally relieve crew members who had completed their watch.

Thirdly, the vessel’s master, acting in the capacity of officer of the watch, should have exercised effective supervision while he kept the watch. He should have engaged the crew members responsible for the bridge’s operation in order to obtain feedback and to set objectives and priorities, while making certain that every crew member had a suitable and compatible perception of his own situation and of the vessel’s situation. This required each member to understand his own role and to have adequate knowledge of the roles of the others in the vessel’s management. Had this been done, the master would have known of the cadet’s assumption that the ship was on the planned track.

The knowledge that the vessel had veered off course was relayed to the master by the 2/O ten minutes after 1600, a long time before the accident. However, the master said he did not receive the information, and there is ample evidence that he was on the telephone when it was passed to him. Even so, the 2/O should have made certain that the master was aware of this important information. We can therefore say there was also an element of neglect here.

Given that the master was Italian and the 2/O Ukrainian, it is essential to consider social factors, since English was the working language on the bridge. The 2/O may have believed that the master had heard the information and simply chosen not to respond. The 2/O had also been on the ship only a few days and might not have been well acquainted with his colleagues. Earlier, the master had instructed him to lower the ship’s flag, a task usually performed by junior crew, so he may have felt that the master disregarded him. None of this, however, excuses the lack of accountability: the master was on the phone, and the 2/O should have waited or made certain that the message had been received. It is also evident that he received no briefing from the master and thus carried out most of his duties on his own.

Cross-cultural differences (Ukrainian and Italian) may likewise have been a main cause of the interaction failure; the possible incompatibility might have generated friction and thus risk. This may explain the 2/O’s reluctance, as he chose to report the situation to a cadet instead. It is believed that, given his cultural background, the 2/O tended to act on instruction; since he knew his watch was over, he assumed he had been relieved even without formal instructions. This was aggravated by the fact that he was working in a new environment alongside a different nationality.

In summary, we can say that the master was not solely accountable. He took the ship through a hazardous route without a pilot, and with the vessel on automatic steering. “The risks inherent in this scenario were compounded by poor team management, resulting in an inappropriate division of tasks, and a lack of accurate positional awareness” (Stewart 44). The 2/O also failed to ensure the master knew of the ship’s veering off course. Another notable aspect was the poor standard of teamwork accepted by the master himself. Language difficulties were probably insignificant here, but cultural differences and inefficient communication practice may well have contributed.

Conclusion

Still, HFACS, like any other framework, would merely add to the already growing list of faulty taxonomies if its usefulness in operational settings could not be confirmed. With this in mind, “HFACS has been employed by the U.S. Navy, Marine Corps, Army, Air Force, and Coast Guard for use in aviation accident investigation and analysis. To date, HFACS has been applied to analyze human factors data from approximately 1,000 accidents” (Wiegmann and Shappell 26). Throughout these applications, its utility and the suitability of its content have been tested over and over again.

Wiegmann and Shappell further explain that:

Human Factors Analysis and Classification System (HFACS) framework bridges the gap between theory and practice by providing investigators with a comprehensive, user-friendly tool for identifying and classifying the human causes of aviation accidents. The system, which is based upon Reason’s (1990) model of latent and active failures, encompasses all aspects of human error, including the conditions of operators and organizational failure (Wiegmann and Shappell 26).

Early research indicates that the framework is valuable in identifying and analyzing human factor safety issues. Consequently, “the systematic application of HFACS to the analysis of human factors accident data has afforded the U.S. Navy/Marine Corps (for which the original taxonomy was developed) the ability to develop objective, data-driven intervention strategies” (Sanders 153). HFACS has indeed shed light on areas ripe for intervention, rather than relying on individual judgment that is not necessarily geared towards saving lives and reducing accidents.

Furthermore, HFACS, together with insights gleaned from research, has been employed to develop innovative investigation techniques, which in turn have improved the quantity and quality of the human factors information collected during accident investigations. “However, not only are safety professionals better suited to examine human error in the field but, using HFACS, they can now track those areas (the holes in the cheese) responsible for the accidents as well” (Yin 132). It is now possible to track the successes and failures of particular intervention programs put in place to reduce specific human errors and accidents. With HFACS, therefore, investigative investments and safety programs can be reinvented or strengthened to meet the changing needs of accident prevention and safety.

In conclusion, the development of the Human Factors Analysis and Classification System represents a valuable first step in instituting a larger civil aviation accident prevention program. The ultimate goal of HFACS and all other safety measures is to reduce aviation accident rates through systematic, data-driven investment. As this paper has shown, the Attilio Ievoli accident could have been prevented had proper HFACS principles been followed.

Works Cited

Heinrich, William. Domino Theory. New York, NY: Oxford University Press, 2002. Print.

Helmreich, Daniel. Four Levels of Analysis in HFACS. New York, NY: McGraw-Hill, 2001. Print.

Reason, James. Human Error. Cambridge, UK: Cambridge University Press, 2001. Print.

Sanders, Martin. HFACS Intervention in the US Army. New York, NY: McGraw-Hill, 2005. Print.

Stewart, Keith. Marine Accident Investigation. Southampton: Center for Human Science, 2004. Print.

Wiegmann, Douglas, and Scott Shappell. “The Human Factors Analysis and Classification System (HFACS).” Journal on HFACS 23.2 (2001): 12-82. Print.

Wreathall, Thomas. Domino Theory. Aldershot, UK: Ashgate, 2004. Print.

Yin, Liu. An Introduction to Human Factors Engineering. New York, NY: Longman Publishers, 2008. Print.
