Beyond Bad Apples
Leaving Behind the Notion of Accident Proneness
Years ago, I was part of an accident investigation team after an installation that had been commissioned too early caused heavy material damage. When I asked the manager about speaking to the operator involved, the response I received was, “Oh, we let him go. That man was no good. He was involved in an accident not once, but TWICE!”
Within organizations, discussions of errors and failures often invoke an individual’s supposed accident proneness. The idea originated from statistical observations suggesting that accidents were not randomly distributed among workers: a small group appeared to account for a disproportionate share of them. From this, certain individuals were credited with traits that made them inherently more likely to have accidents. A century ago, this focus on individual traits dominated the understanding of accidents, and although accident proneness has resisted statistical proof ever since, attempts to define and measure it persist.
The critical error lies in failing to account for unequal exposure: individuals who face riskier tasks or environments will accumulate more accidents even if they are no more careless than anyone else. Experts in safety-critical domains may therefore appear accident-prone precisely because their expertise earns them the riskiest assignments and the highest expectations. Their reputation becomes embedded in the organizational culture, shaping both the tasks they are given and how their capabilities are perceived.
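To see how unequal exposure alone can manufacture apparent accident proneness, consider a minimal simulation sketch. Everything in it is an illustrative assumption rather than data from any study: every worker shares the identical true rate of 0.5 accidents per 1,000 hours, and only the hours of risky work differ.

```python
import math
import random

random.seed(42)

TRUE_RATE = 0.5 / 1000   # identical accidents-per-hour rate for EVERY worker
N_WORKERS = 200

def poisson(lam: float) -> int:
    """Poisson draw via Knuth's multiplication method (fine for small lam)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

# Unequal exposure: hours spent on risky tasks vary widely across workers.
exposure = [random.lognormvariate(7.0, 1.0) for _ in range(N_WORKERS)]  # hours

# Accident counts are driven only by exposure, never by individual traits.
accidents = [poisson(TRUE_RATE * h) for h in exposure]

# Ranked by raw count, the high-exposure workers look "accident prone"...
worst = sorted(range(N_WORKERS), key=lambda i: accidents[i], reverse=True)[:5]
for i in worst:
    per_1000h = 1000 * accidents[i] / exposure[i]
    print(f"worker {i:3d}: {accidents[i]} accidents over {exposure[i]:6.0f} h "
          f"({per_1000h:.2f} per 1000 h)")
# ...even though, by construction, no worker is more careless than another.
```

Ranking workers by raw accident counts, as the manager in the opening anecdote implicitly did, confuses exposure with disposition; once the counts are normalized by hours at risk, the apparent proneness dissolves into sampling noise.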
Bureaucracies, fixated on maintaining normality, tend to scapegoat individuals when anomalies arise, and in doing so they fail to recognize the systemic flaws that contribute to accidents. Investigations can take different perspectives: some focus on individual-level interventions, while others analyze the systemic factors behind what looks like human error. Each kind of explanation may be valid on its own terms.
It is important to remember that attributing a singular cause to an accident is a social choice rather than a technical judgment. Privileging one explanation means missing opportunities for safety improvement elsewhere, so it is crucial to broaden the perspective and understand how the explanations interrelate. Accidents occur despite good intentions; ensuring safety therefore requires understanding the interactions among components, actors, and technology.
In healthcare, for instance, the complexity of the environment contributes to incidents during operations. Hospitals routinely conduct root cause analyses; these often begin by assigning blame but, carried through, end up revealing safety issues inherent in the processes themselves.
Understanding errors in organizational contexts thus means recognizing their multifaceted nature and the difficulty of effecting meaningful change within rigid structures. Labels like “accident prone” or “human error” act as a barrier to deeper investigation of systemic failures. Improving safety and mitigating risks effectively requires a systemic approach that encompasses the technical, social, and organizational dimensions alike.
References
Cook, R.I. and Nemeth, C.P. (2010), ‘“Those found responsible have been sacked”: Some observations on the usefulness of error’, Cognition, Technology & Work, 12, pp. 87–93.
Dekker, S. (2019), Foundations of Safety Science: A Century of Understanding Accidents and Disasters, Boca Raton, FL: CRC Press.
Luhmann, N. (1991), Soziologie des Risikos, Berlin: Walter de Gruyter.