No theory forbids me to say "Ah!" or "Ugh!", but it forbids me the bogus theorization of my "Ah!" and "Ugh!" - the value judgments. - Theodor Julius Geiger (1960)

Man-Made Disasters, 2nd edition

Turner, B.A., Pidgeon, N.F. (1997), Man-Made Disasters, 2nd edition, Boston: Butterworth-Heinemann.

In the second edition of Man-Made Disasters, Barry Turner and Nick Pidgeon provide a broad perspective on disasters. They write that these events are rarely the result of random accidents or uncontrollable forces, but arise from complex systemic, social, and organizational failures. Turner and Pidgeon provide a conceptual framework as well as case studies to help readers understand the origins of man-made disasters and, in doing so, offer insights into opportunities for risk reduction. This is truly a modern classic, and I highly recommend it to anyone interested in safety management.

The evolution of disaster studies

Early studies of disasters focused on the technical and medical aspects of catastrophic events, and on the societal responses and processes of recovery that followed them. An example is the book "Catastrophe and Social Change: Based Upon a Sociological Study of the Halifax Disaster" by sociologist Samuel H. Prince.

Turner and Pidgeon take a broader perspective by examining why disasters occur in the first place. They argue that as technological systems become more centralized and complex, the potential for significant disasters grows, especially because small errors in large organizations can cascade into catastrophic outcomes. This socio-technical perspective requires an interdisciplinary approach that draws on psychology, sociology, engineering, and epidemiology. It helps identify not only the technical errors, but also the socio-cultural and organizational mistakes that make systems more vulnerable to disasters.

Understanding disaster cycles and mechanisms

Turner and Pidgeon outline a six-step process for how man-made disasters unfold:

  1. Societies and organizations adopt precautions based on past experience and conventional wisdom;
  2. Over time, small anomalies or risks accumulate unnoticed due to organizational rigidity, information silos, or dismissiveness of external warnings;
  3. Eventually, these latent problems lead to specific warning signals, but organizations often misinterpret or ignore them;
  4. A disruption occurs, exposing deficiencies in both defense systems and organizational awareness;
  5. Organizations rush to respond, often through temporary, ad hoc solutions;
  6. The aftermath leads to examinations and adjustments to norms, beliefs, and regulations, allowing society to adapt to new understandings of risk.

Turner and Pidgeon use examples such as the Aberfan coal tip disaster, the Hixon train crash, and the Summerland fire to illustrate how a mix of technical, administrative, and social failures contributes to disasters. These case studies reveal patterns such as rigid organizational beliefs, ignored warnings, and secondary problems (decoy phenomena) that distract from primary hazards. Disasters should thus not be seen as isolated events but rather as complex failures resulting from an interaction of ignored risks and breakdowns in social and organizational structures.


Bounded rationality and organizational limitations

Bounded rationality (Herbert Simon, 1957) is the idea that individuals and organizations operate within the limits of their knowledge and resources; they often settle for acceptable solutions rather than optimal ones.

Likewise, in complex organizations, decision makers often overlook risks due to limited information, communication breakdowns, or an inability to process dispersed data effectively. Turner and Pidgeon write that while decision makers in structured environments rely on probabilities and estimated outcomes, these models are limited in real-world contexts where predictability is often low. For example, decisions in organizations are sometimes made on the basis of internal rules and beliefs that diverge from external reality. This leads to unforeseen vulnerabilities, as at the Ronan Point tower block, where small mistakes and structural flaws accumulated: a gas explosion in a corner flat caused the collapse of several floors, killing four people and injuring 17.


Information flow failures

Miscommunication and information bottlenecks are recurring themes in Man-Made Disasters. When systems fail to communicate critical information effectively, disaster risk increases. Small, overlooked deviations can signal deeper problems, but these signals are often not picked up due to biases, organizational rigidity, or hierarchical resistance. Examples include hospital contaminations and mining accidents where scattered or suppressed information prevented timely intervention.

Information is often diluted by noise within communication systems, individual cognitive biases, and organizational tendencies to resist external perspectives. These structural deficiencies create conditions where organizations miss opportunities for proactive risk management. As a result, hazards are not confronted until a major event forces recognition and response.


Addressing organizational failures

Turner and Pidgeon advocate promoting risk awareness and open communication within an organization. Instead of mere procedural compliance, all organizational members should share responsibility for safety. This approach is inspired by high reliability organizations (HROs) such as nuclear power plants and air traffic control centers. HROs prioritize safety through decentralized decision-making, real-time information sharing, and a proactive approach to risk management. According to Turner and Pidgeon, organizations need to go beyond traditional safety audits or TQM, which may only address surface issues. Instead, a deeper change in values and shared beliefs about risk creates a resilient organizational structure that is able to anticipate and mitigate risk.

Legal frameworks are needed to support open discussion and adaptability in safety practices. These should protect safety-relevant data from punitive action, and encourage organizations to report and learn openly from incidents. By creating an environment that supports continuous learning and improvement, organizations can better address safety issues and prevent future disasters.

What about "safety culture"?

Disaster risks emerge due to a "discrepancy between some deteriorating but ill-structured state of affairs and the culturally 'taken for granted,'" or the assumptions an organization holds about managing hazards. This discrepancy creates "blindness to certain forms of hazard"; cultural norms may prevent organizations from recognizing risks, e.g. NASA’s culture provided "a way of seeing that was simultaneously a way of not seeing" (p. 187).

What's needed is "senior management commitment to safety; shared care and concern for hazards... realistic and flexible norms and rules," along with continuous learning through reflection and feedback (p. 189). This culture cannot be installed by decree, since it's "a process and not a thing", a dynamic system influenced by "symbolic and rhetorical matters" that are "subtle and difficult to control" (p. 190).

"Safety culture", is not a "homogeneous organizational entity" but a layered, complex set of practices that vary by subgroup and are subject to continuous negotiation. Achieving safety in ill-structured, dynamic contexts is more about sustaining an "ongoing set of (sometimes conflicting) arguments about the organization" than imposing static rules. (p. 191)

Conclusion - Building resilient systems for the future

Man-Made Disasters by Barry Turner and Nick Pidgeon challenges the idea that disasters are isolated or unpredictable events. They show that these incidents often arise from systemic organizational failures, communication breakdowns, and rigid cultural assumptions. Turner and Pidgeon's sociotechnical approach provides a valuable framework for understanding the roots of disasters and offers a blueprint for building resilient organizations capable of managing complex risks. Their insights remain relevant for risk professionals, policymakers, and researchers who want to prevent future catastrophes. Resilience, Turner and Pidgeon argue, may be the most effective defense against the next man-made disaster.