In his book “Techniksoziologie”, the German sociologist Johannes Weyer devotes a chapter to the societal processes involved in the creation and management of technical risks. Research on this topic can be approached from various angles:
- societal and theoretical analyses (such as Ulrich Beck's work on the risk society);
- sociological analyses of the roles played by actors in these processes;
- sociological concepts related to the real-world experiments that result from an open mode of production;
- the perspective of organizational theory on how organizations handle risks;
- research on human error.
The limits of probabilistic calculus
Weyer begins by introducing the interdisciplinary context of social scientific risk research, including the use of probabilistic methods to mathematically assess the likelihood and potential impact of risks in order to guide decision-making about technological systems.
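As a minimal sketch of that calculus (this is the standard formula of probabilistic risk assessment, not a notation taken from Weyer's chapter): a risk R is quantified as the probability P of an adverse event multiplied by the extent of its damage D,

$$R = P \times D.$$

By this measure, one accident killing 1,000 people and 1,000 accidents each killing one person carry exactly the same risk; as the studies discussed below show, lay perception sharply distinguishes the two cases.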
In the 1980s, researchers attempted to understand how laypeople subjectively perceive risk. These studies found that qualitative factors such as "catastrophe potential" and the "involuntariness" of a risk are particularly important for laypeople's perception and acceptance of risk; on these criteria, laypeople rated nuclear energy as the greatest risk. For laypeople, it makes a difference whether thousands of people are affected by a single event or one person is affected in each of thousands of events, even though the expected damage is the same. Voluntariness also plays an important role, as does the topicality of recent catastrophes, which highlights the role of the media in laypeople's perception of risk.

The sociologist Brian Wynne took a different approach, conducting case studies on trust in institutions in the 1980s. He found that laypeople's assessment of risk depends heavily on the performance of the relevant political-administrative, economic, and scientific institutions. Wynne identified three variables:
- the perceived fairness of the process;
- the attribution of institutional competence, which depends on previous experiences with the institution's problem-solving abilities;
- the institutional transparency of processes, structures, and information flow.
Both the social-psychological studies and the more institutionally focused sociological work cast doubt on the hope that (a) expert and lay perspectives can be reconciled and (b) a society-wide consensus on risk can be reached. As a result, social-scientific risk analysis took different directions in the following years.
From risk society to the theory of reflexive modernization
Following Ulrich Beck’s concept of "reflexive modernization," Weyer considers a second modernity characterized by the release of social, cultural, and economic developments that were previously blocked or constrained in the first modernity, i.e. the nation-state-based, industrial, and only partially democratic society. The transition to this second modernity happens gradually and without a revolutionary subject, and it involves characteristic shifts, such as the move from a principle of "either-or" to one of "both-and". Hybrid forms arise in this second modernity, such as new forms of world politics and of family structures. Weyer resists the idea of complete fragmentation, because it would make it impossible to assign responsibility and to draw the boundaries needed for decision-making. He also notes that reflexive modernity remains vague and difficult to pin down: it is defined negatively, in relation to the first modernity, and it calls for fundamental changes and reorientations without being explicitly linked to specific empirical observations.
Real experiments in the risk society
Weyer proposes the idea of experimental implementation as a way to understand how modern technology is developed and tested. The development of new technologies involves not only laboratory testing but also field testing under real-world conditions, which can produce valuable insights. Weyer identifies four types of real experiments that fall under the concept of experimental implementation:
1. accidents or incidents that reveal unforeseen information about a technology;
2. the improvement of prototypes through field testing;
3. the release of artificial substances or radiation with unintended consequences for human subjects;
4. nonlinear and recursive feedback processes that involve the interaction of technology with its environment.
Weyer argues that these real experiments should be recognized and analyzed as an integral part of technological development, rather than dismissed as a mere byproduct of, or deviation from, the idealized model of laboratory testing. He sees real experiments as a central, almost ubiquitous feature of modern societies and as essential drivers of social change and innovation; they should be embraced rather than treated as aberrations from the traditional model of knowledge production, which validates theories through laboratory experiments.
Real experiments should be conducted transparently and with the informed consent of those affected, in order to avoid unethical or illegitimate research practices. Weyer distinguishes four types of real experiments by combining the dimensions of nomothetic (aiming for generalizability) and idiographic (focusing on specific cases) interests and goals. He also invokes the concept of "Mode 2" knowledge production, which emphasizes interdisciplinary and applied research and the role of research and innovation in driving economic growth and social development.
The risks of complex technical systems
Weyer discusses complex technological systems and the risks they pose. Such systems are often characterized by unexpected, difficult-to-predict interactions, feedback loops, and growing complexity over time, and accidents in them typically result from organizational rather than individual factors. Charles Perrow has argued that these risks can be better understood by examining the interaction and coupling of a system's components, as well as the organizational factors that shape its design and operation. His book "Normal Accidents: Living with High-Risk Technologies" seeks to understand the risks of complex technical systems such as nuclear power plants. Perrow argues that accidents in such systems are inevitable because of their complexity and the tight coupling of their components, and he identifies two indicators for assessing risk: the interaction of system components (linear or complex) and the coupling of those components (loose or tight). Loose coupling allows errors to be corrected more easily, while tight coupling leads to unexpected interactions and cascading failures. "Normal accidents", in Perrow's sense, occur when a system's complexity and tight coupling make it prone to unexpected interactions and failures even while it operates within normal parameters. He concludes that the only way to manage the risks of complex technical systems effectively is through strict regulation and oversight.
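Combining the two dimensions yields a simple two-by-two scheme. As an illustration (the example systems follow the placements in Perrow's own interaction/coupling chart):
- linear interactions, loose coupling: most assembly-line manufacturing;
- linear interactions, tight coupling: dams, rail transport;
- complex interactions, loose coupling: universities;
- complex interactions, tight coupling: nuclear power plants.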
High-reliability organizations
Charles Perrow's theory of complex technological systems suggests that disasters result from the complex interactions and tight couplings within a system rather than from individual failures or isolated technical malfunctions. Other researchers have challenged this view, arguing that high-reliability organizations, characterized by a strong consensus on goals, formal procedures, and intensive training, can effectively prevent disasters. The debate over the causes of disasters in complex technological systems remains ongoing.