No theory forbids me to say "Ah!" or "Ugh!", but it does forbid me the bogus theorizing of my "Ah!" and "Ugh!" - the value judgments. - Theodor Julius Geiger (1960)

Safety Management Beyond Simplification

“A system is complex when it cannot be calculated, even if one has complete information about all of its elements.” – Armin Nassehi, Paradox 17 conference

In safety management we often try to define what is safe and what is unsafe. We do this through language and communication. Consensus on 'the root cause' of an incident is reached at the point where communication stops - until someone with a different perspective adds their two cents. We often look at actions at the coal face, and less often at the plan on which the work is based, or at the company policies, regulations, and laws behind it.

We also try to simplify complex systems, even though they are beyond simplification. Simplification can work for complicated systems, which can be analyzed, but it does not work for complex systems (Snowden and Boone, 2007). Interactive complexity resists linear causal explanation: it involves many components with non-straightforward relations that go beyond analytical understanding (Hollnagel, 2012). How can we deal with the inherent complexity of our socio-technical systems?

 

The Lure of Simplification

Do you know those safety videos that portray safety as a choice between safe and unsafe, like the flip of a coin? Choosing "the safe option" gives a feeling of security. But because categorizing situations as safe or unsafe often oversimplifies complex realities, that sense of security can be false, and the associated solutions can prove ineffective.

The appeal of simplification - already present more than a hundred years ago, when Frederick Taylor decomposed tasks for local optimization - is everywhere in management. Automation is used to replace people with technology. Training and design are used to adapt people to machines and vice versa. Although valuable when used properly, these solutions have been criticized for hiding underlying problems and legitimizing the introduction of new technologies. They were often applied reactively, after technologies were already in use, rather than preventing problems in advance (Hollnagel, 2012). Unfortunately, more often than not, these solutions increase complexity instead of reducing it. This is the dilemma facing organizations: they long for simple, lean, complexity-reducing structures, but such structures would actually create more ambiguity, not less.

For example, in safety management we use the hierarchy of controls - simple but arbitrary categories - to justify actions taken or not taken. We ignore the fact that all engineering designs require human action to be effective, and that all administrative controls manifest themselves through some kind of design. We 'eliminate' hazards, but other hazards arise in their place (Rae, 2023).

The reflex to simplify safety management strategies carries risks of its own. Simplification can quickly lead to unmanageable complexity, partly because the growing complexity goes unnoticed, especially in increasingly interdependent systems. Some safety management consultants assume that simple "life-saving" rules and "take five" structures lead to simple, low-complexity organizations. But because of unforeseen interactions in the real world, simplification can lead to ineffective solutions and unintended consequences.

For example, 'blanket rules' can pressure people to adhere to the rules strictly, even in unforeseen critical situations. They can also discourage people from raising concerns or reporting near misses. And they can give a false sense of security, by assuming that standard procedures always lead to optimal performance. Likewise, a "one-size-fits-all evacuation plan" may not suit the unique challenges that different situations present, potentially leading to confusion, delays, or inappropriate responses during a real crisis. Finally, the problem of complex action situations is not solved by "better moral insight" or "solidarity" (Nassehi, 2017), because these simplify something that is considerably - well - more complex. Dealing with complexity is therefore not a problem that can simply be eliminated, but a ubiquitous and necessary aspect of normal work (Hollnagel, 2012).

 

Deregulation Leads to More Regulation

So, simplification does not take the complexity out of psychological and social systems. Over the past decades, government safety regulation has been driven by deregulation, simplification, and inspection regimes at the organizational level. Organizations have been given the freedom to implement safety management systems appropriate to their operations. Yet companies act differently than intended: they document their safety management systems heavily and adopt punitive policies. While regulators may state that safety management plans are form-free, companies search desperately for templates to fill in and administer. They document their safety management extensively and standardize ever more procedures. Why? Because of work auditability, managerial insecurity, and liability (Størkersen et al., 2020).

 

The Complexity Dilemma

Complexity occurs when the agents in a network of interconnected agents have a certain amount of autonomy (Plsek and Greenhalgh, 2001). As a result, what may happen in the system is uncertain. Simplification can contribute to managing complexity, but it is not its counterpart. As our systems have become more complex, it has become more difficult to negotiate and resolve conflicts, and different logics and problem-solving approaches have emerged.

A single distinction like safe/unsafe is not effective for these systems. Neither is root cause analysis: in complex systems, deciding when to stop and call something the root cause is a social choice. We can use narratives to try to understand and describe complex systems. But even then, tension arises when the desire for a solution or punchline conflicts with the complexity of the situation.

The risk of the complexity dilemma lies not so much in the dilemma itself, but in the inability to perceive it (Kühl, 2022). Predetermined frameworks or solutions have their place in simple, mechanical situations, but because of the interaction between people, technology, environment and organizations (power, culture, task division, strategy and structure), safety management has to deal with complexity as well. For this, we need interdisciplinary working groups and other forms of multi-professional organization that address the problem of different perspectives (Nassehi, 2017). We also must learn to tolerate and endure differences.

The bottom line is that the socio-technical systems we work in inherently involve paradoxes and challenges. A system is complex when it cannot be fully calculated even with complete information about its elements. We then deal with uncertainty instead of mere risk (Knight, 1921). Under uncertainty, with imperfect knowledge, people use heuristics instead of risk management categories. Heuristics exploit learned and evolved capacities to make fast judgments. In complex systems they can outperform analysis and calculation, making predictions without having to fit observed data (Mousavi and Gigerenzer, 2014).
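To make the idea of a heuristic concrete, here is a minimal sketch of the "take-the-best" heuristic studied by Gigerenzer and colleagues: it compares two options on cues ordered by validity and decides on the first cue that discriminates, ignoring all remaining information. The safety scenario, the cue names, and their ordering are hypothetical illustrations, not part of the cited research.

```python
def take_the_best(option_a, option_b, cues):
    """Return 'a', 'b', or 'tie'. `cues` is a list of functions, ordered
    from most to least valid, each mapping an option to 1 (cue present)
    or 0 (cue absent)."""
    for cue in cues:
        a, b = cue(option_a), cue(option_b)
        if a != b:                # the first discriminating cue decides
            return "a" if a > b else "b"
    return "tie"                  # no cue discriminates

# Hypothetical question: which of two plants carries the higher incident risk?
plant_x = {"overdue_maintenance": 1, "night_shifts": 0, "new_staff": 1}
plant_y = {"overdue_maintenance": 0, "night_shifts": 1, "new_staff": 1}

cues = [
    lambda p: p["overdue_maintenance"],  # assumed to be the most valid cue
    lambda p: p["night_shifts"],
    lambda p: p["new_staff"],
]

print(take_the_best(plant_x, plant_y, cues))  # -> "a": plant_x is flagged
```

Note that the judgment uses only the first cue that differs; the heuristic deliberately ignores the rest of the information rather than weighing and summing it.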

 

Interaction Helps to Process Complexity

Insights from mathematics, economics, physics, and biology show how misleading the assumption is that we could easily simplify complex situations. The Santa Fe Institute, an interdisciplinary research group in Santa Fe, New Mexico, has repeatedly found that simple rules often produce complex systems, for instance when simple calculation rules are applied iteratively.

Complexity increases through the gradual accumulation of elements and their combinations, and through added functions or subsystems that let systems break out of their performance limits. These additions can be reversed, though often only with significant effort. A further mechanism is at work as well: external entities harness simpler elements in ways that allow those elements to interact or collaborate for specific purposes. These mechanisms occur irregularly, in distinct periods, and frequently run in both directions. Complex systems thus evolve continuously, with agents adapting at the edge of chaos (SFI, 1993).

A complex system has distributed intelligence instead of a centralized control unit. This gives it the ability to process complexity. In a society driven by distributed intelligence, governing by authoritarian means is challenging. Systems are complex when they can assume several other states and when there are multiple possible solutions to a given problem. Research into these systems has shown that complexity lurks even within extremely simple systems: complex global patterns with new properties emerge from local interactions, and the interplay of a few simple rules produces complexity.
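A classic illustration of simple local rules generating complex global patterns - not from the article itself, but a standard example from complexity science - is elementary cellular automaton Rule 110. Each cell looks only at itself and its two neighbors, yet the resulting pattern is famously intricate (Rule 110 is even Turing-complete):

```python
RULE = 110  # the rule number encodes the 8-entry update table in its bits

def step(cells):
    """One synchronous update of a row of 0/1 cells (fixed 0 boundary)."""
    padded = [0] + cells + [0]
    new = []
    for i in range(1, len(padded) - 1):
        # encode the 3-cell neighborhood as a number 0..7
        neighborhood = padded[i - 1] * 4 + padded[i] * 2 + padded[i + 1]
        new.append((RULE >> neighborhood) & 1)  # look up the rule bit
    return new

# Start from a single live cell and watch structure emerge.
row = [0] * 30 + [1] + [0] * 30
for _ in range(15):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```

There is no central controller and no rule "about" the global pattern; the structure emerges entirely from local interactions, which is exactly the point the Santa Fe research makes.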

 

Interaction Instead of Causality

Complexity means that different things happen simultaneously, connected not by causality but by interaction processes. Causality means: this happens now, and it is a cause of what happens next. Interaction means: something happens, and the forces involved enable or disable each other (Nassehi, 2017b). What happens there can only be described in scenarios, not in "causes". So when we create risk scenarios, none of them will happen exactly as we described them - that would contradict interaction thinking. But this does not make scenario analysis less valuable.

Armin Nassehi gave a nice illustration at the 2017 Paradox Conference:

“When an airplane falls from the sky, figuring out why it actually crashed allows you to retrospectively establish clear causal chains: it was the malfunction of the coffee machine's safety mechanism that didn't work, causing a fire, and so on. Then one can imagine such events. But you cannot predict in advance that the coffee machine's safety mechanism will fail and result in an airplane crash. Politically, one usually acts by saying: “we must absolutely include checking the coffee machine's safety in the plane’s checklist, but since that takes too long, we'll remove something else based on what the wage negotiations have suggested”, and then it crashes.”

We have to keep in mind that, in complex systems, every perspective is selective. One focuses on what one wants to see, is able to see, and is conditioned to see. We see this a lot in incident investigation: we select one perspective - usually a "root cause" - in order to bring our analysis to an end. To deal with complexity, we must move from causality thinking to interaction thinking. We can do this by creating scenarios to explore potential outcomes when parameters change.

 

Methods

Because irregular and rare events are hard to predict and assess, models are needed that account for interactions and variability in normal performance (Hollnagel, 2008). There are specific methods to support this kind of interaction and scenario thinking:

- Nancy Leveson’s STAMP, after modeling the control structure, enables us to identify unsafe control actions and loss scenarios (Leveson and Thomas, 2018).

- Erik Hollnagel’s FRAM, after modeling the possible connections between the functions in a system, allows us to analyze different scenarios or instantiations of that model, describing the relationships between functions in each scenario depending on how much performance can vary (Hollnagel et al., 2014).

In both cases, whether analyzing everyday work or potential accidents, choosing representative scenarios requires deep knowledge of the specific domain. Both models show how functions are likely to unfold in a scenario, considering their temporal and physical relations.
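The scenario-enumeration mindset behind these methods can be sketched in a toy model. This is emphatically not FRAM or STAMP themselves - the function names, the coupling rule, and the "two variable functions count as resonance" threshold are all invented for illustration - but it shows the shift from hunting one root cause to exploring how combinations of ordinary variability produce outcomes:

```python
import itertools

# Hypothetical functions in a permit-to-work process (illustrative names).
FUNCTIONS = ["prepare_permit", "isolate_energy", "execute_work"]

def outcome(timings):
    """Toy coupling rule: if two or more functions perform with variable
    timing, their variability is assumed to resonate into an unacceptable
    outcome; a single variable function is assumed to be dampened."""
    variable = sum(1 for t in timings.values() if t == "variable")
    return "resonance" if variable >= 2 else "acceptable"

# Enumerate all scenarios instead of searching for one root cause.
for combo in itertools.product(["normal", "variable"], repeat=len(FUNCTIONS)):
    timings = dict(zip(FUNCTIONS, combo))
    print(timings, "->", outcome(timings))
```

No single function "causes" the bad outcome here; it arises from combinations of everyday variability, which is the kind of insight scenario analysis is after.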

 

Conclusion

Dealing with the complexity inherent in socio-technical systems demands a departure from oversimplified approaches that categorize situations as strictly safe or unsafe. The allure of simplification, often prevalent in management practices, can lead to unanticipated complexities and ineffective solutions. Paradoxically, organizations desire simplicity while having to deal with complex adaptive systems.

Deregulation, aimed at giving organizations autonomy, paradoxically results in increased documentation and standardization due to auditability concerns and managerial insecurities. The complexity dilemma arises as our systems evolve, requiring an understanding that goes beyond simple dichotomies like safe/unsafe or root cause analysis. Predetermined frameworks may work in straightforward situations but fall short in the complex conundrum of people, technology, environment, and organizational dynamics.

Acknowledging the complexity of these systems and working in interdisciplinary teams is necessary. Insights from various disciplines, such as mathematics, economics, physics, and biology, emphasize that simple rules can give rise to complex systems. The challenge lies not just in the complexity itself but in recognizing and understanding it. The belief in simplification overlooks the fact that complex systems evolve through interactions, not linear causality.

Interaction, rather than causality, characterizes complex systems, necessitating a shift from root cause thinking to scenario-based analysis. Specific methods, like Nancy Leveson's STAMP and Erik Hollnagel's FRAM, offer ways to explore potential scenarios and understand the interactions within complex systems. These methods underscore the importance of deep domain knowledge in choosing representative scenarios.

In essence, dealing with the complexity of socio-technical systems in safety management requires interdisciplinary collaboration and scenario-based thinking. Recognizing and dealing with the complex dynamics of these systems is essential for effective safety management in a world where uncertainty, rather than mere risk, prevails.

 

References

Hollnagel, E. (2008), The changing nature of risk, in: Ergonomics Australia, 22 (1-2), 33-46.

Hollnagel, E. (2012), Coping with complexity: past, present and future, in: Cognition, Technology & Work, Vol. 14, No. 3, p. 199-205.

Hollnagel, E., Hounsgaard, J., Colligan, L. (2014), FRAM – the Functional Resonance Analysis Method – a handbook for the practical use of the method, Southern Region of Denmark: Centre for Quality.

Knight, F.H. (1921), Risk, Uncertainty and Profit, Boston/New York: Houghton Mifflin Company.

Kühl, S. (2022), Der ganz formale Wahnsinn - 111 Einsichten in die Welt der Organisationen, München: Vahlen.

Leveson, N.G., Thomas, J.P. (2018), STPA Handbook, Boston: MIT. (https://psas.scripts.mit.edu/home/get_file.php?name=STPA_handbook.pdf)

Mousavi, S., Gigerenzer, G. (2014), Risk, uncertainty, and heuristics, in: Journal of Business Research, Volume 67, Issue 8, August 2014, Pages 1671-1678.

Nassehi, A. (2017), Die letzte Stunde der Wahrheit – Kritik der komplexitätsvergessenen Vernunft, Hamburg: Sven Murmann.

Nassehi, A. (2017b), Die sieben Paradoxien moderner Gesellschaften, Stuttgart: Paradox Conference.

Plsek, P.E., Greenhalgh, T. (2001), The challenge of complexity in health care, in: BMJ. 2001 Sep 15; 323(7313): 625–628.

Rae, D. (2023), personal communication.

Santa Fe Institute (1993), Annual Report on Scientific Programs – A Broad Research Program on the Sciences of Complexity, Santa Fe, NM: SFI.

Snowden, D.J., Boone, M.E. (2007), A Leader’s Framework for Decision Making, in: Harvard Business Review, November 2007.

Størkersen, K.V., Thorvaldsen, T., Kongsvik, T., Dekker, S.W.A. (2020), How deregulation can become overregulation: An empirical study into the growth of internal bureaucracy when governments take a step back, in: Safety Science, 128, http://resolver.tudelft.nl/uuid:bd69c4ad-db17-4c80-85cd-5960372bfd02