No theory forbids me to say "Ah!" or "Ugh!", but it forbids me the bogus theorization of my "Ah!" and "Ugh!" - the value judgments. - Theodor Julius Geiger (1960)

Managing the Unexpected

Introduction

High-risk systems, such as nuclear power plants, chemical plants, aircraft, and financial markets, can have widespread impacts on populations and the environment. There is an ongoing tension in industrial safety around the unpredictability of incidents, accidents, and disasters: despite efforts to imagine worst-case scenarios, the materialization of high-profile events depends on specific circumstances that are difficult to predict precisely. The effectiveness of strategies for managing the unexpected therefore depends on the specific technological, social, and historical contexts of high-risk systems. Categories such as building favorable power configurations, confronting fallible constructs, and maintaining the relation between parts and whole apply differently across safety-critical activities and their unique contexts.


Technological determinism

According to Charles Perrow, certain disasters are inherent to the technologies they accompany. The circumstances of these events exceeded the socially and technologically designed capacities meant to prevent them, reflecting limits in engineering knowledge, the complexities of human cognition, uncertainties in managerial decisions, and the influence of external factors such as civil society and the media. Perrow's thesis holds that accidents in tightly coupled, interactively complex systems are inevitable, leading to the proposal that such systems should be abandoned or made less coupled and less interactive. In this view, technology out of control produces unexpected events. The rise of new technological systems, particularly those based on artificial intelligence, may revive concerns about our ability to control them.


The Issue of Power

Charles Perrow also introduced a critical perspective focused on managerial negligence and the misuse of power by elites. He contended that disasters such as Bhopal, Chernobyl, Challenger, and Exxon Valdez were preventable and resulted from managerial negligence and production pressures. His analysis of the Challenger accident, for example, shifts the focus from technology to the elites at NASA and their proximity to political power, emphasizing organizational goals rather than technological issues. This echoes Marx's critique of capitalism and its for-profit organizations, in which economic incentives can discourage robust safety practices.

Andrew Hopkins likewise emphasizes the preventability of accidents through appropriate managerial commitment.

The environments of high-risk systems are as crucial as their internal functioning. The role of external networks, including regulators, watchdog groups, and professional associations, is essential to the ability of high-risk systems to manage the unexpected.


The Failure of Foresight

Sociologist Barry Turner focused on the failure of foresight and on fallible constructs. Turner applied a cultural interpretation to accidents and coined the concept of the incubation period, during which signals of a pending disaster accumulate but are not recognized because of prevailing assumptions. Other authors, such as Ron Westrum, Karl Weick, Diane Vaughan, and John Downer, also emphasize the role of anomalies, hidden events, and limited knowledge in understanding unexpected events in high-risk systems. A central question is how a diversity of actors establishes specific understandings of situations and deals with uncertainty. Learning is critical in high-risk systems because it allows adaptation to changing environments, and socio-cognitive processes that support imaginative and creative anticipation are emphasized.


The Systems Perspective

Cognitive engineer Jens Rasmussen emphasized complexity, self-organization, and emergent properties in systems. His approach incorporates cybernetic ideas and interprets the defense-in-depth engineering principle in the context of high-risk systems. Rasmussen shifts from a micro view of errors to a macro socio-technical perspective, emphasizing the degrees of freedom of individuals in complex situations, and highlights the evolutionary nature of work processes structured through on-the-job training. Accidents are inherent in systems with self-organizing features. Rasmussen's model of migration towards the boundary of acceptable performance incorporates notions of self-organization, adaptation, and degrees of freedom. The model illustrates how local variations within the work space lead actors to follow gradients of least effort while management builds up cost gradients, so that operating practices drift towards the boundary of acceptable performance. Rasmussen draws on Ross Ashby's cybernetic concept of requisite variety to think about regulating self-organizing complex systems. These systems, though causally governed by natural laws, are too complex for full prediction through functional analysis during real-life decision-making.
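Rasmussen's migration model lends itself to a toy illustration. The following sketch is not drawn from Le Coze or Rasmussen; it is a minimal simulation, written in Python, in which an operating point drifts through a one-dimensional work space under an assumed cost gradient and effort gradient, with periodic safety campaigns pushing back. All names and parameter values are invented for illustration only.

    import random

    # Toy sketch of the migration model (illustrative, not Rasmussen's formalism):
    # an operating point drifts between full safety margin (0.0) and the
    # boundary of acceptable performance (1.0).
    random.seed(1)

    position = 0.2           # assumed starting distance toward the boundary
    cost_gradient = 0.015    # hypothetical management pressure toward cost-efficiency
    effort_gradient = 0.010  # hypothetical operator pressure toward least effort
    campaign_push = 0.15     # assumed effect of a periodic safety campaign
    campaign_period = 25     # weeks between campaigns (invented value)

    for week in range(1, 201):
        # Local variation: small random exploration of the work space.
        position += random.uniform(-0.01, 0.03)
        # Both gradients push the operating point toward the boundary.
        position += cost_gradient + effort_gradient
        # Intermittent counter-gradient from safety campaigns.
        if week % campaign_period == 0:
            position = max(0.0, position - campaign_push)
        if position >= 1.0:
            print(f"Week {week}: operating point crossed the boundary of "
                  f"acceptable performance (position = {position:.2f}).")
            break
    else:
        print(f"No crossing within 200 weeks (final position = {position:.2f}).")

Under these invented parameters the routine gradients overwhelm the periodic counter-pressure, which is the pattern the migration model warns about: each local adaptation is individually reasonable, yet the system as a whole drifts toward its safety boundary.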

Rasmussen inspired other authors. Scott Snook applied Rasmussen's concepts to explore practical drift and the normality of accidents in highly reliable organizations. Erik Hollnagel introduced the principle of functional resonance, emphasizing emergence and advocating a non-reductionist approach to safety.

High-risk systems face internal and external constraints, leading to interactions and decisions by operators, engineers, managers, executives, and regulators who deal with various uncertainties (technological, social, political, and economic). Because of the complexity of interactions within socio-technical systems, predicting their behavior with certainty is challenging.

The boundary between operating safely and unsafely is ambiguous, but this is seldom publicly admitted in a world that promotes technological, managerial, and financial innovation. Keeping sight of the relation between parts and whole is one answer to the challenge Rasmussen identified. It involves understanding and managing uncertainties, adapting under various constraints, and taking into account the reality of powerful actors orienting ambitious strategies for high-risk systems. Faced with the self-organized and adaptive behaviors of individuals in high-risk systems, "having the bubble", or heedful interaction, is needed: the ability to keep a broad view of the potentially negative outcomes of interactions between various actors. Individuals at supervisory and managerial levels play a central role in managing the parts in relation to the whole.


Managing the Unexpected

Specific cognitive and social dynamics allow organizations to manage safety. Karl Weick's model of collective mindfulness is one example, emphasizing both failures and successes. To limit the possibility of failures by executives or regulators, authors like Perrow emphasize a macro, systemic view of high-risk systems. Networks of actors and organizations, including civil society, unions, legal entities, state entities, and private companies, contribute to maintaining safe practices and managing the unexpected.

Fallible (cultural) constructs can be confronted by understanding the limitations that stem from cognition, organization, and complexity, and by identifying systems that successfully manage the unexpected despite these limitations.

The historical context of high-risk activities matters: features such as standardization, open markets, financialization, information technology, and networked organizations influence the conditions under which safety management takes place.


Ref.

Le Coze, J.C. (2018). Managing the Unexpected. In: Möller, N., Hansson, S.O., Holmberg, J.E., & Rollenhagen, C. (Eds.), Handbook of Safety Principles. New York: Wiley.

Accessible online: Managing the unexpected | jean-christophe le coze - Academia.edu