No theory forbids me to say "Ah!" or "Ugh!", but it forbids me the bogus theorization of my "Ah!" and "Ugh!" - the value judgments. - Theodor Julius Geiger (1960)

Biases in safety management

Below, I summarize a paper by Reiman and Rollenhagen on biases in safety management. I think it's a great piece of work.

A human factors strategy for congruence in safety management

Underlying assumptions and theories-in-use vary widely among safety professionals (as well as safety researchers). To form more congruent conceptions among personnel, an organization needs a human factors strategy.

Human behavior is contextual and influenced by many factors, including the environment, tools and technology, and social and organizational norms. Yet people tend to attribute the mistakes and errors of others to stable personal traits rather than to these contextual factors (the fundamental attribution error).

Safety management systems built on an overly rational image of the organization can be ineffective, because they do not account for the reality of organizational life: informal practices, and constant adaptation and change in response to daily challenges. Nor do organizations learn only from failures; defining what constitutes a failure is itself a social and political process.

Uncertainty and the perils of quantification, linear causality and lack of contextualization 

Organizations have to deal with two kinds of uncertainty: uncertainty due to a lack of information (epistemic) and uncertainty due to inherent randomness in a system (aleatory). Over-quantification can lead to the dismissal of subtle but important issues that are not easily quantified, such as subjective risk perceptions and gut feelings. How uncertainties are handled is closely tied to the development of expertise and to decision-making in general.

Safety science rests on certain underlying assumptions, such as quantification and linear causality, which may not accurately represent complex sociotechnical systems. It needs to take into account the reciprocal causality between technology and the human elements of the system, and to consider the subjective meanings that personnel attach to their work. The challenge is to turn data into reliable and robust solutions.

Lack of contextualization in safety models is a problem, but so is its opposite: safety domains such as occupational safety and patient safety can become so context-dependent that finding common features among these different "safeties" becomes difficult. Relying on experience and personal preferences grounded in previous success is also a drawback, as experience can narrow one's perspective and create vulnerability to overgeneralization. Mistaking safety for the mere absence of negative events can lead to taking past success as a guarantee of future safety. Safety management systems therefore need to be fine-grained enough to incorporate the specific hazards and tasks of each domain, while remaining general enough to integrate the various types of safety found in complex sociotechnical systems.

Source:
Reiman, T., & Rollenhagen, C. (2011). Human and organizational biases affecting the management of safety. Reliability Engineering & System Safety, 96(10), 1263-1274.

Full paper here