No theory forbids me to say "Ah!" or "Ugh!", but it forbids me the bogus theorization of my "Ah!" and "Ugh!" - the value judgments. - Theodor Julius Geiger (1960)


The concept of trust with regard to safety

One of the most-used concepts in safety management is safety culture. It came to light after the Chernobyl nuclear disaster in 1986 and was based on the idea that organizations with a strong culture do well. Westerners claimed that the Chernobyl disaster could not have happened in their countries, and – conversely – the Japanese themselves deemed the Fukushima disaster to be caused by “the ingrained conventions of Japanese culture”. The concept of safety culture is now used in all kinds of complex and high-risk sectors, usually as something that is measurable and does not change quickly. The American sociologist Ron Westrum pointed out the importance of trust for the sharing of information about safety risks. However, while trust is essential to collaboration, communication and relationships, boundless trust can prove costly in the end. Besides, the information used to substantiate our trust in an organization’s safety management – for example, certificates, expert opinions or audits – cannot tell us whether that trust is warranted for future performance.

An unintended side effect

Usually, in organizations, there is a perception that the organizational culture must give priority to safety, along with the commitment of management and all employees in the workplace. This often means that employee behaviour – individual or collective – is seen as the dominant cause of accidents. Small wonder, then, that people in these organizations prefer not to talk about their “mistakes”. Ron Westrum, the inventor of the continuum of organizational cultures, indicated that suppression is the dominant feature of a pathological culture. Westrum’s Valhalla is a mission-oriented organizational culture in which information is shared in a timely manner. Further, this information preferably provides answers that the recipient can effectively use.

Skepticism as a proposed alternative to trust

Because trust can prove costly for safety, what is needed instead is an attitude of skepticism. Skepticism should be understood here not as universal doubt, as in people who call out “fake news” all the time. By skepticism I mean an approach in which claims follow observation. This approach prevents too much trust from suddenly turning into too much distrust. Cultivating skepticism doesn’t require constant, time-consuming monitoring, only keeping an eye on things and circumstances from time to time. Skepticism combines a fundamental basic trust – large enough to allow each other the freedom to adapt – with a touch of distrust. This keeps the door open for meaningful monitoring and feedback regarding safety, instead of mere compliance with rules and procedures. Though Westrum’s emphasis on trust seems too much of a good thing, this meaningful monitoring and feedback seems to be what Westrum had in mind with his generative culture.

…a humanistic alternative

In the proposed skeptical program, something else is important: the functional confidence that people perform their duties reliably, where we see them as human beings rather than as functionaries; where we try to understand their point of view and meet them repeatedly, when and where it is important to them. The fine print of “work-as-imagined” and the messiness of normal work should be discussed, because “trust” generously overlooks them.


Postscript: Why “safety culture” fails, but still persists

Jean-Christophe Le Coze (2019) showed that safety culture is a central product of the self-reinforcing actions of the safety field, which is made up of regulators, industry sectors, consultants, academics and publishers. Safety culture has a broad appeal for managers: it gives a sense of control; it promises a solution to the problem of safety management; and it supports management’s claim that they care about the problem. As such, safety culture is applied generically, sometimes even as a certification scheme and as a basis for legal requirements.

Safety culture interventions – with their explanations at the individual or component level – are known to be unable to reliably explain whole social or system performance. The concept of safety culture keeps silent about power issues and disagreements over organizational practices, beliefs and mindsets. These kinds of interventions are fads.

According to the American sociologist Joel Best, people rarely proclaim their disappointments and forget what happened with the last institutional fad. For fear of being left behind, they don’t ask for persuasive evidence. If we want to become fad-proof, we have to deliberately check these reflexes. Like a skeptic would.

Suggested reading:


Best, J. (2006), Flavor of the Month: Why Smart People Fall for Fads, Berkeley: University of California Press.

Henriqson, E., Schuler, B., Van Winsen, R., Dekker, S.W.A. (2014), “The constitution and effects of safety culture as an object in the discourse of accident prevention: A Foucauldian approach”. Safety Science, 70: 465-476.

Le Coze, J.C. (2019), “How safety culture can make us think”. Safety Science, 118: 221-229.

Rae, A.J. & D. Provan (2020), Ep.4 What is the relationship between trust and safety?, Safety of Work podcast.

Schmid, W. (2019), “Vertrauen - Die Basis des sozialen Miteinanders”, SWR2 Wissen podcast.

Westrum, R. (2014), “The study of information flow: A personal journey”. Safety Science, 67:58-63.