No theory forbids me to say "Ah!" or "Ugh!", but it forbids me the bogus theorization of my "Ah!" and "Ugh!" - the value judgments. - Theodor Julius Geiger (1960)

Common Sense

Have you seen them lately? Posters or signs about working safely? 

“Accidents are caused by human error, there is no such thing as bad luck”

“Only constant supervision helps to stop work in case of unsafe situations.”

“If there is no safety awareness in the workplace, accidents are bound to happen.”

“Accidents are caused by neglecting to do something or by not paying close attention.”

Do those posters and slogan tiles fulfill a function? William Edwards Deming was clear about it:

“Such exhortations only create adversarial relationships.” Moral judgments of this kind often end in persistent disputes. It is understandable that the workplace does not simply accept these slogans, because, in the words of James Reason, “Human actions are almost always constrained by factors beyond an individual's immediate control.”

How about that, you say? On the surface, the statements above all seem like "common sense" statements that let you understand (un)safe work. Can you really, though? Well, no. Because our intuition, experience and acquired wisdom contain systematic errors of reasoning, "common sense" reasoning suffers from a combination of limitations that reinforce and even obscure one another. As a result, common sense is great at making sense of the world, but not necessarily at understanding it.

The American sociologist Duncan Watts (2011) describes three systematic errors of reasoning in our "common sense" statements.

Mistake 1 – Underestimation of the unconscious:

We think about why people do what they do and focus on factors we are aware of, such as incentives, motivations and beliefs. It doesn't occur to us that normal work routines are largely automatic or unconscious, so we don't factor this into our anticipation of how people will react. In addition, people adapt in the workplace, either on the basis of trained coordination or because unforeseen circumstances arise, and they make trade-offs between operational goals and safety goals. In fact, as we'll see, it's probably impossible to anticipate everything that might be relevant to a given situation. As a result, no matter how carefully we try to put ourselves in someone else's shoes, we are likely to make serious mistakes in predicting how they will behave anywhere outside the immediate here and now.

Mistake 2 – Underestimation of the group process:

Our mental model of collective behavior is even more wrong than our mental model of individual behavior. The fundamental problem is that when people come together in groups – for example on a construction site – they communicate with each other, share information, spread rumors, pass on recommendations, compare themselves with their colleagues, reward and punish each other's behavior, learn from the experience of others and generally influence each other's perspectives on what is right and wrong, correct and incorrect, useful and clumsy. These influences accumulate in unexpected ways, creating collective behavior that is "emergent" in the sense that it cannot be understood solely in terms of its individual parts. Faced with such complexity, common sense explanations instinctively fall back on the logic of individual action. Sometimes we call on fictitious concepts, such as 'the market', 'the workplace' or 'the safety culture', which we put in place of the actions and interactions of many. And sometimes we single out 'special people', such as project leaders, visionaries or influencers, to whom we attribute all power. Regardless of which trick we use, the result is that our explanations of collective behavior usually don't describe what is actually happening.

Mistake 3 – Disregard of history:

When something interesting, dramatic, or terrible happens, we instinctively look for explanations. Because we only try to explain these events afterwards, our studies focus on the scenario that happened and on the simple cause-effect relationships within it, not on the many other scenarios that could have happened. And because we only try to explain events that seem sufficiently interesting to us – significant accidents – our explanations cover only a very small part of what actually goes on. As a result, what seem to us like causal explanations are in fact just stories: descriptions of what happened that tell us little or nothing about the complex confluence of factors at work. We then discuss individual actors and separate causes in a system, which will not lead to sustainable system improvements. Bad outcomes are not the result of immoral choices, but the product of normal, locally rational interactions between people and systems that often keep control and sometimes lose it.

Common sense, according to Watts, works just like mythology: it gives us clear explanations for whatever specific circumstances the world throws at us. Common sense is functional precisely because we don't worry about whether what we think we know is really true. It is much easier to tackle a clearly defined root cause than a combination of factors and circumstances, at least in the short term. The price is that we think we've understood things that we've really just wrapped up in a plausible-sounding story. And because this illusion of understanding, in turn, undermines our motivation to treat safety issues the way we treat issues in medicine, engineering, and science, the unfortunate result is that common sense hinders our understanding of the work.

I conclude with a text that would fit on a tile: “The label ‘human error’ should be the starting point of study and research, not the end point” (Woods et al., 2010).

Sources:

Dekker, S.W.A. (2017), Rasmussen's legacy and the long arm of rational choice, Applied Ergonomics, Volume 59, Part B, March 2017, Pages 554-557.

Deming, W.E. (1986), Out of the crisis. Cambridge, Mass.: Massachusetts Institute of Technology, Center for Advanced Engineering Study.

Le Coze, J.C. (2015), Reflecting on Jens Rasmussen’s legacy: A strong program for a hard problem, Safety Science 71, pp. 123-141.

Reason, J. (1997), Managing the Risks of Organizational Accidents, Taylor & Francis Ltd.

Schröder-Hinrichs, J.U., Hollnagel, E., Baldauf, M. (2012), From Titanic to Costa Concordia—a century of lessons not learned, WMU Journal of Maritime Affairs 11 (2).

Watts, D.J. (2011), Everything Is Obvious: How Common Sense Fails Us, Sydney: Currency.

Woods, D. et al. (2010), Behind Human Error, 2nd edition, Boca Raton: CRC Press.