No theory forbids me to say "Ah!" or "Ugh!", but it forbids me the bogus theorization of my "Ah!" and "Ugh!" - the value judgments. - Theodor Julius Geiger (1960)

Horst Rittel

What’s the problem with safety in our organisations? Well, it depends whom you ask. Operations might see under-reporting, HSE weak signals. Procurement sees contracting pressure, HR psychological safety. Each definition implies a different solution path, be it training, staffing, redesign, or something else entirely. And there’s no stop rule. When is a problem fixed well enough? Usually when time or budget runs out.
 
Once easy wins from engineering such as guardrails, machine interlocks, PPE specifications and compliance checklists are in place, organisations face more stubborn questions. Questions tied to equity, authority, and competing goals, such as:
- production vs. protection,
- cost vs. care, and
- speed vs. thoroughness.

These are ill-defined, value-laden, and context-dependent. They don’t have ๐˜ด๐˜ฐ๐˜ญ๐˜ถ๐˜ต๐˜ช๐˜ฐ๐˜ฏ๐˜ด, only consequences and compromises. The moment you define the problem, you’ve already chosen the solution path. These problems are not about compliance, so we have to keep the conversation about risk alive. We deal with incomplete information, disagreement among stakeholders on values, and new consequences emerging out of any chosen action:

- Every risk reduction measure costs time, money, or output;
- Speak-up programs solve the part we can model, which can deceive others into believing the danger is gone;
- In supply chains, accountability diffuses across boundaries;
- Automation moves risks into new interfaces and decisions;
- When it comes to fatigue and well-being, organisational design meets social and economic life;
- With learning from accidents, every root cause is a political story about what counts as truth.
For these problems, the idealised cycle of setting goals, analysis, optimisation, and monitoring is often unattainable, and sometimes undesirable.
 
Here we see what German professor Horst Rittel (1930-1990) first formulated in lectures sixty years ago: wicked problems. Rittel studied physics, mathematics, and sociology, and became a design and systems thinker. He taught his students that every technical decision has social ripples. He warned that formal programs, with their targets, dashboards, and scorecards, make things look too tidy. We can build elaborate systems of indicators, but they are “surrogates for statements of desired conditions” (p. 165). And "it’s terribly difficult, if not impossible, to make these systems truly operational."
 
What Rittel called first-generation tools (think of our root cause analysis and compliance audits) help only ๐˜ข๐˜ง๐˜ต๐˜ฆ๐˜ณ a problem is tamed. But most safety challenges aren’t tame. Rittel’s second-generation design methodology placed participation and argumentation at the centre. Today we see this in learning teams and collaborative safety reviews:
๐˜›๐˜ฉ๐˜ฆ ๐˜ฑ๐˜ณ๐˜ฐ๐˜ฃ๐˜ญ๐˜ฆ๐˜ฎ ๐˜ข๐˜ฏ๐˜ฅ ๐˜ต๐˜ฉ๐˜ฆ ๐˜ด๐˜ฐ๐˜ญ๐˜ถ๐˜ต๐˜ช๐˜ฐ๐˜ฏ ๐˜ค๐˜ฐ-๐˜ฆ๐˜ฎ๐˜ฆ๐˜ณ๐˜จ๐˜ฆ ๐˜ต๐˜ฉ๐˜ณ๐˜ฐ๐˜ถ๐˜จ๐˜ฉ ๐˜ฅ๐˜ช๐˜ด๐˜ค๐˜ช๐˜ฑ๐˜ญ๐˜ช๐˜ฏ๐˜ฆ๐˜ฅ ๐˜ฅ๐˜ฆ๐˜ฃ๐˜ข๐˜ต๐˜ฆ ๐˜ข๐˜ฎ๐˜ฐ๐˜ฏ๐˜จ ๐˜ต๐˜ฉ๐˜ฐ๐˜ด๐˜ฆ ๐˜ธ๐˜ฉ๐˜ฐ ๐˜ฅ๐˜ฆ๐˜ด๐˜ช๐˜จ๐˜ฏ ๐˜ต๐˜ฉ๐˜ฆ ๐˜ธ๐˜ฐ๐˜ณ๐˜ฌ, ๐˜ฅ๐˜ฐ ๐˜ต๐˜ฉ๐˜ฆ ๐˜ธ๐˜ฐ๐˜ณ๐˜ฌ, ๐˜ข๐˜ฏ๐˜ฅ ๐˜ญ๐˜ช๐˜ท๐˜ฆ ๐˜ธ๐˜ช๐˜ต๐˜ฉ ๐˜ต๐˜ฉ๐˜ฆ ๐˜ค๐˜ฐ๐˜ฏ๐˜ด๐˜ฆ๐˜ฒ๐˜ถ๐˜ฆ๐˜ฏ๐˜ค๐˜ฆ๐˜ด.

References:
Churchman, C. W. (1967). Wicked Problems. Management Science, Vol. 14, No. 4, Application Series, pp. B141-B142.
Rittel, H. W. J., & Webber, M. M. (1973). Dilemmas in a General Theory of Planning. Policy Sciences, Vol. 4, Issue 2, pp. 155–169.
Photo of Horst Rittel by alchetron.com