Horst Rittel
What’s the problem with safety in our organisations? Well, it depends who you ask. Operations might point to under-reporting, HSE to weak signals, Procurement to contracting pressure, HR to psychological safety. Each definition implies a different solution path, be it training, staffing, redesign, or something else entirely. And there’s no stop rule: when is a problem fixed well enough? Usually when time or budget run out.
Once the easy engineering wins (guardrails, machine interlocks, PPE specifications and compliance checklists) are in place, organisations face more stubborn questions. Questions tied to equity, authority, and competing goals, such as:
- production vs. protection,
- cost vs. care, and
- speed vs. thoroughness.
These are ill-defined, value-laden, and context-dependent. They don’t have solutions, only consequences and compromises. The moment you define the problem, you’ve already chosen the solution path. These problems are not about compliance, so we have to keep the conversation about risk alive. We deal with incomplete information, disagreement among stakeholders on values, and new consequences emerging out of any chosen action:
- Every risk reduction measure costs time, money, or output;
- Speak-up programs solve only the part of the problem we can model, which can deceive us into believing the danger is gone;
- In supply-chains, accountability diffuses across boundaries;
- Automation moves risks into new interfaces and decisions;
- When it comes to fatigue and well-being, organisational design meets social and economic life;
- With learning from accidents, every root cause is a political story about what counts as truth.
For these problems, the idealised cycle of goal-setting, analysis, optimisation, and monitoring is often unattainable, and sometimes undesirable.
Here we see what the German professor Horst Rittel (1930–1990) first formulated in lectures sixty years ago: wicked problems. Rittel studied physics and mathematics, and sociology too. He became a design and systems thinker. He taught his students that every technical decision has social ripples. He warned that formal programs, with their targets, dashboards and scorecards, make things look too tidy. We can build elaborate systems of indicators, but they are “surrogates for statements of desired conditions” (page 165). And “it’s terribly difficult, if not impossible, to make these systems truly operational.”
What Rittel called first-generation tools (think of our root cause analysis and compliance audits) help only after a problem is tamed. But most safety challenges aren’t tame. Rittel’s second-generation design methodology placed participation and argumentation at the centre. Today we see this in learning teams and collaborative safety reviews:
๐๐ฉ๐ฆ ๐ฑ๐ณ๐ฐ๐ฃ๐ญ๐ฆ๐ฎ ๐ข๐ฏ๐ฅ ๐ต๐ฉ๐ฆ ๐ด๐ฐ๐ญ๐ถ๐ต๐ช๐ฐ๐ฏ ๐ค๐ฐ-๐ฆ๐ฎ๐ฆ๐ณ๐จ๐ฆ ๐ต๐ฉ๐ณ๐ฐ๐ถ๐จ๐ฉ ๐ฅ๐ช๐ด๐ค๐ช๐ฑ๐ญ๐ช๐ฏ๐ฆ๐ฅ ๐ฅ๐ฆ๐ฃ๐ข๐ต๐ฆ ๐ข๐ฎ๐ฐ๐ฏ๐จ ๐ต๐ฉ๐ฐ๐ด๐ฆ ๐ธ๐ฉ๐ฐ ๐ฅ๐ฆ๐ด๐ช๐จ๐ฏ ๐ต๐ฉ๐ฆ ๐ธ๐ฐ๐ณ๐ฌ, ๐ฅ๐ฐ ๐ต๐ฉ๐ฆ ๐ธ๐ฐ๐ณ๐ฌ, ๐ข๐ฏ๐ฅ ๐ญ๐ช๐ท๐ฆ ๐ธ๐ช๐ต๐ฉ ๐ต๐ฉ๐ฆ ๐ค๐ฐ๐ฏ๐ด๐ฆ๐ฒ๐ถ๐ฆ๐ฏ๐ค๐ฆ๐ด.
References:
Churchman, C. W. (1967). Wicked Problems. Management Science, Vol. 14, No. 4, Application Series, pp. B141–B142.
Rittel, H. W. J., & Webber, M. M. (1973). Dilemmas in a General Theory of Planning. Policy Sciences, Vol. 4, No. 2, pp. 155–169.
Photo of Horst Rittel by alchetron.com