News of damage caused by misconduct is like the seasons: it comes and goes with rhythmic regularity. When one report has reached us, we can confidently assume that the next will follow on its heels. Evidently, we humans have great difficulty doing things right over time. For many of us, Murphy's law, that only empirically proven piece of wisdom, "Anything that can go wrong will go wrong," is therefore truth and not hypothesis. And this has to do with the annoying fact that we humans are fallible.
Despite this knowledge, and despite everyone's own first-hand experience, it apparently must not be so! The outcry after damage becomes known is audible every time. And every time, the emotional agitation makes it unmistakably clear that human fallibility cannot be accepted in this particular case. In our outrage, emotions suppress our thinking, and we throw beliefs, facts, scientifically grounded knowledge, even certainties, to the wind. Anyone who has to deal with human error must therefore make a cognitive effort if they want to arrive at a good solution to the problem at all. Unfortunately, many do not manage this and lack the strength for it. Executives in particular are frequently confronted with mistakes, and many of them are not up to the task. They take the easy way out, pin the mistake on individuals, and go back to business as usual.
The reflexive call for control
It is not only the loss of reason that typifies the confrontation with failure and the damage that often accompanies it. Equally symptomatic is the reflexive call for more control and more supervision. Paradoxically, this one-dimensional reaction hardens around the same concept again and again: the assumption that more regulation could prevent future damage. Compliance, however, is a very closed concept, and in our complex world it reaches its limits. And yet we know it is one of the concepts that has helped us achieve a great deal.
We have created great systems that deliver unprecedented levels of reliability and safety. Aviation is one such system. Over the past decades, it has transported billions of people through an inherently risky activity, and serious accidents have largely disappeared from the headlines. Over the years, it has become the safest mode of transport of all. The stringent regulation of the industry has certainly contributed significantly to this. Beyond that, our experience points in two interesting and important directions for the future. First, we have reached a point where we must conclude that more regulation will bring no further gains in safety or reliability; this observation is being intensively discussed in national and international regulatory authorities. Second, we assume that a system described and 'designed' through regulation will, thanks to compliance, work exactly as it was specified and set up. But as we are now learning, this is largely a fallacy.
As the inventors or designers of our socio-technical systems, we delude ourselves if we believe we can describe them in every detail. For that, we would need competences we generally attribute to God. Rather, the people inside these systems are busy every day meaningfully supplementing and bridging everything that has not been specified. Their actions ensure that systemic inconsistencies, gaps, risks and conflicting goals are successfully mastered and overcome. The future concepts that will make our operations more reliable and safer, and our systems more resilient, will engage intensively with precisely these human strengths. In light of this knowledge, the call for more regulation whenever failure is detected somewhere belongs in the dustbin of industrial history.
We need other approaches.
If we want to escape the rhythmic regularity with which news of failures and losses reaches us, then we should look beyond compliance to the framework that our companies and organizations need in order to avert damage from themselves and from us. They all depend on learning within their systems, and for that they depend on the experience of their employees in interacting with those systems. Many damage-triggering incidents in a company originate in false incentives set by the system, which then express themselves in, or become the reason for, people's incorrect actions. And there are systemic design flaws that are directly responsible for potential or even recurring damage. In both cases, it is not easy for the employees involved to come forward and point out their own mistake or the systemic inadequacy. They run the risk that their superiors believe the system to be perfect, or that those superiors are unwilling or intellectually unable to link an employee's mistake to systemic inadequacies. Both reactions are widely known and represent an internal challenge of the first order.
But even once a company has cleared this hurdle, by sensitizing its managers to these issues and enabling them to take them into account in their work, a framework is still needed to live a mature error culture, a Just Culture. The company must be able to maintain a database listing all errors and near misses. It must be allowed to talk about them openly within the organization so that experience can be passed on meaningfully. However, this data, as recorded in the critical incident reporting systems of hospitals or the safety management systems of industry, must be protected from improper access by third parties through laws yet to be created. Likewise, the people who report inadequacies or their own errors must be protected. If these basic conditions are in place and a Just Culture has gained a foothold, corrections to the system become fine-grained, fast and effective. A Just Culture permeates the system hand in hand with the employees. It nips flare-ups in the bud and calms the waves before they cause damage. It complements the crude concept of compliance, which has all too often disappointed us and has never been able to deliver what we hoped for from it.
What we still need to become safer
After the recent accumulation of breakdowns and scandals at federal companies such as SBB, Swiss Post and Swisscom, members of the Council of States are now calling for remedial action. They demand a law to strengthen the supervision of these companies, which are majority-owned by the Confederation. In the view of the members of the Council of States on the responsible Commission for Transport and Telecommunications, parliament's influence on the failing companies must be significantly increased. The intention behind this demand is not difficult to guess: the serious shortcomings are to be reduced through increased control.
As we can see, the reflex toward more control is also at work in federal Berne. That is a pity, because Swiss politics has an old tradition of seeing itself as a framework-setter for the actors in society and the economy. The framework it builds is meant to allow companies to seek and offer the best solutions in fair competition and on their own responsibility. And when it comes to avoiding mistakes, companies no longer need directives and controls; they need laws that allow them to build and anchor error cultures. They need assurance from the legislator that they may collect and store critical data, and that this data and those who report it will be protected.
The historical reflex toward more control is understandable, but in today's world it is outdated and inappropriate. If the Swiss parliament is genuinely interested in contributing to public safety, it can start by working from a new understanding of the issue. The Swiss economy and our healthcare system urgently need laws that protect reporters and safety-relevant data. They need learning zones that are free of fear: zones where individuals can learn from their mistakes, and zones where we can work relentlessly to improve our complex socio-technical systems.