Recent blog articles have repeatedly addressed the complex work environment. Everywhere we can observe how the costs of progress present themselves to us as increasing complexity, whether in new products, services, contracts, or laws. They all emerge from an environment that is growing more complex at breathtaking speed, and sometimes they turn the complexity screw themselves. "Mastering Complexity" is becoming an aspiration, and not only for executives. Can we handle it? Well, if that is meant as a desire to regain control, the expectation is bound to end in disappointment. The answer lies more likely in clever, deliberate influence. This may take some getting used to. I am afraid an adjustment to reality is in the air here.
Complex systems have a pleasant side: they make new things possible. And they have an unpleasant side: they can lead to system failures without any individual function needing to exceed its defined performance limits. Through the phenomenon of functional resonance, an undesirable event can arise from a random overlap of tolerable performance deviations of individual functions. Randomly.
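To make this idea tangible, here is a minimal simulation sketch. The numbers, the number of functions, and the system threshold are illustrative assumptions, not taken from any real system: each function always stays within its own tolerance, yet every so often the deviations happen to line up and the system as a whole crosses a limit.

```python
import random

# Illustrative sketch: five functions, each with a performance deviation that
# always stays within its own tolerance of +/- 1.0, i.e. no single function ever "fails".
N_FUNCTIONS = 5
TOLERANCE = 1.0          # per-function limit, never exceeded below
SYSTEM_LIMIT = 3.5       # assumed threshold at which the combined deviation becomes an incident
TRIALS = 100_000

incidents = 0
for _ in range(TRIALS):
    # Each deviation is random but individually tolerable.
    deviations = [random.uniform(-TOLERANCE, TOLERANCE) for _ in range(N_FUNCTIONS)]
    # Functional resonance: occasionally the deviations happen to line up in the same direction.
    if abs(sum(deviations)) > SYSTEM_LIMIT:
        incidents += 1

print(f"Incidents without any single function out of limits: {incidents} in {TRIALS} trials")
```

Running this, a few hundred incidents typically appear in 100,000 trials, even though no function ever left its tolerance band. That, in miniature, is the unpleasant side of complexity.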
How we try to protect humans from their fallibility
Most systems are built around the human being. As a rule, humans still have an important function. Our fallibility is reduced to an acceptable level by a variety of technical and organizational means. Pilots in modern cockpits, for example, are prevented from leaving the aerodynamic envelope of the aircraft by computer-enforced envelope protection. An aerodynamic stall would not only be a frightening maneuver for the passengers; it could also cause a crash. This means nothing less than that we still let pilots steer, but only within predefined limits. Incidentally, it feels much like being prevented from crossing the street before you have looked left and right: only a marginally good experience. Organizational measures that set limits to the free actions of fallible human beings include rules, process specifications, and restrictions of competence. With increasing complexity, it is no surprise that we increasingly threaten to drown in a flood of laws and regulations. They are all expressions of the attempt to limit human fallibility to a tolerable level so as not to lose control. All of this works reasonably well. We should be satisfied with what we have achieved so far, even if it comes with the acceptance of certain risks. After all, we know today that more regulation does not mean more safety or reliability. That approach has reached its limits.
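The principle of "letting the pilot steer, but only within predefined limits" can be sketched in a few lines. This is a deliberately simplified illustration with assumed fixed limits; real flight control laws are far more sophisticated, and the numbers here are invented for the example.

```python
# Simplified sketch of the idea behind envelope protection:
# the pilot's command is honored, but only up to assumed fixed limits.
MAX_BANK_DEG = 67.0    # assumed bank angle limit
MAX_PITCH_DEG = 30.0   # assumed nose-up pitch limit
MIN_PITCH_DEG = -15.0  # assumed nose-down pitch limit

def protected_command(pitch_demand_deg: float, bank_demand_deg: float) -> tuple[float, float]:
    """Pass the pilot's demand through, clipped to the protected envelope."""
    pitch = max(MIN_PITCH_DEG, min(MAX_PITCH_DEG, pitch_demand_deg))
    bank = max(-MAX_BANK_DEG, min(MAX_BANK_DEG, bank_demand_deg))
    return pitch, bank

# The pilot asks for 45 degrees nose-up; the system grants 30.
print(protected_command(45.0, 20.0))   # -> (30.0, 20.0)
```

The pilot's intent is respected as long as it stays inside the envelope; beyond that, the system quietly takes over. It is exactly the experience of being stopped at the curb until you have looked both ways.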
Dealing honestly with error
Let us remind ourselves: the occurrence of certain risks in complex systems is due to randomness. They can only occur when several (individually tolerable) functional deviations happen to coincide. Yet as soon as a human being is involved in such an undesired event through action or omission, we tend to turn all eyes on him. Is that honest, if it turns out that he did not leave the limits of his 'given envelope'? That he acted neither intentionally nor with gross negligence? That he was involved, with an intact inner attitude, in a situation that led to an unintended incident? In most cases, we can assume, he contributed to its happening. Perhaps with a moment of carelessness which, on its own, would not have led to the undesired result. Would it be fair if this incident, which came down to chance, were to find its way into his personnel file? How fair is it when such spontaneous fallibility is used to qualify an employee or manager? How fair are the side-eyed looks from the colleagues who usually work with him? We are often merciless. And we are often indecently undifferentiated. Who makes the effort to learn about all the other influences that contributed to the event? Who even begins to do justice to the complexity? Who manages not to give a single instance of fallibility the significance of a qualifying dimension?
About good leadership
For some readers, this may sound like a plea for a general amnesty. Like an attempt to take employees and managers in complex work environments off the hook altogether. Like an argument against taking responsibility for one's own actions. Like a sign of weakness. But this image can only arise for someone who implies that the fallible person involved in the event intends not to take responsibility. That is a harsh insinuation. I know no doctors, pilots, air traffic controllers, or nuclear power plant operators who deal with responsibility in this way. And how could trust ever arise under such a mindset? By indulging, as a manager or colleague, my expectations of infallibility and meeting anyone who fails to live up to them with skepticism and accusations, regardless of what the circumstances may have contributed? That these are corrosive thoughts needs no elaboration here. They obviously speak the language of a culture of mistrust: a culture that would have a hard time ensuring reliability and safety in a high reliability organization. In the hospital, in the nuclear power plant, in the airline, or in rail operations, to name just a few.
Any manager who allows him- or herself to be tempted, or even considers it right, to use incidents of the kind described to qualify those involved should not be surprised to find that he or she lacks the qualification to lead in a high reliability organization.
The ‘honest mistake’ has no place in a personnel dossier.
If you, as a manager, manage to anchor this basic principle of Just Culture in your area of responsibility, you will be surprised at what happens. Do you have doubts? Then get in touch with me. I can show you the effects with concrete examples.