Our idea of how undesirable events involving humans occur is in urgent need of an update!

In complex systems, we tend to attach too much importance to people and their actions. In doing so, we diminish the influence that all the other functions in a system have on the outcome. At the latest when the question of a person's responsibility for an incident arises, this view becomes dishonest, because it no longer corresponds to the real situation.

Our brains were formatted in a time in which there was always a direct causal connection between cause and effect. A person's actions led to a result that could only have come about because of those actions, with emphasis on 'only' in the sense of exclusively. It was the time of simple systems: causality everywhere you looked, everything related to everything else in a comprehensible way. Good old times. Not that we have completely abandoned these cause-and-effect chains. No, they still exist (fortunately), although we really have to look for them in the modern working environment. If, on the other hand, we look around and take a closer look at the world of people at work, a completely different picture emerges. They feed inputs into systems that present them with results without them being able to understand how those results came about. They work with the outputs of black boxes without knowing their inner logic. The fact that something has been inserted between the acting human being and his output is one of the characteristics of our contemporary working world. It relativizes his causality and ought to relieve him of his full responsibility to the same extent. However, we find this very difficult.

But that is only one aspect of the new world of work. The other is the increase in complexity. Many of us, I dare say the majority, are embedded in large systems in which we are one small node in a huge network. The image of the tiny cog in the big machinery was a metaphor of industrialization, which we have long since left behind. It vividly captured the mechanical cause-and-effect chain and its causality. That was once upon a time.

Integrated into a network

In the network, however, I am only exerting influence, and I cannot say exactly what output my inputs will result in. We have largely lost the controllability of our systems and are left with influencing them. Is there anyone who can explain the cell phone to us in detail? I doubt it. Do we need such an explanation? No, because we learn to understand and operate it through different inputs (influence). Or do you still read instruction manuals? The fact that we can only influence, but no longer genuinely steer and control, also means that we are only indirectly causal and can at best bear a share of the responsibility if something undesirable happens.

In today's working world, we are caught in a web of countless functions that hold the system together and make it perform. We influence it with our actions, and we are influenced by the other functions. Each function in the system serves a purpose and is usually well described and regulated. To name a few examples: the selection of employees and managers, training, rules and regulations, contracts, management systems, technical systems, communication and, last but not least, the performance (actions) of the people in the system. For each function, the others form its environment, because they all interact with each other in the system. All of these functions have a normal, accepted variability. That is, they are allowed to vary to some degree, and they do. Technical systems, for example, have a 'Mean Time Between Failures' (MTBF), which describes the expected time between failures of a repairable system. The procedures for selecting managers have come a long way, but they give us no hundred-percent guarantee of finding the right ones. Communication in the company usually serves its purpose, and yet miscommunication and misunderstandings occur time and again. And last but not least, despite successful training and adequate experience, a mishap can happen to us humans in the system that contributes to an undesirable event. Even with an intact inner attitude, our performance varies. We are fallible, like every other function in the system.
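As an aside, the MTBF figure mentioned above is simple arithmetic. Here is a minimal sketch in Python with made-up numbers, purely to illustrate what accepted technical variability looks like:

```python
# Minimal MTBF sketch (numbers are made up, purely illustrative):
# MTBF = total operating time of a repairable system / number of failures
operating_hours = 8_760   # roughly one year of continuous operation
failures = 4              # failures observed and repaired in that period

mtbf = operating_hours / failures
print(f"MTBF: {mtbf:.0f} hours between failures")  # -> MTBF: 2190 hours between failures
```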

Functional resonance

If we look at a system in this way, we can understand that random overlaps of the fluctuations of different functions can occur and produce an incident, even though no single function exceeded the threshold of what is tolerable. Everything worked as it was supposed to work. Only the system as a whole failed. We are dealing with the phenomenon of 'functional resonance'.
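To make the idea tangible, here is a minimal Monte Carlo sketch: a toy model, not a formal functional-resonance analysis, and all the thresholds and numbers in it are assumptions chosen only for illustration. Each function varies within its own accepted band, yet every now and then the fluctuations line up and push the system as a whole past its limit.

```python
# Toy model of functional resonance (all values are illustrative assumptions):
# each function stays within its own accepted variability, but the combined
# fluctuations occasionally add up to an incident at system level.
import random

N_FUNCTIONS = 6          # e.g. training, communication, technology, management, ...
FUNCTION_LIMIT = 1.0     # each function alone never deviates beyond +/- 1.0
SYSTEM_LIMIT = 3.5       # the system tolerates a combined deviation of 3.5
TRIALS = 100_000

incidents = 0
for _ in range(TRIALS):
    # normal, accepted variability of each function, clamped to its own band
    deviations = [max(-FUNCTION_LIMIT, min(FUNCTION_LIMIT, random.gauss(0, 0.5)))
                  for _ in range(N_FUNCTIONS)]
    # no single function exceeded its limit, but the deviations can still add up
    if abs(sum(deviations)) > SYSTEM_LIMIT:
        incidents += 1

print(f"Incidents in {TRIALS} trials: {incidents} ({incidents / TRIALS:.2%}) "
      f"-- although every function stayed within its band")
```

Run it a few times: the incident count is small but stubbornly non-zero, which is exactly the uncomfortable point of functional resonance.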

In such a world, it is questionable to maintain the notion that the actor on site is always directly and fully responsible for the result the system produces. It narrows the focus to one function, the human being, and ignores all other influences. It is a reality distortion of the first order and dishonest on top of that. If managers hold up this view, it is an expression of an attitude of having nothing to do with the whole thing, possibly in order to shirk their own share of the responsibility. We have arrived in the world of cooperation and complex systems.

Conclusion

Functional resonance confronts us with the difficult-to-accept fact that incidents and accidents in complex systems can occur with unpredictable randomness. What this means for the organizations and companies that must prove themselves in high-risk environments is something I will discuss in the next blog posts. This much up front: all those who still believe that, in the modern world of work, human action on the front line follows a simple cause-and-effect chain and can be treated as the sole explanation for the outcome must accept the charge of indulging in a socially romantic, industrial-age reverie. They urgently need an update if they want to play a part in complex systems.