In the last blog, we explored a perfidious irrationality that makes it difficult for us as leaders to ensure psychological safety in the organization. Unfortunately, it is not the only one we should pay closer attention to. Another handicap nature has imposed on us hampers our efforts to embed a safety culture in the company that is worthy of the name: one not characterized by superiors refining their expectations of employee behavior and setting them out in ever more insistent appeals and ever more detailed mission statements, directives, and regulations, but one based on mutual trust, in which managers succeed in providing psychological safety as a cultural element. It is the phenomenon of hindsight and outcome bias that we need to take a closer look at. This thinking error, too, is a distortion of perception. It is related to the problem of assigning blame and thus has a direct impact on the relationship of trust. Trust, however, is an indispensable prerequisite for a promising safety culture.
The effects of the hindsight and outcome bias
First, this thinking error leads us to believe that if the outcome of an action was bad, the actions of those involved were bad as well. In such cases, we assume that wrong decisions, poor situation analysis, or missed opportunities were the reasons for the bad outcome. Hindsight bias is therefore quick to accuse, and it undermines trust. The assumption that the actors, the people, were at fault is quite often the product of the thinking error we explored in the last blog: "What you see is all there is" (the WYSIATI rule), the perceptual bias that, unasked, presents us with a coherent story even from minimal available information.
We can further observe that actions are retrospectively assessed as mistakes in light of the damage, even though the actors in the situation itself judged them to be normal, reasonable, and appropriate. In hindsight, an action is therefore often judged to have been irresponsibly risky. Take, for example, a standard low-risk surgical procedure in which complications arise and the patient dies. In retrospect, the survivors, lawyers, or judges will tend to believe that the surgery was risky from the beginning. They are convinced that the doctor should have known better.
Furthermore, this thinking error leads to a general overestimation of the probability that the incident would occur. And it makes us believe that those involved were insufficiently able to assess this probability correctly. This means nothing other than that, the moment we become aware of the damage, we believe we can make a factually correct judgment about its probability of occurrence. This is an arrogance that belittles those who were involved in the events. If we could not attribute this error of reasoning to an irrationality inherent in our nature, we would rightly have to apologize for it.
These three effects of hindsight bias undermine trust to a particular degree, because in all of them the cause of the damage is implicitly located in the person, and the blame is thus placed on him or her. Any unreflective reaction by a leader to an undesired event therefore undermines the building of a culture of trust. I will explore this issue in a forthcoming blog.
Of systemic importance, on the other hand, is the fact that under this perception bias the reasons that led to the incident are viewed uncritically and root cause identification is handled far too superficially. This is because the bias makes us believe that we knew it all along. We therefore already know the causes, and any further examination of the case seems unnecessary. Under such an interpretation of an incident, it becomes very difficult for safety experts to obtain the resources within the company that a professional investigation requires. Yet only such an investigation enables the company to actually learn.
What is to be done?
Some of these consequences of hindsight and outcome bias are difficult to bear for the persons involved in incidents. First, in combination with the WYSIATI rule, they cause judgmental superiors to fail to do justice to the people involved and the situation as they experienced it. This corrosively undermines the relationship and mutual trust. Second, they prevent managers from taking responsibility for the organization and working to improve the system, because they believe they already know the root causes. Much is gained if leaders are aware of these effects. This awareness helps them overcome the biases' negative impact and, for example, to see and understand the situation as it presented itself to those involved before events took their course.
In my experience, it is therefore important, in the context of safety culture development projects, to give managers the opportunity to take a closer look at this robust and perfidious cognitive illusion. After all, we are reluctant to give up a notion that makes us believe we can grasp the unpredictability of the world. When we have to break away from the comforting "I knew it all along," we as leaders come face to face with a loss of control. It is uncomfortable. It pays to have a coach by your side during such confrontations.
The highest and most important commandment in connection with hindsight and outcome bias is never to judge the quality of a decision by its outcome, but always by the quality of the process that led to the decision or action. Seen in this light, it would be correct for superiors to sanction managers who have made high profits by taking excessive risk and who owe those profits to nothing but luck. But because we are subject to hindsight and outcome bias, we tend instead to celebrate such managers as successful. Anyone who doubts these luck-blessed luminaries of the moment is labeled mediocre, timid, and weak in light of the success they claim for themselves. This reflection illustrates the persistence of hindsight and outcome bias. As leaders, we are challenged to counter it with consistent self-leadership. Those who set out to anchor a safety culture in the company must start with themselves.
This call is addressed not only to executives, but in particular to prosecutors and judges. They always judge in hindsight. Hindsight bias helps the prosecuting party present a coherent story, which may or may not coincide with reality. The argument "it was obvious (given the harm) that the defendant took too great a risk" works extremely well on the judging public as well as on many judges. It is good to know that a growing number of judges successfully resist this perfidious distortion of perception. What remains difficult for judges, however, is that criminal law focuses not primarily on the courses of action and decision-making processes that led to the damage, but on the magnitude of the damage. It is time for our legislators to balance these two orientations.