Recently, one of my favorite surgeon bloggers, Skeptical Scalpel, wrote about wrong-sided surgery and the recent report of two such incidents in California hospitals this past week. Apparently, a hospital spokesperson said, “We have policies in place, and training in place, but the system broke down because of the human element.” I presume this is in reference to the usual perioperative checklists, procedures (such as marking the surgical site while the patient is awake), and pre-procedural “time out” protocols intended to confirm the correct patient and procedure. As Skeptical Scalpel notes, “These were human errors and will likely happen again. The existing policies were adequate. They simply were not followed.” This raises an interesting philosophical question: if the policies can be disregarded (whether deliberately or accidentally), are they indeed adequate?
We can examine policies in terms of a taxonomy of violations developed by James Reason (famous for the “Swiss cheese” model of errors; illustration here from Duke). A violation, as Reason has defined it, is a deliberate (though not necessarily malevolent) deviation from rules, policies, or best practices. Others have expanded Reason’s descriptions of violations, and several taxonomies now exist. Though the labels vary, violations fall into several varieties:
- Unintentional violation: The rules are not well disseminated; the act is deliberate, but the actor is not willfully defying a known rule
- Exceptional violation: A novel or urgent situation has no well-prescribed set of rules, so improvisation is needed: “I didn’t know what to do, so I just did my best under the circumstances”
- Routine violation: Common and condoned within the organization: “Everyone does it this way”
- Situational violation: “It is literally impossible to get the job done otherwise”
- Optimizing violation: “I thought it was better/faster/less expensive to do it this way”
- Reckless violation: “I don’t care about the rule. I’m so great – it doesn’t apply to me”
The first five are opportunities for reflection, because these violations may represent areas for actual improvement. They offer potential systems changes that can improve the safety culture as well as efficiency and other processes. The last, however, requires changes in personal motivation, which can be particularly difficult to effect. Is the blatant disregard of a surgical “time out” verification likely unintentional or exceptional? I doubt it. If it is routine, that is a sad commentary on the institutional culture. I cannot imagine a circumstance in which it is situational – if the surgeon plans to operate, he or she can also plan to participate in the safety process. Could it be optimizing or reckless? Either is plausible, and the best remedies for each are distinct.
Reason says that humans do not plan and execute actions in isolation, but within a regulated social milieu – in this case, within the safety culture, or lack thereof, of the operating room. While errors may be defined in relation to the cognitive processes of the individual, violations can only be described with regard to behavior within a social context. This can be interpreted to mean that policies cannot be considered adequate without the firm buy-in of those who are expected to carry them out. A policy embedded in a culture and a system that tolerate deviation from it is, by definition, not adequate. The training is not adequate. The leadership is not adequate. The followership is not adequate. At every level, a safety culture fails to exist if violations are tolerated or even tacitly rewarded (as is often the case with “cutting corners” for improved speed or reduced cost).
Another reason that violations are so important to consider and study is that they represent the “tip of the iceberg” for safety innovations. According to a 2009 review of safety violations, not all violations lead to unwanted outcomes such as accidents and injuries. One example cited is that, among industrial accidents, there are only 29 minor injuries and 1 major injury for every 300 accidents that result in no injury at all. Therefore most errors and violations go unnoticed or are even condoned. When behaviors and decisions that have the potential to be catastrophic (but usually are not) become normalized in the culture, it is called “normalization of deviance.” Closely related is a phenomenon called “outcome bias”: the tendency to judge a decision or behavior by its eventual outcome rather than by the quality of the decision itself. It is common in medicine to focus on outcomes as a quality measure, but outcomes are surely not the only important measure. There will always be some number of adverse outcomes even when no error or violation is made, and catastrophic events will always be statistically rare. In medicine, we account for statistical rarity in evidence-based decision making, evaluating the “number needed to treat” before declaring a benefit measurable, and using other similar tools. However, individual decisions are not always conducive to that kind of evaluation.
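As an aside, the “number needed to treat” arithmetic is simple: it is the reciprocal of the absolute risk reduction. The short sketch below uses purely hypothetical event rates (not figures from any study) just to illustrate the calculation.

```python
# A minimal sketch of the "number needed to treat" (NNT) arithmetic.
# The event rates used here are hypothetical, for illustration only.

def number_needed_to_treat(control_event_rate: float, treated_event_rate: float) -> float:
    """NNT = 1 / absolute risk reduction."""
    absolute_risk_reduction = control_event_rate - treated_event_rate
    if absolute_risk_reduction <= 0:
        raise ValueError("No measurable benefit: treated rate is not lower than control rate.")
    return 1 / absolute_risk_reduction

# Hypothetical example: adverse outcome in 10% of untreated patients vs. 8% of treated patients.
print(number_needed_to_treat(0.10, 0.08))  # 50 patients must be treated to prevent one adverse outcome
```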
A violation cannot be judged a good decision simply because it does not always (or even often) result in an adverse outcome. A clear example is drunk driving (or, to use a more recent media emphasis, texting while driving). It has been estimated that by the time a drunk driver is arrested, he or she has “successfully” driven drunk 80 times previously. And arrests for drunk driving are more common than catastrophic accidents involving drunk driving. Therefore, if we relied on outcome bias and statistics alone, drunk driving would seem like a pretty safe thing to do. But because the potential for an adverse outcome is both real and grave, we do not consider it a good decision.
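A rough calculation shows why judging by outcomes alone is misleading here. Assuming a purely hypothetical per-episode probability of a catastrophic crash (the 0.5% figure below is illustrative, not a measured risk), the chance of at least one catastrophe compounds quickly across 80 episodes:

```python
# Illustrative only: how a small per-episode risk compounds over repeated episodes.
per_episode_catastrophe_risk = 0.005  # assumed, hypothetical 0.5% chance of a catastrophic crash per episode
episodes = 80                         # estimated drunk-driving episodes before an arrest

# Probability of at least one catastrophe across all episodes: 1 - (1 - p)^n
cumulative_risk = 1 - (1 - per_episode_catastrophe_risk) ** episodes
print(f"{cumulative_risk:.0%}")  # roughly 33% over 80 episodes
```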
Merely publishing policies that are not enforced, or that lack buy-in from those expected to implement them, does not make them adequate policies. Healthcare organizations need to understand the psychology of decision behavior and target safety violations appropriately if we hope to actually eliminate “never events” like wrong-sided surgery.