Missing the obvious
In an attempt to restore confidence after a flurry of blunders, the Hospital Authority is reportedly planning to establish a 'court of discipline and professional competency' for proceedings against staff involved in serious adverse events. Coupled with management exhortations for improved teamwork and the prospect of 'safety police' patrolling the wards, the public may be forgiven for assuming that delinquency is widespread among hospital staff.
From a judgmental perspective, it can be difficult to understand why ostensibly straightforward tasks are bungled with such regularity by supposedly well-trained individuals. But this view is strongly prejudiced by our societal concept of sentient autonomy, which attributes erroneous actions to poor choices and inapt risk-taking. This is fundamentally unfair to the hospital frontline workforce who are overwhelmingly dedicated and conscientious.
The psychology of cognition provides a more complete (but uncomfortable) explanation: error-proneness is itself a by-product of expertise. This paradox arises because proficiency is attained by attentional filtering and the formation of mental models. Skilled behaviour becomes increasingly automatic and less under conscious control as rules develop (based on expectations) which allow for fast and accurate predictions and correct performance most of the time.
However, these schemata are subject to subliminal biases of confirmation (selectively seeking data which support a preconceived view) and cue generalisation (using broader, rather than more detailed, characteristics for inference) which can lead to failure of the planned task in some settings. A key concept here is looking but not seeing ('inattentional blindness'), which is a seemingly inexplicable failure to apprehend what should be obvious. A classic demonstration of this phenomenon is one in which observers tasked with counting basketball passes often failed to perceive a gorilla walking across the court.
Perceptual failures are a common feature of mistakes in medical care, arising from a mental trade-off between attention allocation and expectation, and typically occurring during checking, identification and administration procedures. Studies show that errors of this type materialise in around 1 in 300 steps or characters, and the risk is increased by stress, work overload, distraction, tiredness and 'look-alike, sound-alike qualities' (of drugs, tests and patients). In general, the greater the expertise, the more automatic the performance becomes, which is why the best people can sometimes make the worst mistakes.
These types of unintended slips should be distinguished from violations, which are deviations from safe operating practices (for example, failing to use a mandatory checklist). Deliberate breaches generally reflect lack of individual motivation in the context of a permissive organisational culture. In such instances, censure is justified, but remediation will necessitate a change in hospital governance.
What, then, can be done to limit medical blunders? We can start by accepting that human fallibility is the least manageable aspect. This may be difficult to accept because of the potentially devastating consequences of these mistakes, but the errors in themselves are cognitively quite ordinary.
A more fitting target for preventing errors is to modify a system's design - for example, improving the quality and delivery of information in the workplace. Authentication and identification-assistance tools such as bar-coding and radio-frequency tagging provide for highly reliable data capture and information support.
However, a human line of defence is still important and requires emphasis, since these technologies may introduce new hazards that can easily escape detection if staff reduce their vigilance in deference to the computerised system.
What can the health care consumer do to reduce their own risk? Wherever possible, patients should be active, involved and informed participants on their health care team.
The 'Speak Up' programme of The Joint Commission (which accredits health care organisations internationally) provides a useful framework: speak up if you have questions or concerns; pay attention to the care you are receiving; educate yourself about your diagnosis, tests and treatment; ask a family member or friend to be your advocate; know which medications you are taking and those to which you are allergic; use a hospital that has undergone a rigorous on-site evaluation of safety standards; and participate in all decisions about your treatment.
There is no simple solution to tackling human error in health care. Regimented orders for more careful conduct are bound to fail, since errant staff are usually unaware that they were unaware. Innovative strategies, akin to guerrilla warfare designed to ambush vulnerabilities in organisational defences, are more likely to reduce the number of unnoticed gorillas in our midst.
Darren Mann is a clinical associate professor (honorary) at the Chinese University of Hong Kong and examiner in surgery of the Royal College of Surgeons of Edinburgh