
Earlier this year, one of Singapore's F-16 fighter jets crashed soon after take-off. The pilot had to eject because faulty pitch rate gyroscopes were feeding him wrong information, resulting in an "uncontrollable" aircraft. A Singapore Airlines (SIA) flight from London to Singapore encountered turbulence so bad that it shook the whole world. The pilots could have called for an emergency landing at Yangon, but decided to go with Bangkok as SIA had a bigger station in the Thai capital and the medical facilities there could cope better with the emergency. Ironically, despite the pilots' decisive response, everyone including our newly-minted prime minister called for a "thorough investigation", almost overlooking the pilots' brilliant performance in the face of sudden misfortune.
On a separate but eventually related note, the World Health Organization (WHO) has begun a greater push towards incorporating human factors and systems thinking in patient safety. Its Global Patient Safety Action Plan 2021-2030 explicitly highlights the need to adopt the human factors and systems approach. Even this year's World Patient Safety Day theme and message emphasized the importance of looking at the cognitive and systemic factors surrounding diagnostic safety. I have had the privilege of being actively involved in WHO's efforts to promote patient safety and healthcare human factors this year.
Diagnosing Diagnostic Errors
As the obscure safety message goes: there's more than one way to get skinned by a cat. Biases quickly come to mind when discussing fallibilities in clinician decision-making. Indeed, we still suffer from availability bias, wherein a colleague's cough and cold symptoms are swiftly attributed to COVID-19. Research into such biases earned the late Daniel Kahneman (the Princeton professor who wrote Thinking, Fast and Slow; he died in March this year) a Nobel Prize in economics. Yet while it remains important to be aware that we are prone to biases, mastering all 150+ of them will not make us immune, a phenomenon known as the GI Joe Fallacy. Knowing is less than half the battle.
How then might we win the battle and achieve diagnostic safety supremacy? Many booby traps appear both as individual weaknesses and as bad designs in the system. We may miss critical clues because we are too rushed or fatigued to pay sufficient attention to them. Details may not be presented in an obvious, salient manner. Overstretched job demands and poor teamwork among colleagues further confound our ability to perform, because clinical diagnosis is never a one-person sport. Among the many creative ways we might be ensnared during care delivery, one fundamental characteristic holds true: the quality of information affects the quality of our decisions. Bad information is worse than no information.
Bad information on TikTok has led people to wrongly diagnose themselves as autistic instead of seeking professional advice. Professionals can get it wrong too. As many as 95% of penicillin allergy labels (e.g. to amoxicillin) in the US are wrong: a rash caused by a secondary viral infection gets attributed to an allergic reaction to the antibiotics given for a bacterial ear infection, for example. In another case, a wrong initial assessment, perpetuated by the care team's anchoring bias and framing effect, saw a patient admitted for heart failure who ultimately died of tuberculosis. Unlike no information, bad information can have us believing we are on the right track while going down the wrong rabbit hole, not realizing it until it is too late.
The Nature of Making Medical Diagnoses
And because healthcare is a complex team sport, everyone is the proverbial NASA janitor whose job was to help put a man on the moon. Diagnostic safety means each and every one of us has a role to play in ensuring good quality information is available, so that the best clinical decisions can be made for our patients. It could be as basic as ensuring the correct patient receives the correct procedure, or as technical as keeping medical equipment well-maintained and communicating professional insights among colleagues. Even patients and their caregivers are diagnostic safety stakeholders: they elaborate on symptoms and other details, and point out errors and blind spots along the care journey.

This leaves physicians as the star quarterbacks of the team, and we think their goal is to achieve 100% clarity on whatever is plaguing us as patients. In reality, humans don't come with a user manual of error codes, and doctors are dancing with uncertainty all the time. Experts earn their honors not by being spot-on, but by coming up with a workable course of action within a short amount of time in the face of complex or novel situations. You could say the pilots of that fateful flight SQ321 were the best in their business: they began their descent into Bangkok within 17 minutes of the turbulence event, and the plane was on the ground in less than an hour, with medical services waiting to receive them.
Nonetheless, Daniel Kahneman wanted you to know that the danger of intuition is a premature decision. A urologist and an orthopedic surgeon will perceive your back pain very differently. Delay intuition. Be systematic about collecting data.
And then there are algorithms and automation, which warrant another commentary for another time.
As you celebrate patient safety later this year, remind yourself not to be the problematic pitch rate gyroscope in a complex and interconnected machine. Sometimes it takes just one component failing along the care journey for harm to occur. You have a role to play in ensuring accurate and reliable information for everyone.