By Binseng Wang

In 2008, safety champion John Nance published an award-winning book, Why Hospitals Should Fly: The Ultimate Flight Plan to Patient Safety and Quality Care, which encouraged hospitals to adopt aviation safety methods to reduce patient harm. Unfortunately, if hospitals were to fly as he recommended, the United States would lose all of its physicians, surgeons, and nurses in about 10 years.

The dire prediction above is based on the following calculation: According to the Institute of Medicine’s 1999 report “To Err Is Human: Building a Safer Health System,” at least 44,000—and perhaps as many as 98,000—patients die in hospitals each year as a result of medical errors. For reference: A typical mid-range commercial airliner carries two pilots and five flight attendants, all of whom, along with more than 100 passengers, would lose their lives in a crash.

If each patient death takes two doctors and five nurses along, a simple computation will show that the roughly 780,000 doctors and 2.8 million nurses that we currently have will be exhausted in nine to 13 years if the 44,000/year estimate holds; and if the higher estimate is accurate, it will only take four to six years. Even if we educate and train clinicians rapidly, it will be impossible to replace them fast enough.
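
For readers who want to check the math, here is the back-of-envelope calculation, a rough sketch using the approximate workforce totals cited above:

```python
# Back-of-envelope check of the "if hospitals flew" thought experiment.
# Assumes each patient death "takes along" two doctors and five nurses,
# mirroring a typical airline crew; workforce totals are rough U.S. figures.
doctors, nurses = 780_000, 2_800_000

for deaths_per_year in (44_000, 98_000):  # IOM's low and high estimates
    doctor_years = doctors / (2 * deaths_per_year)
    nurse_years = nurses / (5 * deaths_per_year)
    print(f"{deaths_per_year:,} deaths/yr: doctors gone in ~{doctor_years:.0f} years, "
          f"nurses in ~{nurse_years:.0f} years")
```

Running it reproduces the figures in the text: roughly nine and 13 years at the low estimate, and roughly four and six years at the high one.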

Giving Hospitals Wings

Obviously, this prediction is based on misleading assumptions, but it illustrates the fundamental differences between aviation and health care that several health safety experts (a group I don’t claim to belong to) have pointed out. The difference alluded to in the computation above is that the crew is acutely aware that if something is wrong, or if they make serious mistakes, they will pay the ultimate price themselves.

So it’s not surprising that pilots check their planes thoroughly before flying and refuse to take off if they aren’t fully assured. Don’t get me wrong: I’m not accusing clinicians of being irresponsible, but you can be sure that they would be even more careful if their own lives or well-being were in jeopardy. It’s just human nature.

Another fundamental difference resides in the clientele: Inside an airplane, passengers are typically more likely to follow orders from the crew. Even the arrogant surgeon who refuses to turn off his/her cellphone in the surgical suite is more likely to do so in a plane, knowing that he/she may go down with it or be escorted out. The same goes for patients and visitors who decide to smoke inside the hospital, even while receiving oxygen therapy, but wouldn’t dare to light up in an airplane. Again, it’s human nature.

Now, let’s change our focus to technology. It’s inconceivable that a pilot would fly an airliner in which each cockpit instrument and control system is made by a different manufacturer and doesn’t communicate well with the others. But that’s exactly what’s going on in surgical suites and intensive care units.

To make matters worse, each medical equipment manufacturer creates its own set of alarms with the signals, sounds, and thresholds it believes are most “appropriate” (i.e., considered the most defensible by its litigation lawyers). If I were the pilot, I would eject promptly if multiple instrument alarms activated simultaneously without hierarchy or justifiable reason. So much for human-factors engineering.

Speaking of technology, let’s examine our favorite subject: maintenance. Aircraft mechanics don’t fly with the planes they service and thus could be less concerned about service-induced failures that might physically harm them; even so, the Federal Aviation Administration (FAA) tightly regulates them. In health care, however, anyone can call him/herself a “BMET” and even service ventilators and hemodialysis equipment without being required to undergo specialized education or demonstrate even minimal competency. Contrast this with barbers and beauticians, who are licensed by the states yet are much less likely to cause serious injury or death if they make mistakes. It’s beyond me why we are still resisting licensure while lamenting the lack of professional recognition.

Then there’s the fundamental difference in health care that some safety experts from industrial backgrounds seem to overlook: Airliners and nuclear reactors are designed and manufactured by mankind, using scientific and engineering knowledge accumulated over many centuries. On the other hand, our knowledge of human beings—and other living matter—is quite limited, despite all the advances humans have made in the last several thousand years. It’s pretentious to say that we know what each health technology—including drugs, devices, procedures, etc.—does to the body in “normal” conditions, much less in pathological ones.

To expect clinicians to be able to use and control the technology available to them like airplane pilots is unrealistic, if not arrogant. All the stakeholders involved—including manufacturers, regulatory agencies, clinicians, and clinical engineering professionals—must admit that they have imperfect knowledge and need to learn from mistakes. As Alexander Pope once said, “To err is human; to forgive, divine.” To that I would add: “To learn from one’s own mistakes is smart; to learn from those made by others, sublime.”

Finally, as we all know, health care is regulated by several disconnected federal and state agencies—including, but not limited to, the Centers for Medicare & Medicaid Services, the FDA, and state licensing agencies—while aviation is the sole domain of the FAA. Also, only one agency—the National Transportation Safety Board—investigates all aviation accidents, while medical errors are not always properly investigated and reported by every health care organization.

Still, these differences don’t mean that health care can’t learn from aviation and other industrial-safety disciplines. Quite the opposite: Tools such as surgical checklists, medical simulation, failure mode and effect analysis (FMEA), and immunity for incident reporters have all proven to be very effective. However, health care organizations have learned that it’s not enough to hire retired airline pilots to teach safety to clinicians. Instead, to improve health care safety, they must create a multidisciplinary team that includes pilots, clinicians, and other support service professionals.

Total Productive Maintenance

The discussion above is not mere academic discourse; it demonstrates an important lesson we must learn when evaluating equipment maintenance and management techniques developed in other industries for use on medical equipment. One such method is total productive maintenance (TPM), pioneered by Seiichi Nakajima and widely adopted in Japan and other countries for physical plant and production equipment. TPM’s goal is to eliminate equipment failures, product defects, and other equipment-related manufacturing issues via a company-wide program, instead of relying on a separate maintenance team. When fully implemented, TPM can allegedly eliminate downtime altogether.
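
For reference, TPM programs typically track their progress with the overall equipment effectiveness (OEE) metric, the product of availability, performance, and quality. A minimal sketch, with made-up numbers:

```python
# Minimal OEE (overall equipment effectiveness) sketch, the headline metric
# of most TPM programs. All numbers here are made up for illustration.
availability = 0.90   # operating time / scheduled time
performance = 0.95    # actual output rate / ideal output rate
quality = 0.99        # good units / total units produced

oee = availability * performance * quality
print(f"OEE = {oee:.1%}")  # -> 84.6%; TPM literature often cites ~85% as world-class
```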

Most factories that adopt TPM train and require equipment operators to perform their own maintenance (aka “autonomous maintenance”). One example of TPM implementation in health care is Japan’s clinical engineering (CE) practice. Instead of focusing on maintenance and management, Japanese CE professionals actually spend most of their time operating high-risk equipment, such as ventilators and hemodialysis machines, in addition to servicing them.

Unfortunately, TPM has not caught on in American factories due to cultural differences and lack of management buy-in. It’s even less likely to succeed in health care, as CE professionals would have to displace licensed respiratory therapists and other technologists. The other alternative is to train and order nurses to maintain medical equipment. Good luck to any CE professional who wants to try that in the States!

Reliability-Centered Maintenance

Another maintenance method that has gained much attention since CMS mentioned it in its Survey and Certification (S&C) Letters is reliability-centered maintenance (RCM), which grew out of aviation. As I explained in the May installment of CE Perspectives, the cornerstone of RCM is FMEA. Once failure modes are determined, a decision process is used to establish the best maintenance strategy for each, ranging from preventive maintenance to run-to-failure and, if needed, redesign.
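
To make that decision process concrete, here is a minimal sketch; the failure-mode attributes and rules below are hypothetical illustrations, not an actual RCM standard:

```python
# Illustrative RCM-style decision logic: map a failure mode to a maintenance
# strategy. Attributes and rules are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    safety_critical: bool   # could the failure harm a patient or operator?
    predictable: bool       # does the failure develop gradually (e.g., wear)?
    pm_effective: bool      # does scheduled maintenance actually reduce it?

def choose_strategy(fm: FailureMode) -> str:
    if fm.predictable and fm.pm_effective:
        return "preventive maintenance"   # scheduled inspection/replacement
    if not fm.safety_critical:
        return "run to failure"           # cheaper to repair after it breaks
    return "redesign"                     # PM won't mitigate a critical mode

print(choose_strategy(FailureMode("battery wear-out", True, True, True)))
# -> preventive maintenance
```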

Unfortunately, due to the progressive incorporation of software into medical equipment and manufacturers’ desire to protect their intellectual property, it has become impossible to determine failure modes. Furthermore, FDA regulations strictly prohibit redesign by non-manufacturers.

Thus, it seems ironic that instead of learning from our industrial colleagues, CE professionals have to learn from our clinical colleagues to find ways to improve equipment maintenance. The evidence-based maintenance method that I’ve been championing is nothing but an adaptation of evidence-based medicine principles, with equipment being our “patient.”
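
As a rough illustration of the idea (the work-order fields and failure categories below are my own hypothetical examples):

```python
# Hypothetical evidence-based maintenance sketch: treat service history as
# the "clinical evidence" for questioning a PM interval. Fields and failure
# categories are illustrative only.
work_orders = [
    {"device": "infusion pump", "cause": "use error"},
    {"device": "infusion pump", "cause": "pm-preventable wear"},
    {"device": "infusion pump", "cause": "random component failure"},
    {"device": "infusion pump", "cause": "use error"},
]

preventable = sum(wo["cause"] == "pm-preventable wear" for wo in work_orders)
print(f"PM-preventable share: {preventable / len(work_orders):.0%}")  # -> 25%
# If the evidence shows few failures are PM-preventable, the PM interval
# (or the PM itself) deserves re-examination.
```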

As long as we are cognizant of the differences between our “patients” and the real patients, as well as the rules dictated by regulatory agencies, we should—and can—find the most cost-effective way to keep equipment safe and performing to specifications. After all, we may one day be “flying on the plane that we serviced”—aka: seeing the equipment we serviced being used on ourselves.

Binseng Wang, ScD, CCE, fAIMBE, fACCE, is director, quality and regulatory affairs, with Greenwood Marketing LLC. The views expressed in this article are solely those of the author. For more information, contact chief editor Keri Forsythe-Stephens at [email protected].

References and Footnotes

  1. https://www.ache.org/pubs/Releases/2009/Hamilton_09_release_FINAL.pdf. Accessed 7/4/2016.
  2. Nance J. Why Hospitals Should Fly: The Ultimate Flight Plan to Patient Safety and Quality Care. Second River Healthcare Press, Bozeman, MT, 2008.
  3. Institute of Medicine. To Err Is Human: Building a Safer Health System. Kohn LT, Corrigan JM, Donaldson MS (eds). National Academies Press, Washington, DC, 2000.
  4. Kapur N, Parand A, Soukup T, Reader T, Sevdalis N. Aviation and healthcare: a comparative review with implications for patient safety. JRSM Open, 7:2054270415616548, 2015. Available at http://shr.sagepub.com/content/7/1/2054270415616548.full. Accessed 7/4/2016.
  5. Murphy K. “What Pilots Can Teach Hospitals About Patient Safety.” The New York Times, 10/31/2006. Available at http://www.nytimes.com/2006/10/31/health/31safe.html?_r=0. Accessed 7/4/2016.
  6. See, e.g., Venkatesh J. “An Introduction to Total Productive Maintenance.” Available at http://www.plant-maintenance.com/articles/tpm_intro.shtml. Accessed 7/4/2016.
  7. See, e.g., Atarashi H, Ide H, Koike S. Clinical engineers increasingly appointed as medical equipment safety managers in Japan. J Clin Eng, 41:127-133, July 2016.
  8. Centers for Medicare & Medicaid Services. Memorandum S&C 12-07-Hospital, issued 12/2/2011. Available at https://www.cms.gov/Medicare/Provider-Enrollment-and-Certification/SurveyCertificationGenInfo/downloads/SCLetter12_07.pdf. Memorandum S&C 14-07-Hospital, issued 12/20/2013. Available at https://www.cms.gov/Medicare/Provider-Enrollment-and-Certification/SurveyCertificationGenInfo/Downloads/Survey-and-Cert-Letter-14-07.pdf. Accessed 7/4/2016.
  9. Wang B. “Evidence-based Maintenance Is CE’s Moonshot.” 24×7 magazine, April 8, 2016. Available at https://24x7mag.com/2016/04/evidence-based-maintenance-ces-moonshot/. Accessed 7/4/2016.
  10. Wang B. “Evidence-based Maintenance?” 24×7 magazine, p 56, April 2007.