Why HTM professionals must learn from past failures and adapt to modern challenges to ensure patient safety.

By Rick Schrenker 

I was recently reminded of my pre-retirement risk management work and how the writings of Henry Petroski and others influenced it. One observation in particular has stayed with me: “Things work because they work in a particular configuration, at a particular scale, and in a particular context and culture.”

Design Changes Must Be Analyzed for New Failure Modes

Construction of the Francis Scott Key Bridge commenced the same year as my Baltimore high school commencement: 1972.  I had attended what would today be called a STEM school. In my sophomore year, our class was the first to take a “computer math” course. We did some FORTRAN programming using punch cards. Surely, reader, you remember using them.

“Any design change…can introduce new failure modes or bring into play latent failure modes. Thus, it follows that any design change, no matter how seemingly benign or beneficial, must be analyzed with the objectives of the original design in mind.”

While the design of the Key Bridge did not change over its almost 50 years of service, the type of container ship that caused the collapse of the bridge did not exist when the bridge was conceived. Time will tell how inadequate the bridge’s 20th-century, standards-compliant safeguards were at deflecting and absorbing the impact of an out-of-control 21st-century ship.

But what does the Key Bridge disaster have to do with medical technology?

Old Rules Challenge Modern Medical Technology

Consider this: The law that continues to serve as the foundation for the regulation of medical devices in the United States, the Medical Device Amendments to the Federal Food, Drug, and Cosmetic Act of 1938, dates to 1976.

The age of the Medical Device Amendments is relevant because devices granted 510(k) clearance since 1976 must be substantially equivalent to a legally marketed predicate device. The initial set of predicate devices, known as preamendment devices, were commercially available before 1976.

(In my snarkier moments, I’ve been known to argue that if applied to NASA, this would support a claim that a Space Shuttle is substantially equivalent to a Wright Flyer.)

Now consider this observation from a 2006 article in IEEE Computer concerning issues that arose from the simultaneous presence of multiple generations of IT in a healthcare environment:

“… We observe that many clinical systems currently in use were created prior to the recent, dramatic changes in healthcare delivery. Integrated health networks with more complex workflows and a greater need for seamless movement of patient data on demand, anywhere within the network, have for the most part replaced free-standing hospitals, clinics, and group practices. Retrofitting yesterday’s systems to meet today’s needs can only result in a “solution” that falls short … Software engineers have long known that extensive retrofitting causes software to age very rapidly. Considering what we do know about building complex software systems and in the light of these dramatic changes in the industry, it is unfortunate that the prevailing sentiment among healthcare professionals seems to be that legacy information systems, their developers, and their vendors are failing to meet the needs of physicians and hospitals.”

But that was years ago too, right? These kinds of simultaneous old and new mixes of technology no longer occur, right?

Boeing 737 MAX Crashes Linked to Design Changes

Consider the more recent (and mind-bogglingly ongoing) series of consequential failures around Boeing’s 737 MAX. Technically, the 2018 and 2019 fatal crashes tie back to a series of design changes aimed at increasing the efficiency and range of the 737 MAX so Boeing’s products could better compete with those of Airbus. But there is more to the story. For me, the most telling aspect of how Boeing’s corporate culture sacrificed attention to safety was that it successfully lobbied to weaken FAA oversight of design processes, allowing Boeing to self-regulate.

And yet here we are in 2024, reading about yet more Boeing failures.

“It wasn’t that long ago that Boeing’s reputation was that of a staid industrial giant, known for building the safest, most advanced planes in the sky. It helped introduce the world to commercial jet travel.

“Pilots and others in the industry, as well as members of the flying public, summed up their confidence in the company with the expression, ‘If it’s not Boeing, I’m not going.’ The company still sells coffee cups and tee shirts with that slogan.

“Boeing was once known for safety and engineering. But critics say an emphasis on profits changed that.”

But the point is that this is nothing new.

Petroski returns time and again to that theme, describing it as “… the myopia that can occur in the wake of prolonged and remarkable success…” Intellectually, we know that a safe past does not guarantee a safe future, but Petroski drives the point home by describing how success-fueled hubris took the National Aeronautics and Space Administration (NASA) from the glory years of the Apollo program to and through the failures of Mars missions and two space shuttle disasters. And it is worth remembering that many of the space program failures were not associated so much with design as with program management.

Healthcare Delivery Faces High Consequential Risks

Consequential failures of technology happen. Consequential accidents happen. Consequential management decisions happen. And they happen in every safety-critical line of business. The delivery of healthcare is not immune. 

And while failures that occur during the delivery of healthcare rarely, if ever, result in shutting down a harbor that employs thousands or plane crashes that kill hundreds, in total they can be far more consequential. Keep in mind that it was roughly 25 years ago that the Institute of Medicine estimated that somewhere between 44,000 and 98,000 U.S. citizens died annually because of failures in the delivery of care.

Yes, changes have been instituted that have improved safety, some as simple as verifying a patient’s birthdate before starting any caregiving activity. On the other hand, the symbiotic working relationships among healthcare workers in general, and nursing and CE/HTM in particular, are bound to be increasingly stressed by staffing shortages. Certainly, the workload has not lessened, leading to increasing production pressure on everyone involved. And this can increase the likelihood of scenarios that place patients at increased risk.

HTM Faces Increasing Challenges

The CE/HTM world has always had a lot of priorities to juggle, but from my now admittedly at-a-distance perspective it has never been more challenging. Still, we should remember that the profession originated to address patient safety issues associated with technology. There is a lot of technology safety history from which to learn, some very recent. The Key Bridge and Boeing stories are ongoing, and CE/HTM can learn from them, especially since many technology failures are often attributable to management failures.

The warning from Santayana is no less important now than it has ever been: “Those who cannot remember the past are condemned to repeat it.”

Rick Schrenker is a former systems engineering manager for Massachusetts General Hospital. Questions and comments can be directed to 24×7 Magazine chief editor Keri Forsythe-Stephens at [email protected].

  1. Petroski H. Success through Failure: The Paradox of Design. Princeton University Press; 2006:167.
  2. Petroski H. Design Paradigms: Case Histories of Error and Judgment in Engineering. Cambridge University Press; 1994:57.
  3. LaPlante P, et al. Healthcare Professionals’ Perceptions of Medical Software and What to Do About It. IEEE Computer. April 2006:28–29.
  4. Robison P. Flying Blind: The 737 MAX Tragedy and the Fall of Boeing. Anchor Books; 2022:117.
  5. https://www.cnn.com/2024/01/30/business/boeing-history-of-problems/index.html. Last accessed April 11, 2024.
  6. Schrenker R. Learning from Failure: The Teachings of Petroski. Biomedical Instrumentation & Technology. 2007. https://array.aami.org/doi/full/10.2345/0899-8205%282007%2941%5B395%3ALFFTTO%5D2.0.CO%3B2. Last accessed April 11, 2024.
  7. Phillips J, et al. Nursing and Patient Safety. PSNet. April 2021. Last accessed April 11, 2024.
  8. Holt C. Confronting the BMET Staffing Shortage. 24×7. August 17, 2018. https://24x7mag.com/professional-development/department-management/succession-planning/confronting-bmet-staffing-shortage/. Last accessed April 11, 2024.