The iterative processes that engineers and technicians use to address problems could have been applied by decision-makers throughout the COVID-19 pandemic.
By Rick Schrenker
I’ll never forget where I was when I first questioned whether those making strategic decisions concerning our continued COVID-19 pandemic response had considered getting input from people who actually knew something about assurance. When the Centers for Disease Control and Prevention (CDC) announced that it was okay to ditch masks after being fully vaccinated, I pondered: How did they know?
What bothered me most was that I could never find an answer to that question. What made the CDC so sure? What evidence did they consider? What models did they use? What uncertainties were factored in? What indicators did they select for tracking to assure themselves that their decision held up? How frequently should they assess the indicators, and when and how should they respond to deviations?
Now forget about COVID for a moment, and ask yourself: Do these questions seem familiar to you? If you’re working in the CE/HTM field, they should. They are questions you deal with—implicitly as well as explicitly—every day of your career.
Processing the COVID-19 Pandemic
Prior to retiring, I had the opportunity to do a little work with quality and safety assurance tools such as safety cases and STAMP/STPA (systems-theoretic accident model and processes/systems-theoretic process analysis). It quickly became obvious to me that even where CE/HTM departments did not apply formal tools like these to their operations, they often applied the underlying reasoning without necessarily realizing it. For example, a medical equipment management program and plan could be argued to be a form of assurance case, as could an alternative equipment maintenance program.
A few months ago, I did some searching to see if anything had been published about applying these kinds of tools to the pandemic. What I found served more as frameworks and templates for applying the tools, but my sources provided no examples of specific applications.
Partly to keep myself busy in retirement, I started thinking about how to develop an assurance case for COVID that would be grounded in a claim loosely along these lines:
COVID-19 can be controlled in (a region) if A% of the population is vaccinated and B% of the population is regularly tested and … Z% …
This is a very rough, very high-level template. For one, it assumes that A, B, and Z are independent when, in reality, they are not. However, I still see it as a starting point for capturing all of the known or envisioned components of the solution in a model: every claim and sub-claim, every argument supporting a claim and the evidence and indicators to go with it, every assumption, and every justification. What gives the authors confidence in the analysis? What are the uncertainties?
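To make the idea concrete, here is a minimal sketch of how such a template might be captured as a structured model. Everything here is hypothetical and illustrative: the node types, field names, and example evidence are my own inventions, not the author's model or any standard notation such as GSN.

```python
# Hypothetical sketch of an assurance-case model: every claim carries its
# sub-claims, assumptions, and evidence, and each piece of evidence records
# its own uncertainty, so the gaps in the argument are explicit.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Evidence:
    description: str   # what supports the claim
    uncertainty: str   # known limits of that evidence

@dataclass
class Claim:
    statement: str
    assumptions: List[str] = field(default_factory=list)
    evidence: List[Evidence] = field(default_factory=list)
    subclaims: List["Claim"] = field(default_factory=list)

    def open_questions(self) -> List[str]:
        """Collect every stated uncertainty in the claim tree."""
        found = [e.uncertainty for e in self.evidence if e.uncertainty]
        for sub in self.subclaims:
            found.extend(sub.open_questions())
        return found

top = Claim(
    statement="COVID-19 can be controlled in the region",
    assumptions=["Vaccination and testing rates are independent (known to be false)"],
    subclaims=[
        Claim("A% of the population is vaccinated",
              evidence=[Evidence("Immunization registry counts",
                                 "Registry lags real coverage by weeks")]),
        Claim("B% of the population is regularly tested",
              evidence=[Evidence("Lab reporting volumes",
                                 "Home tests are not reported")]),
    ],
)

print(top.open_questions())
```

Even a toy model like this forces the question the column keeps asking: for each claim, what is the evidence, and what uncertainty rides along with it?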
This would be an unwieldy project at any scope, but my sense is that it would be doable at a high level and, at a minimum, would result in explicitly identifying the uncertainties in the claims. Long story short, there are at least two tools that have been proposed to address confidence and uncertainty in assurance cases.
Thinking Like an Engineer
Setting aside formal approaches and getting back to what people in CE/HTM do implicitly, I assumed that the kind of reasoning I was applying was also being followed by others dealing with the pandemic. However, I had second thoughts after reading the following excerpts from an article from MIT Technology Review:
“[…] the right balance between pure research results and pragmatic solutions proved alarmingly elusive during the pandemic […]
“Often these problems require iterative solutions, where you’re making changes in response to what does or doesn’t work,” [says Jon Zelner, an infectious disease modeler and social epidemiologist at the University of Michigan.] “You continue to update what you’re doing as more data comes in and you see the successes and failures of your approach. To me, that’s very different—and better suited to the complex, non-stationary problems that define public health—than the kind of static one-and-done image a lot of people have of academic science, where you have a big idea, test it, and your result is preserved in amber for all time.”
I strongly recommend reading the whole article, which argues that the iterative processes that engineers and technicians use to address both large and small problems could have been applied by epidemiologists throughout the pandemic. My colleague, who shared the article with me, and I had just assumed that those making the decisions during the pandemic were following the iterative process that people like us take for granted.
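The iterative process Zelner describes, and that CE/HTM staff take for granted, can be sketched as a plain monitor-and-adjust loop. The indicator, threshold, and responses below are hypothetical placeholders, not anything drawn from the article.

```python
# Hypothetical monitor-and-adjust loop: act, track an indicator, and revise
# the intervention whenever the indicator deviates, rather than deciding
# once and preserving the result "in amber."
def iterative_response(indicator_stream, threshold, adjust):
    """Call `adjust` for each period the indicator exceeds the threshold;
    return a log of (period, value, action) tuples."""
    log = []
    for period, value in enumerate(indicator_stream):
        if value > threshold:                 # deviation detected
            action = adjust(period, value)    # revise the intervention
            log.append((period, value, action))
    return log

# Example: weekly test-positivity readings against a 5% threshold.
readings = [0.03, 0.04, 0.07, 0.06, 0.04]
log = iterative_response(readings, 0.05,
                         lambda wk, v: f"tighten measures in week {wk}")
print(log)  # weeks 2 and 3 trigger adjustments
```

The point is not the code but the shape of the process: the decision is never final, only the latest update in response to what the indicators show.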
Do your colleagues in other areas of healthcare recognize these skills as something you and your department bring to the table? Over the years, I reviewed hundreds of human study protocols that incorporated the use of devices. I didn’t simply verify the performance and safety of the devices as if they were used in a vacuum. I always considered where and how they were to be used and by whom. Had researchers considered what they would do if the device failed in use? How did they know the devices were doing what they intended? How could the devices present risk to the subject, regardless of whether they were functioning properly?
How did they know?
Your experience may be different from mine, but virtually none of the researchers had considered these or any other quality, performance, or safety assurance-related issues outside of the direct scope of their research. It is not simply that they didn’t know to do it—it was more that the thoughts had never crossed their minds. Still, I was surprised to learn that epidemiologists have not approached aspects of the pandemic from our perspective. Reflecting on this makes me wonder if CE/HTM departments are cognizant of the applicability and potential value of their skills to their whole organizations—not just within the walls of their shops.
Rick Schrenker is a retired systems engineering manager for Massachusetts General Hospital. Questions and comments can be directed to 24×7 Magazine chief editor Keri Forsythe-Stephens at [email protected].
- Habli I., et al. Enhancing COVID-19 decision making by creating an assurance case for epidemiological models. National Center for Biotechnology Information. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7477796/. Published September 2020. Accessed Oct. 25, 2021.
- Cockcroft A. COVID-19 Hazard Analysis using STPA. Medium. https://adrianco.medium.com/covid-19-hazard-analysis-using-stpa-3a8c6d2e40a9. Published March 2020. Accessed Oct. 25, 2021.
- Chen S., et al. Analyzing National Responses to COVID-19 Pandemic using STPA. National Center for Biotechnology Information. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7859697/. Published February 2021. Accessed Oct. 25, 2021.
- Habli I. Safety Cases—An Impending Crisis? White Rose. https://eprints.whiterose.ac.uk/169183/1/Habli_Safety_Cases_Crisis_2020_v3_Final.pdf. Published February 2021. Accessed Oct. 25, 2021.
- Schrenker R. Sufficient Evidence—Making the Case for Safety. Biomedical Instrumentation & Technology. http://rdcms-aami.s3.amazonaws.com/files/production/public/FileDownloads/BIT/2008_BIT_ND_SufficientEvidenceMakingtheCase.pdf. Published November 2008. Accessed Oct. 25, 2021.
- Goodenough J., et al. Toward a Theory of Assurance Case Confidence. Carnegie Mellon University. https://resources.sei.cmu.edu/asset_files/TechnicalReport/2012_005_001_28161.pdf. Published September 2012. Accessed Oct. 25, 2021.
- Roberts S. Reimagining our pandemic problems from the mindset of an engineer. MIT Technology Review. http://www.technologyreview.com/2021/10/15/1037195/engineering-epidemiology-pandemic-problem-solving. Published October 2021. Accessed Oct. 25, 2021.
- New CDC guidelines say vaccinated Americans can now ditch the masks, with a few exceptions. USA Today. https://www.usatoday.com/story/news/health/2021/05/13/covid-vaccine-cdc-variant-fda-clots-world-health-organization/5066504001/. Accessed Oct. 25, 2021.