[Photo, L-R: Larry Taylor, Michael Ehlert, Bob LaMountain, and Ted Cohen, manager, clinical engineering.]

“It’s the way we’ve always done it.” At some point, everyone has heard this well-worn argument for maintaining an existing process in the face of change. Changing the mind-set of the speaker is often a challenge and is rarely accomplished through an emotional argument. Instead, one of the best ways to tackle an inefficient or outdated process is with objective proof. Instituting benchmarking practices can provide the necessary, impartial evidence of what works and what does not.

“When looking at benchmarking, I would start with the clinical engineering basics—have a good inventory of the equipment you are maintaining, have a trained staff, know how much you’re spending—and then put systems in place so the day-to-day work of your employees yields some quality, accurate data for the measurements that are important to you,” says Ted Cohen, MS, CCE, manager of clinical engineering at the University of California Davis Medical Center.

During his 3 decades of service to UC Davis, Cohen has established measurement tools for a number of functions performed by the clinical engineering team. “The first step is tracking your own performance, making sure you are improving month-to-month or year-to-year,” he says.

Cohen uses a financial metric he helped establish with peers through a study that quantifies medical equipment repair and maintenance costs as a ratio. “The ratio is created by taking the sum of all medical equipment repair and maintenance costs and dividing that figure by the equipment’s original acquisition value,” Cohen says. “What you get is a percentage for 1 year that should be in the range of 3 to 7 percent.”
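As a back-of-the-envelope illustration of the calculation Cohen describes, the ratio can be computed in a few lines of Python; the inventory and dollar figures below are hypothetical, not UC Davis data.

```python
# Cost-of-service ratio: total annual repair/maintenance cost divided by
# total original acquisition value, expressed as a percentage.
# All devices and dollar amounts below are made up for illustration.

inventory = [
    # (device, acquisition_value_usd, annual_service_cost_usd)
    ("Patient monitor", 25_000, 1_100),
    ("Ventilator", 40_000, 2_600),
    ("CT scanner", 1_500_000, 90_000),
]

total_cost = sum(cost for _, _, cost in inventory)
total_value = sum(value for _, value, _ in inventory)
ratio = total_cost / total_value * 100

print(f"Cost-of-service ratio: {ratio:.1f}%")  # benchmark range: 3 to 7 percent
```

With these invented figures, the ratio works out to about 6%, near the top of Cohen’s 3-to-7-percent range.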

Crunching other numbers, the clinical engineering department at UC Davis recharges all of its expenses: every cost for services performed, as well as salaries for supervisors, clerical staff, and all technicians, is recouped by billing customers for the work done.
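The arithmetic behind such a full-recharge model is simple: the hourly rate billed to customers must cover everything the department spends. The sketch below shows the general idea; the expense categories and figures are assumptions for illustration, not the department’s actual budget.

```python
# Sketch of a full cost-recovery (recharge) rate calculation.
# Expense categories and amounts are invented for illustration.

annual_expenses = {
    "technician_salaries": 900_000,
    "supervisor_and_clerical_salaries": 250_000,
    "parts_and_overhead": 300_000,
}
billable_hours_per_year = 14_000  # hours expected to be recharged to customers

hourly_rate = sum(annual_expenses.values()) / billable_hours_per_year
print(f"Required recharge rate: ${hourly_rate:.2f}/hour")
```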

The department’s rigorous cost-management structure includes the internal tracking of employee productivity, which is tallied and reported to each biomed every 2 weeks.

“We are expected to achieve 80% ‘productive time’ per day, which is tracked throughout the year,” says Larry Taylor, imaging equipment specialist at UC Davis. “The reports let us know if our PMs are up and done on time, along with what our productive time was, as a percentage of our goal.”
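Computing that productive-time figure is straightforward arithmetic. A minimal sketch, assuming the hours come from a CMMS export for one 2-week pay period (the numbers are invented):

```python
# Productive-time percentage for one pay period, measured against the
# department's 80% goal. Hours below are invented for illustration.

hours_worked = 80.0           # two-week pay period
hours_on_work_orders = 67.5   # time documented against work orders

productive_pct = hours_on_work_orders / hours_worked * 100
goal_pct = 80.0

print(f"Productive time: {productive_pct:.1f}% (goal: {goal_pct:.0f}%)")
print("Met goal" if productive_pct >= goal_pct else "Below goal")
```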

In many cases, these figures represent the timeliness of completing paperwork as much as the activity level of the tech. “It’s a reflection of your paperwork, and tracking all of the work you do—not just what is called in through the dispatcher,” says Robert LaMountain, a senior BMET at the hospital. For example, this includes calls that are placed directly to a biomed, rather than being routed through official channels, or work requests made by staff when they are on location completing a different job. “While walking through a department, people will often stop to ask you a question or point to a piece of equipment in need of repair,” LaMountain says. “Unless we can repair it on the spot, we must either ask the user to call the device in to our front desk, or we can gather the specifics and create a work order ourselves when we return to the shop. The same protocol applies if we are paged or called by an equipment user. Proper documentation will make the repair process smoother while adding to our productivity numbers.”

The reports on daily biomed productivity are helpful not just to management. “All technicians are responsible to produce a recharge time per day, which is extremely important,” says Michael Ehlert, CBET, respiratory equipment specialist at UC Davis. “It not only is a baseline to compare how well you are doing as compared to other shops, but it exemplifies our experience, showing how valuable we are as in-house techs. It also allows us to set realistic, individual performance objectives and track continual improvement within the shop.”

Tracking by Priority

Data also is collected on the downtime of roughly 60 systems identified as critical, such as central station patient monitors for telemetry or the MRI and CT scanners in the trauma center. For systems tagged as critical, the technical staff is required to monitor and document the equipment’s availability.

The established goal is an annualized downtime of 2% or less, or 98% uptime on the designated units. The resulting information is then included in various reports and is distributed throughout the organization, including the safety committee.
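Turning logged outages into that annualized figure is simple date arithmetic. A minimal sketch, assuming each downtime event for a system is recorded in hours:

```python
# Annualized uptime for one critical system from logged downtime events.
# The outage durations are invented for illustration.

HOURS_PER_YEAR = 365 * 24

downtime_events_hours = [6.0, 2.5, 12.0, 4.0]  # outages logged this year
total_downtime = sum(downtime_events_hours)

uptime_pct = (HOURS_PER_YEAR - total_downtime) / HOURS_PER_YEAR * 100
print(f"Uptime: {uptime_pct:.2f}% (goal: 98% or better)")
```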

Other measurements regularly reported to hospital administration are turnaround times and response times. At UC Davis, these figures are tracked in detail according to the type of call involved. The dispatcher who receives repair calls will not only consult with the caller to determine the exact nature of the issue, but will also interpret the priority of the repair before notifying the appropriate technician.

“This system works because our administrative staff is familiar with a majority of the equipment we work on and they have developed a close working relationship with the clinical staff at UCDMC,” LaMountain says. If the biomed who is assigned the work order believes the priority is inaccurate for any reason, he or she will consult directly with the requester to clarify their specific needs. Generally speaking, however, the process is straightforward, works well, and requires few corrections. “It allows me to organize and perform my daily tasks in the most efficient order, and I am confident that if something urgent arises, I will be notified immediately,” LaMountain says.

Prioritizing calls in this way also allows response and repair times to be measured based on the level of urgency assigned to each call. “It helps prioritize the way we actually run our business,” Cohen says. “If you have stat requests, which we define as anything demanding a 1-hour response, we can measure whether we responded within that hour or not—and that’s the important factor. If somebody says they need something done within the next few days, it doesn’t really matter or help us if we respond in an hour. In fact, that biomed or tech’s time would be better spent working on a project with a higher priority.”
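In code, that amounts to judging each call against the target attached to its priority rather than against a blanket average. In the sketch below, only the 1-hour stat window comes from the article; the other priorities and windows are invented for illustration.

```python
# Check a response against its priority's target window.
from datetime import datetime, timedelta

RESPONSE_TARGETS = {
    "stat": timedelta(hours=1),    # stated in the article
    "urgent": timedelta(hours=4),  # assumed for illustration
    "routine": timedelta(days=2),  # assumed for illustration
}

def met_target(priority: str, received: datetime, responded: datetime) -> bool:
    """True if the response time fell within the priority's window."""
    return responded - received <= RESPONSE_TARGETS[priority]

call_in = datetime(2024, 5, 6, 9, 15)
response = datetime(2024, 5, 6, 9, 50)
print(met_target("stat", call_in, response))  # True: within the hour
```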

Challenges Along the Way

While it is often easier to collect data on areas involving money and time, the clinical engineering team at UC Davis performs other functions that offer value to the hospital but are difficult to quantify. One example is the quality of customer service delivered by biomeds.

“We have looked at ways to verify that we are providing a valuable service, quality measures, if you will,” Cohen says. Past efforts have included the distribution of customer surveys. “We try to make those as quantitative as possible, and while it always seems to be somewhat subjective, it is certainly important to try and measure it.”

In addition to empirical tracking of customer satisfaction, LaMountain listens to users during service calls to help gauge his success in that area. “I evaluate my effectiveness as a biomed on a continual basis with customer feedback. The clinical staff wants the equipment repaired quickly, and they want to be updated if it’s a lengthy repair, so communication is important,” he says. “A positive relationship between the biomed and customer is key to the success of the biomed department.”

Tough as it may be to quantify, such anecdotal evidence is important to biomeds—and the bottom line. One such example is the increase in ventilators that occurred after the hospital was designated as an official pandemic response center. To ensure the hospital would be able to handle the inevitable uptick in patients that would follow an emergency situation, the decision was made to keep aging ventilator systems rather than trade them in for newer models. Having the older ventilators has meant more repairs and an increase in the need for replacement parts; however, it has been a net plus for the hospital.

“Our ability to maintain the vents has helped the respiratory manager reduce her rental costs to almost zero,” Ehlert says. Before the decision was made to keep older ventilators, the hospital would often find itself in short supply, requiring rental equipment to be brought in to handle patients. “Her rental budget was as much as $100,000, but because we’ve been able to keep these older ventilators in service, we have all but eliminated the need to rent those units,” he says.

Keeping the older systems running without incurring sizable increases in equipment costs has been accomplished by applying benchmarking techniques to the process.

“Overseeing vendors and managing spare parts are both benchmarking techniques I use, along with experience, to help me determine what I keep on the shelf so we don’t have to buy the parts overnight,” Ehlert says. “As a result, we have been able to save quite a bit of money by quickly repairing older ventilators or negotiating the acquisition of trade-in ventilators with an affiliated hospital—and all of that combines to reduce downtime.”
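One way to frame that stocking decision is to compare the expected cost of rush orders against the cost of keeping a part on the shelf. The rule and numbers below are a hypothetical illustration, not Ehlert’s actual method.

```python
# Stock-or-not decision for one spare part: stock it when the expected
# annual rush-shipping premium exceeds the cost of shelving one unit.
# All numbers are invented for illustration.

expected_failures_per_year = 4
overnight_premium_per_order = 120.0  # extra cost of each rush order ($)
annual_carrying_cost = 90.0          # cost of keeping one unit on the shelf ($)

stock_it = expected_failures_per_year * overnight_premium_per_order > annual_carrying_cost
print("Keep on shelf" if stock_it else "Order as needed")
```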

“The ultimate benefit is often interpreted to be the cost savings, but to the individuals in the department who work with us, we are a time-saver,” Taylor says. “If someone calls with a serious problem, we’re usually there and working to resolve their problem within 4 minutes.”

Making It Happen

Even when benchmarking guidelines are in place, they are only as good as the data they collect. In the stressful reality of a biomed’s typical day, it is imperative that the processes established for capturing the information are within the day-to-day activity of the staff, from dispatchers to the techs.

“It must be incorporated into their standard workflow,” Cohen says. Noting the downtime of critical systems, for example, is a mandatory field on the report generated by UC Davis’ computerized maintenance management system. “They must provide this information for every job they do. The system will not allow them to skip that step.”

But using the computerized form means techs also do not have to go out of their way to record the information.

“I think you’ll find that if you talk with our biomeds about benchmarking, it won’t resonate with them, but if you talk with them about cost management or documenting downtime on critical systems or response priority, tracking those tasks is part of their ongoing responsibilities,” Cohen says. “So while benchmarking itself is not part of their day-to-day, the data-collection piece of the process is—and it’s critical.”

Using downtime as an example, the computer knows when the initial call came in and the date and time the technician responded to or closed the job, according to Cohen, who literally wrote the book on the topic: he developed, co-wrote, and edited Computerized Maintenance Management Systems for Clinical Engineers, published by AAMI in 1994 and updated in 2003.

“They may or may not have any control over whether they get the part, or something else that might slow their completion of the project, but they know that they have to document the downtime,” he says. “Having this type of automated data collection is beneficial to the overall operation of the department, because it provides biomeds with some tools that make it easy for them to document those things that we, as managers and clinical end users, need to know. For it to work, we need to make sure it’s easy for the technical staff to document those things important to our customers.”
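Because those timestamps are captured automatically, the downtime number falls out of simple date arithmetic once the work order is closed. A minimal sketch, with hypothetical field names (real CMMS schemas vary by vendor):

```python
# Deriving documented downtime from CMMS timestamps. Field names are
# hypothetical; the dates are invented for illustration.
from datetime import datetime

work_order = {
    "call_received": datetime(2024, 5, 6, 8, 0),
    "job_closed": datetime(2024, 5, 7, 14, 30),
    "critical_system": True,  # forces the downtime field to be completed
}

downtime = work_order["job_closed"] - work_order["call_received"]
print(f"Documented downtime: {downtime.total_seconds() / 3600:.1f} hours")
```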

In addition to helping management track performance, using metrics within the clinical engineering department also may provide the justification necessary to get funds approved for upgrades to aging systems or for sending techs to additional training so more work can be done in-house.

“It helps our department build our case, because we can tell them how much downtime a particular system has had, how much we are spending in parts, and the number of labor hours we’ve committed to it,” Taylor says. “That is the information they consider when determining if we need a new device or need to start escalating the service.”

Cohen believes it also impacts the overall work of the department. “Looking at opportunities for cost reduction, for example, if you find out the way you’re operating is in the highest percentile of all your peers, then you should be doing something different,” he says. “But if you don’t have the data, how can you make the comparisons to look for other processes? What can you use to help improve your practice?”

Benchmarking Matters

Ted Cohen, MS, CCE, manager of clinical engineering at the University of California Davis Medical Center, has been tackling interhospital comparisons since the mid-1990s, when he and a group of colleagues conducted a study and authored articles for publication in journals of the Association for the Advancement of Medical Instrumentation. Cohen recently received the American College of Clinical Engineering Lifetime Achievement Award in recognition of his long-term contributions to the field of clinical engineering.

“We were looking at what we were calling ‘validating metrics,’ trying to determine what kinds of measurements could be validated and useful for clinical engineering,” he says.

Cohen believes that beyond internal benefits, measuring performance and productivity provides a ruler for comparison against businesses of similar size and purpose. “Where benchmarking really provides a benefit is in helping identify best practices and allowing one institution to learn from other institutions through organizations such as the University HealthSystem Consortium (UHC),” Cohen says. UHC, an alliance of 101 academic medical centers and 178 affiliated hospitals, works to advance knowledge, foster collaboration, and promote change to help its members succeed.

For benchmarking to foster successful peer-to-peer comparisons, it is important that the data be consistent across all participating institutions.

“It is just incredibly difficult to accurately compare ‘apples to apples’ across different institutions without standards regulating the type of data being collected,” Cohen says.

While he and his co-writers did make some progress, particularly with financials, they realized there were some across-the-board issues that were hampering their advancement.

“One of the biggest problems is that there are no standards in this area,” Cohen says. “Computerized maintenance management system vendors have not standardized their reporting or their definitions. Instead, they let the individual hospitals define and report these kinds of measurements.” As a result, one hospital may provide detailed information about outside vendor repair and maintenance, for example, while another will not even mention the topic.

“Unfortunately, whenever you leave out these large pieces of data, it is often not possible to compare apples and apples, and it makes it very difficult for institutions to look at and learn best practices from each other,” he says.

Groups like UHC have helped forge some guidelines. However, there are still no validated, mandatory benchmarks, according to Cohen, because many organizations either do not want to disclose their information or do not have the data available to share.

“Because there are only optional benchmarks, they can’t get enough people to report on them, and as a result, UHC doesn’t want to make them mandatory,” Cohen says, describing what could become a “catch-22” situation.

All too often, benchmarking is driven primarily by financial considerations, according to Cohen, which means the areas that are more challenging to summarize get left out.

“It’s important that benchmarking not be done in a vacuum and that quality indicators—such as downtime and response time, customer service surveys, and other indicators—be included along with any financial measurements,” he says. “Financial measurements by themselves are not enough. We all could reduce costs and reduce the time we spend providing a service, but that is not necessarily good for either the hospital or the patients.”

—DH


Dana Hinesly is a contributing writer for 24×7. For more information, contact .