Variation: the problems it creates for your laboratory, and how to solve them

One of the biggest challenges facing today’s clinical lab is managing process variation. Variation is an enemy of effective laboratory processes, and it is usually a key culprit when labs fail to meet customer expectations and business objectives. Part of what makes it so problematic is that the concept of variation itself is often misunderstood by lab leaders. A poor grasp of the importance of managing variation can result in failed root-cause analyses, which in turn push laboratory management toward temporary stopgap measures that harden into standard practice, and toward solutions based on gut feelings and anecdotal evidence that, in fact, do not work. In short, when managers make decisions that fail to recognize variation’s role in frustrating process success, they often make problems worse.

When there is significant variation in lab processes, everyone from the laboratorian to the physician is affected. Lab professionals feel frustrated because they don’t know what will confront them every day when they come to work and have little confidence in their ability to keep customer commitments. Physicians have little certainty about what to expect from the lab or when their results will be delivered. But ultimately it is patients who feel the greatest pain, as their treatment is contingent on results from the laboratory. With customer experience ranking higher and higher in the reimbursement equation, it is important for lab leaders to be rigorous about performance, customer satisfaction, and how variation affects both. 

Process improvement initiatives centered on LEAN principles are powerful tools for ridding the laboratory of process variation, and their outcomes can be further enhanced by implementing automation.

Decreasing variation through process improvement

Recently, a process improvement leader worked with a laboratory serving as the core lab for a large network on a project to redesign its processes. During the preceding several years, the laboratory had experienced tremendous growth in outreach volumes and found itself processing about seven million samples annually. The growth, while a significant step forward for the lab, created tremendous variation in its processes because, even though its business had changed, its processes had not. The challenges were felt in many ways, including longer turnaround times, an increase in the FTEs required to process the work, physician complaints, and poor morale within the laboratory workforce. The situation was so adverse that, each morning, hundreds of samples older than 24 hours were regularly waiting to be processed. Once the day’s work arrived, it became difficult for staff to keep track of which new specimens had come in and when.

Onsite evaluation of the situation quickly revealed that process variation was at the core of the problem. The entire process was timed from start to finish for a single specimen and then for several batches of specimens. Enormous queue times were discovered between process steps, and inconsistent batch sizes only worsened the situation. To tackle the problem, the focus was directed to establishing a few critical and basic practices to standardize the work and minimize the process variation. 

Batch sizes were reduced. Each smaller batch was completed start to finish and placed on the automation line before the next one was started. This made the workload delivered to the automation line more consistent, so the line no longer ran underused, moving empty pucks.
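The effect of batch size on waiting time can be illustrated with a simple back-of-the-envelope model (a hypothetical sketch, not data from the laboratory described above): if specimens arrive at a steady rate and are released to the line only when a batch fills, the average specimen waits roughly half the batch-fill time, so smaller batches mean shorter waits.

```python
def avg_batch_wait(batch_size, arrivals_per_min):
    """Average minutes a specimen waits for its batch to fill,
    assuming steady arrivals (a rough illustrative model)."""
    fill_time = batch_size / arrivals_per_min  # minutes to fill one batch
    return fill_time / 2                       # mean wait over the batch

# Hypothetical numbers: specimens arriving at 2 per minute.
print(avg_batch_wait(200, 2))  # large batch: 50.0 min average wait
print(avg_batch_wait(40, 2))   # small batch: 10.0 min average wait
```

Under these assumed numbers, cutting the batch size fivefold cuts the average pre-line wait fivefold, which is the intuition behind the change described above.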

First in, first out (FIFO) lanes were established for the process area to help keep track of when specimens arrived and which specimens were next to process. This created better flow in the processing area and decreased the number of samples that became turnaround time (TAT) outliers simply because nobody knew how long they had been sitting untouched. 
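The FIFO discipline described above can be sketched in a few lines (a hypothetical illustration; specimen IDs and times are invented): samples join the lane as they arrive and are always pulled from the front, so the oldest sample is processed next and its waiting time is always known.

```python
from collections import deque
from datetime import datetime, timedelta

# Hypothetical FIFO lane: append on arrival, pop from the front.
lane = deque()

def receive(specimen_id, arrived_at):
    """Record a specimen at the back of the lane when it arrives."""
    lane.append((specimen_id, arrived_at))

def next_to_process(now):
    """Take the oldest specimen and report how long it has waited."""
    specimen_id, arrived_at = lane.popleft()  # oldest first
    return specimen_id, now - arrived_at

t0 = datetime(2024, 1, 1, 8, 0)
receive("S-001", t0)
receive("S-002", t0 + timedelta(minutes=5))
sid, wait = next_to_process(t0 + timedelta(minutes=12))
print(sid, wait)  # S-001 has waited 12 minutes
```

Because the lane is strictly ordered by arrival, no sample can sit untouched for an unknown length of time, which is exactly how the lanes reduced TAT outliers.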

The responsibilities of the laboratorian in charge of processing STAT or add-on testing samples were restructured to include processing already labeled specimens and getting them on the line sooner. Those samples had routinely sat idle for four to six hours, and this change alone improved the wait time of those samples by 83% to 87%. 

After two process improvement events to implement the above practices, the laboratory has seen a 35% increase in throughput for the processing area and reduced queue times for the majority of the work. All of that was accomplished in a few months, without the need for capital expenditures and despite significant personnel and lab leadership turnover during that period. 

Variation eradication—to the next level

Automation can drive these kinds of improvements even further after pre-analytical processes have been improved, because its purpose is to remove process variation by decreasing naturally occurring process uncertainty and human error. Automating already-efficient processes ensures that the capital investment delivers the expected operational improvements, enabling the laboratory to reach a steady state that meets customer demands and is prepared to grow as new business opportunities require.

Internal studies across numerous process improvement and automation engagements illustrate how automating lab processes can drive performance to the next level.  

Figure 1. “Before and after” automation

Figure 1 provides a “before and after automation” comparison of turnaround time for one laboratory. Before automation, the 90th percentile test completion was nearly 100 minutes; after automation, the 90th percentile was just over 50 minutes, nearly cutting sample TAT in half. Furthermore, in a separate study comparing BMP TAT performance before and after automation in a laboratory, automation decreased workflow variation by bringing both routine and STAT testing to a common level of performance, decreasing 90th percentile TAT for manual tests by more than 60% to meet “best performance” TAT for STATs (Figure 2).

Figure 2. Routine and STAT testing
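The 90th-percentile metric used in these comparisons can be computed directly (a sketch with invented TAT values, not the data behind Figures 1 and 2; the nearest-rank method is one common percentile definition).

```python
import math

def p90(tats):
    """90th-percentile turnaround time, nearest-rank method."""
    ordered = sorted(tats)
    return ordered[math.ceil(0.9 * len(ordered)) - 1]

# Hypothetical TAT values in minutes for ten samples,
# before and after an automation project.
before = [40, 45, 50, 55, 60, 70, 80, 85, 95, 100]
after = [20, 22, 25, 27, 30, 33, 35, 40, 48, 52]
print(p90(before), p90(after))  # 95 48
```

Reporting the 90th percentile rather than the mean is deliberate: it captures the slow tail of the distribution, which is where process variation shows up and where customer commitments are broken.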

This level of improvement is not an isolated observation or an outlier, but a regular outcome across sites that implement automation. Figure 3 shows a compilation of BMP TAT performance from several laboratories that leverage their automation systems to best advantage. Regardless of laboratory volume, institution size, or even the size of the automated solution, improvements in performance due to decreased process variation are consistently similar.

Figure 3. Results across sites

Reduced variability and ED performance

One critical area that can be improved by reducing variability through process improvement and automation is TATs for emergency department (ED) testing. Because the ED is one of the most important “customers” of the laboratory and also one of the main points of entry for patients into the hospital system, laboratory testing TAT becomes a critical component in ED length of stay (LoS) and patient experience. Focusing efforts in the laboratory to implement process improvement initiatives and leveraging the power of automation can result in significant gains that benefit not only the laboratory but also ED process performance.

Clinical laboratories are now fully immersed in a world of ever more stringent quality and efficiency requirements, with patient experience metrics becoming increasingly important for full reimbursement realization. Decreasing variation in laboratory processes results not only in performance predictability, critical for establishing and meeting customer commitments, but can also support hospital-wide initiatives to improve patient experience, care, satisfaction, and outcomes. 

Sergio Sanchez Manchinelly, PhD, serves as Senior Manager, Continuous Improvement Programs, for Beckman Coulter Diagnostics.