Calibration verification can be performed remotely
Reimbursement incentives and performance standards for quality of care are increasingly being used to lower healthcare costs. That reality, coupled with the American Society for Clinical Pathology’s newest Vacancy Survey data, which concludes that the laboratory workforce shortage is worsening, means clinical laboratories must conduct their required analyzer calibrations as labor-efficiently and comprehensively as possible.
Analyzer calibration verification is a time-consuming, painstaking process that can disrupt the workflow of any laboratory and interfere with result reporting. Analyzer manufacturers began providing quality assurance support in the form of interlaboratory quality assurance programs years ago. Now, the industry is looking at providing advanced calibration support to certain high-end hematology analyzers.
Calibration verification can now be performed remotely using data the manufacturer has collected on a specific analyzer’s performance, quality control status, and calibrator recovery, along with data from the laboratory’s participation in an interlaboratory quality assurance program. This is a holistic approach to calibration, providing significantly more data on which to base the calibration process than calibration values alone. It’s an approach that has reduced non-productive analyzer time: calibration verification can take less than an hour while still meeting regulatory requirements for documentation. This approach also provides streamlined and comprehensive reports with statistical and graphical presentation of analyzer performance anytime, from anywhere.
Jackie Guenther, MT(ASCP), MEd
Senior Product Manager, Service Solutions
Sysmex America, Inc.
The right QC saves time and money
With approximately 70 percent of clinical decisions based on laboratory test results, it is essential to ensure that the results you provide do not lead to misdiagnosis, inappropriate patient treatment, or death. Without effective and reliable quality control, the risk of any of these events occurring is significantly increased, as errors go unnoticed and incorrect patient results are released. This can cost your laboratory precious time and money.
To minimize lot-to-lot variation with immunoassay-based methods, quality controls should be manufactured from 100 percent human material, ensuring they are commutable and mirror the performance of patient samples. Many laboratories experience shifts in QC values each time they change reagent batches; such shifts are not reflected in patient samples and are due to the presence of non-human components in the control material, which undermines the reliability of the results produced.
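To make the idea concrete, here is a minimal sketch, using hypothetical QC values and an illustrative 0.5-SD threshold (both assumptions, not manufacturer or regulatory values), of how a laboratory might quantify such a shift across a reagent lot change. A sizable shift in the control that is not mirrored in patient samples points to a commutability problem with the control material.

```python
# Minimal sketch: flag a QC shift across a reagent lot change.
# All values and the 0.5-SD threshold are illustrative assumptions.
from statistics import mean, stdev

qc_old_lot = [4.02, 3.98, 4.05, 4.01, 3.99, 4.03]   # QC results on the outgoing reagent lot
qc_new_lot = [4.10, 4.08, 4.12, 4.09, 4.11, 4.07]   # QC results on the incoming reagent lot

baseline_mean = mean(qc_old_lot)
baseline_sd = stdev(qc_old_lot)
shift_in_sd = (mean(qc_new_lot) - baseline_mean) / baseline_sd

# A shift larger than the chosen threshold suggests the control material is
# responding to the reagent change in a way patient samples may not.
THRESHOLD_SD = 0.5
if abs(shift_in_sd) > THRESHOLD_SD:
    print(f"QC shift of {shift_in_sd:+.1f} SD across the lot change: investigate commutability")
else:
    print(f"QC shift of {shift_in_sd:+.1f} SD: within the chosen threshold")
```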
It is essential that the quality controls employed in your medical laboratory have been manufactured to clinically relevant levels. By ensuring the controls in use cover clinical decision levels, laboratories can be confident of the reliability and accuracy of the patient results they release, while ensuring that no extra controls need to be purchased to cover the full range of patient testing.
Jim Hardwick
Director of Sales
Randox Laboratories
Manufacturer of Acusera true third party quality controls
QC/QA in the clinical lab is changing
QC/QA requirements in the clinical lab are changing. The Centers for Medicare and Medicaid Services (CMS) has embraced a voluntary QC option for meeting CLIA quality control standards called the Individualized Quality Control Plan (IQCP). It must be implemented by January 2016 by all labs that have been using Equivalent Quality Control (EQC), because EQC will no longer meet CLIA standards. One common question that has resulted is this: Does the IQCP affect laboratories’ calibration verification requirements?
The answer: No, the IQCP does not change the requirements. The laboratory is still responsible for calibration verification for all nonwaived “moderate to high complexity” test systems.
The laboratory must perform calibration verification at least twice per year for documentation purposes, as well as whenever any of the following occurs: 1) after any major preventive maintenance; 2) when critical parts affecting an instrument’s performance are replaced; 3) after the laboratory switches lot numbers on the reagents it uses in conjunction with an instrument; and 4) after the laboratory identifies an unusual trend or shift in its control material, or results that fall outside of established acceptable limits.
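As a rough illustration only, the sketch below encodes these triggers as a simple checklist; the function name, field names, and the 184-day interval (roughly twice per year) are assumptions chosen for the example, not CLIA-defined values.

```python
# Minimal sketch of the calibration verification triggers listed above.
# Field names and the 184-day interval are illustrative assumptions.
from datetime import date

def cal_verification_due(last_verified: date,
                         today: date,
                         major_maintenance: bool = False,
                         critical_parts_replaced: bool = False,
                         reagent_lot_changed: bool = False,
                         qc_shift_or_out_of_limits: bool = False,
                         max_interval_days: int = 184) -> list[str]:
    """Return the reasons (if any) that calibration verification is due."""
    reasons = []
    if (today - last_verified).days >= max_interval_days:
        reasons.append("scheduled interval reached (at least twice per year)")
    if major_maintenance:
        reasons.append("major preventive maintenance performed")
    if critical_parts_replaced:
        reasons.append("critical parts affecting performance replaced")
    if reagent_lot_changed:
        reasons.append("reagent lot number changed")
    if qc_shift_or_out_of_limits:
        reasons.append("QC trend/shift or results outside acceptable limits")
    return reasons

# Example: a reagent lot change alone is enough to trigger verification.
print(cal_verification_due(date(2015, 1, 15), date(2015, 4, 1), reagent_lot_changed=True))
```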
Calibration verification material can be obtained from many sources; the laboratory director should explore which options are best for the lab. Things to consider include the material format (liquid, stable, or lyophilized); how many different analytes are included per kit; whether a web-based data regression program is provided; and the cost, based on how often the kits will be used.
Basic QC/QA: back to the future
The 1986 Presidential Commission’s Report on the Space Shuttle Challenger tragedy summarized its true cause in one sentence: “Faults include a lack of problem reporting requirements and inadequate trend analysis.” That quote is also a good analysis of the basic causes of quality control problems in the clinical lab.
In current laboratory terminology, the root cause of that catastrophic event was inadequate tracking and trending of problems, also known as deviations or nonconforming events. CAP, The Joint Commission, FDA, and AABB all recommend, require, and/or regulate deviation tracking and trending. After all, the best way to reduce and/or eliminate adverse events is to learn from our mistakes.
Most deviations in the clinical laboratory and blood bank or transfusion service are caught and corrected early enough to prevent serious consequences. Innocuous though they may be, these deviations still increase processing costs and reduce efficiency, slow service response, and erode customer satisfaction. If you use tracking-and-trending technology to investigate the who, what, when, where, and why of these deviations, then leverage this information to identify and correct the cause at the root, you can improve efficiency, cost-effectiveness, service, quality of care, patient satisfaction, and regulatory compliance.
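As a simple illustration of tracking and trending, the sketch below tallies hypothetical deviation records by root cause and by month so that a recurring problem stands out; the record format is an assumption made for the example, not the schema of any particular system.

```python
# Minimal sketch of deviation tracking and trending: count nonconforming
# events by root cause and month so recurring problems stand out.
# The records below are hypothetical.
from collections import Counter

deviations = [
    {"month": "2015-06", "root_cause": "labeling error"},
    {"month": "2015-06", "root_cause": "QC not reviewed before reporting"},
    {"month": "2015-07", "root_cause": "labeling error"},
    {"month": "2015-07", "root_cause": "labeling error"},
    {"month": "2015-08", "root_cause": "labeling error"},
]

by_cause = Counter(d["root_cause"] for d in deviations)
by_month_and_cause = Counter((d["month"], d["root_cause"]) for d in deviations)

print("Deviations by root cause:", by_cause.most_common())
print("Monthly trend:", sorted(by_month_and_cause.items()))
```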
Looking back to the future then, let’s envision and work toward a future where we put lessons learned from the Challenger disaster to good use.
George Behr, PhD
Founder & CEO
Nouvation, Inc.
Manufacturer of OTIS-Blood Bank and OTIS-Laboratory
The future of QC/QA
In meetings with laboratory professionals from all over the globe, three topics are commonly raised: (1) how to design QC to manage patient risk, (2) how to design QC for multiple identical instruments within a laboratory or healthcare system, and (3) solutions for monitoring and reporting multiple key performance indicators.
Risk management guidelines and the advent of IQCP have many labs showing interest in how to design QC to minimize the risk of patient harm and how to distinguish tests that need more control from those that require less oversight. Why expend the same resources on high-performing, low-risk tests as on those that are problematic?
Many labs require multiple identical instruments to support their testing volume, and they are concerned about whether they are adequately controlling the variability of patient results between instruments. Consequently, they are seeking new approaches to calibration and quality control that can effectively assess and manage the group of instruments rather than managing each instrument independently.
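One simple way to picture such a group-level check is sketched below: using hypothetical QC values and an illustrative 3 percent allowable bias (both assumptions made for the example), it compares each instrument’s QC mean for a single analyte against the group mean and flags any instrument drifting from its peers.

```python
# Minimal sketch of a group-level comparability check: compare each
# instrument's QC mean for one analyte against the group mean and flag
# any instrument whose bias exceeds a chosen allowable limit.
# QC values and the 3 percent limit are illustrative assumptions.
from statistics import mean

qc_results = {
    "analyzer_1": [7.1, 7.0, 7.2, 7.1, 7.0],
    "analyzer_2": [7.0, 7.1, 7.0, 7.2, 7.1],
    "analyzer_3": [7.4, 7.5, 7.4, 7.45, 7.5],   # running high relative to its peers
}

ALLOWABLE_BIAS_PCT = 3.0

instrument_means = {name: mean(vals) for name, vals in qc_results.items()}
group_mean = mean(instrument_means.values())

for name, m in instrument_means.items():
    bias_pct = 100.0 * (m - group_mean) / group_mean
    status = "REVIEW" if abs(bias_pct) > ALLOWABLE_BIAS_PCT else "ok"
    print(f"{name}: mean={m:.2f}, bias={bias_pct:+.2f}% ({status})")
```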
In the world of QA, dashboards are rapidly growing in popularity as an excellent way to visualize quality metrics and rally teams and leadership around activities that have a measurable effect on patient satisfaction, laboratory quality, and productivity. As new informatics solutions are developed, this trend will continue.