The clinical laboratory plays an increasingly important role in the patient-centered approach to the delivery of healthcare services. Physicians rely on accurate laboratory test results for proper disease diagnosis and for guiding therapy; it is estimated that more than 70% of clinical decisions are based on information derived from laboratory test results.1
The process of blood testing, also known as the “Total Testing Process,” begins and ends with the patient. It includes the entire process from ordering the test to interpretation of the test results by the clinician. The Total Testing Process can be subdivided into three stages:
- Pre-analytical: test request, patient and specimen identification, specimen collection, transport, accessioning and processing
- Analytical: specimen testing
- Post-analytical: reporting test results, interpretation, follow up, storage, retesting if needed.
Additionally, the term “pre-pre-analytical phase” has been used for the initial part of the pre-analytical phase, which focuses on test selection and identification of the tests needed, and the term “post-post-analytical phase” has been used for the clinician’s interpretation of results.2
The numbers don’t lie: it’s a significant problem
Clinical laboratory errors directly lead to increased healthcare costs and decreased patient satisfaction. A laboratory error is defined as any defect that occurs during the entire testing process, from ordering tests to reporting results, that in any way influences the quality of laboratory services.3 Any error during the laboratory testing process can affect patient care, including delay in reporting, unnecessary redraws, misdiagnosis, and improper treatment. Sometimes, these errors may even be fatal (e.g., acute hemolytic reaction after incompatible blood transfusion caused by an error in patient identification).3 Diagnostic errors have been observed to be the most prevalent type of malpractice claim in the United States.4
Although errors can arise at any of the three stages, studies show that the pre-analytical phase accounts for 46% to 68.2% of errors observed during the Total Testing Process (Table 1).5 Considerable advances in laboratory instrumentation have significantly reduced the error rate during the analytical phase.6 However, despite improvements in pre-analytical automation, the pre-analytical phase remains the most error-prone part of laboratory testing because of its complexity: many steps occur both before and after the specimen reaches the laboratory.
|Table 1. Sources of pre-analytical errors|
More than one-fourth of all pre-analytical errors are estimated to result in unnecessary investigation or inappropriate patient care, thus placing an additional financial burden on the healthcare system.7 Healthcare economists have developed a model to quantify hospital costs related to laboratory error and inefficiencies due to poor blood specimen quality.3 The model’s estimates are based on 1) institutional financial data such as operating costs and number of beds; 2) laboratory data such as test volume, rate of rejected specimens, and instrument downtime; and 3) clinical practice data based on physician interviews identifying the frequency of receiving erroneous laboratory results and their impact. On average, pre-analytical error costs represent between 0.23% and 1.2% of total hospital operating costs.3 Extrapolated to a typical U.S. hospital with approximately 650 beds, this unnecessary expenditure amounts to $1.2 million per year.3 This represents a sum of increased costs based on various factors, including patient management, redraws, lab investigations, blood collection consumables, and instrument downtime.
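To make the scale of these figures concrete, the published percentage range can be applied to a hospital’s operating budget. The sketch below is only an illustration of that arithmetic, not the economists’ model itself; the $300 million operating budget is a hypothetical figure.

```python
# Illustrative sketch only: applies the published 0.23%-1.2% range for
# pre-analytical error costs to a hypothetical operating budget.
def preanalytical_cost_range(annual_operating_costs: float,
                             low_pct: float = 0.23,
                             high_pct: float = 1.2):
    """Return the (low, high) estimated annual cost of pre-analytical errors."""
    return (annual_operating_costs * low_pct / 100.0,
            annual_operating_costs * high_pct / 100.0)

# Hypothetical 650-bed hospital with $300M in annual operating costs
low, high = preanalytical_cost_range(300_000_000)
print(f"Estimated annual pre-analytical error cost: ${low:,.0f} to ${high:,.0f}")
```

The article’s $1.2 million figure for a 650-bed hospital falls within the range such a calculation produces.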
Pre-analytical errors are not inevitable; the right training and proper quality control measures can prevent them (Table 2). This entails a holistic approach, including close coordination among the members of the specimen management team, from the clinician who orders the test, to the phlebotomist, to the courier who picks up the specimen, as well as the laboratorian who processes the specimen for testing.
|Table 2. Most common pre-analytical errors and best practices to minimize them|
Steps labs can take: minimizing pre-analytical errors
Best practices are just that—the best ways to approach and solve a problem. They are not ways of achieving perfection, but they can go a long way toward the goal of eliminating pre-analytical errors. Here are some recommended strategies.
Phlebotomy education. All employees should be required to take continuing education classes to stay current with recent developments in pre-analytical error reduction. Healthcare professionals should also know the effect of pre-analytical errors on specimen quality (i.e., what might happen if correct steps are not followed). In addition, the competencies of employees must be assessed annually.8
Using appropriate technology. Technologies such as barcodes, radiofrequency identification, and wristbands can help overcome mistakes in patient identification. Automation through pre-analytical robotic workstations, specimen labelers, specimen management systems, and automated phlebotomy tray preparation can significantly reduce the rate of errors that are due to active human factors.8 Automated detection of serum indices such as the hemolysis index is more reliable than visual observation.8 Barak and Jaschek have described an efficient scenario of the total laboratory process utilizing a new information technology-based, pre-analytical approach. Their holistic approach utilizes various factors, including pre-barcoded tubes, a link to the laboratory information system (LIS), and an interface with the health maintenance organization (HMO) hospital information system.9
Choosing appropriate products. Institutions serve different patient populations and have different needs, and must choose types of products accordingly. Many experts think that plastic tubes offer significant advantages over glass tubes, such as minimizing exposure to blood by reducing the chance of shatter, increasing shock resistance, increasing centrifugation speed tolerance, and decreasing shipping weights.10 Plastics also help the facility to comply with the Occupational Safety and Health Administration (OSHA) guidelines to minimize the risk of blood exposure.10 Closed blood collection systems are seen as safer and thus preferable to syringe and needle for collecting blood.11
Many phlebotomists prefer to use wingsets over blood collection needles due to the better patient experience.12 However, when collecting blood for coagulation testing with a wingset, one should be aware of the need to collect blood into a discard tube prior to collecting coagulation specimens, to avoid a decrease in the blood-to-additive ratio caused by the dead space in the tubing. For plasma-based clinical chemistry testing, lithium heparin has been the preferred anticoagulant. EDTA, oxalate, or citrate cannot be used for routine chemistry testing, since they contain commonly measured counterions (sodium, potassium).13 Finally, large institutions should try to standardize their products across various collection sites. This will ensure that the central labs receive similar specimens and deliver comparable results.
Adhering to standard guidelines. Söderberg et al. observed that venous blood specimen collection guidelines are not always followed.14,15 Adhering to CLSI guidelines for the correct order of draw is necessary to minimize the carryover of tube additives, which may affect test results. For example, falsely high potassium and falsely low calcium values may be obtained if potassium-EDTA tubes are collected before serum tubes.10 It is equally important to follow the manufacturer’s recommendations for proper draw volume to ensure proper additive-to-blood ratios.10 For example, coagulation tubes should be filled to within ±10% of the stated fill volume.16
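The ±10% fill tolerance is a simple check that can be expressed directly. The following is a minimal sketch of that rule; the tube volumes used in the example are assumptions, not values from the cited guideline.

```python
# Sketch of the +/-10% fill-volume rule for coagulation tubes.
# The 2.7 mL stated fill volume below is a hypothetical example.
def fill_within_tolerance(actual_ml: float, stated_ml: float,
                          tolerance: float = 0.10) -> bool:
    """True if the drawn volume is within +/-10% of the stated fill volume."""
    return abs(actual_ml - stated_ml) <= tolerance * stated_ml

print(fill_within_tolerance(2.6, 2.7))  # slightly underfilled but acceptable
print(fill_within_tolerance(2.0, 2.7))  # a short draw, outside tolerance
```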
Developing clear, written procedures. Standard operating procedures (SOPs) help to reduce process heterogeneity and streamline the lab workflow. All labs must have written procedures that explain how to identify a patient, collect and label the specimen, and subsequently transport it and prepare it for analysis.8 Clinical labs should establish specific rejection criteria for specimens and should follow them closely. Each rejected specimen should be recorded in a logbook with relevant details. Appropriate personnel should be promptly informed when a specimen is rejected so that corrective actions, such as requesting a fresh specimen, can be taken.
Validating any new instrument or procedure. The laboratory must validate all products and methods to ensure that they are compatible and acceptable for the specified tests. It is the responsibility of the individual laboratory to determine the equivalency of test results before switching to a new product or method. Validation is also necessary when converting from one blood collection tube to another or when switching from serum to plasma or vice versa. CLSI has published a step-by-step guidance document to help with validation of blood collection tubes.17 In addition, validation is a formal requirement to meet accreditation standards.
Monitoring quality indicators in the lab. Labs should keep a record of pre-analytical errors observed. Devising and following corrective strategies can gradually free a lab from such errors. Data on serum indices and lab errors also help to monitor the quality of the blood collection process and assess the efficacy of measures taken. Section GEN.20316 in the Laboratory General Checklist of the College of American Pathologists (CAP) specifically lists a few such pre-analytical quality indicators which should be monitored:2
a. Patient identification (% of patient wristbands with errors, % of ordered tests with patient identification errors, % of results with identification errors)
b. Test order accuracy (% of test orders correctly entered into a laboratory computer)
c. Specimen acceptability (% of general Hematology and/or Chemistry specimens accepted for testing).
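Each of these indicators reduces to a simple rate that can be tracked over time. The sketch below shows one way a lab might compute them; all of the counts are hypothetical and are not drawn from the CAP checklist.

```python
# Minimal sketch of computing CAP-style pre-analytical quality indicators.
# All counts below are hypothetical illustration values.
def indicator_pct(numerator: int, denominator: int) -> float:
    """Percentage for a quality indicator (e.g., specimens accepted / received)."""
    return 100.0 * numerator / denominator

wristband_error_pct = indicator_pct(12, 4800)     # wristbands with errors
order_accuracy_pct = indicator_pct(4975, 5000)    # orders entered correctly
specimen_accept_pct = indicator_pct(9890, 10000)  # specimens accepted for testing

print(f"Wristband errors:     {wristband_error_pct:.2f}%")
print(f"Order entry accuracy: {order_accuracy_pct:.2f}%")
print(f"Specimen acceptance:  {specimen_accept_pct:.2f}%")
```

Trending these percentages month over month is what makes the corrective strategies described above measurable.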
Serum vs. plasma: some considerations
Another consideration for maximizing lab efficiency is the choice between serum and plasma, which often comes down to a tradeoff between speed and specimen quality. Various healthcare trends are affecting this choice. Hospitals are moving to plasma because of the importance of rapid turnaround time; serum is preferred by reference labs, which are more concerned with maximum yield and specimen stability. It is important for labs to maintain separate reference ranges for serum and plasma for analytes such as ammonia, lactate dehydrogenase (LD), and potassium, which yield different values in serum than in plasma.19
While serum is considered a cleaner specimen (i.e., free of cells and other interferences), it needs to clot for 30 to 60 minutes, depending on the tube used. Rapid-clot blood collection tubes with thrombin-based clot activators offer a five-minute clotting time for serum. This ensures a fast and clean specimen—a “rapid serum.” On the other hand, there is no need to wait for clotting with plasma. The downside of using plasma is interference in tests due to clotting factors and white particulate matter, and lower stability, leading to decreased glucose levels and increased enzymatic activity over time. “Cleaner plasma” remains an unmet need for labs.
A strong case can be made that gel blood collection tubes offer a significant improvement over plain tubes. The recommended clotting time for serum gel tubes is significantly lower than for plain serum tubes. Gel is also seen by Bowen et al. as a more stable barrier, which facilitates specimen storage and transport and removes the need for aliquoting.10 For some analytes, serum or plasma must be separated quickly, and this can be achieved with gel. However, laboratories face a few challenges with separator gel tubes, such as the potential for the gel to absorb hydrophobic compounds (including some drugs) and instability under extreme temperature conditions, which can lead to obstruction of instrument probes by gel or oil and, subsequently, to instrument downtime.20
Pre-analytical errors damage an institution’s reputation, diminish confidence in healthcare services, and contribute to a significant increase in the total operating costs, both for the hospital and laboratory.3 Although it is not possible to eliminate all pre-analytical errors, compliance with best practices can significantly reduce their incidence. Proper management of pre-analytical errors requires significant interdepartmental cooperation, since many sources of these errors fall outside the direct control of laboratory personnel. Laboratory professionals must be leaders in ensuring patient safety, both outside and inside the walls of the laboratory.6
1. Datta P. Resolving discordant specimens. ADVANCE for Administrators of the Laboratory. July 2005:60.
2. Hawkins R. Managing the pre- and post-analytical phases of the total testing process. Ann Lab Med. 2012;32(1):5-16.
3. Green SF. The cost of poor blood specimen quality and errors in preanalytical processes. Clin Biochem. 2013;46(13):1175-1179.
4. Plebani M. Exploring the iceberg of errors in laboratory medicine. Clin Chim Acta. 2009;404(1):16-23.
5. Plebani M. Errors in clinical laboratories or errors in laboratory medicine? Clin Chem Lab Med. 2006;44(6):750-759.
6. Plebani M. Laboratory errors: How to improve pre- and post-analytical phases? Biochemia Medica. 2007;17(1):5-9.
7. Plebani M, Carraro P. Mistakes in a stat laboratory: Types and frequency. Clin Chem. 1997;43(8):1348-1351.
8. Hammerling J. A review of medical errors in laboratory diagnostics and where we are today. Lab Med. 2012;43(2):41-44.
9. Barak M, Jaschek R. A new and effective way for preventing preanalytical laboratory errors. Clin Chem Lab Med. 2013;50(4):1-4.
10. Bowen RA, Hortin GL, Csako G, Otañez OH, Remaley AT. Impact of blood collection devices on clinical chemistry assays. Clin Biochem. 2010;43(1-2):4-25.
11. WHO Guidelines on Drawing Blood: Best Practices in Phlebotomy. Geneva: World Health Organization; 2010. 3, Blood-sampling systems. http://www.ncbi.nlm.nih.gov/books/NBK138666/. Accessed March 21, 2014.
12. Stankovic A. Putting patients first during blood collection. Medical Laboratory Observer. August 2013:44-45.
13. McCall R, Tankersley C. Phlebotomy Essentials. 5th ed. Philadelphia, PA: Lippincott, Williams and Wilkins; 2008:245.
14. Söderberg J, Brulin C, Grankvist K, Wallin O. Preanalytical errors in primary healthcare: a questionnaire study of information search procedures, test request management and test tube labelling. Clin Chem Lab Med. 2009;47(2):195-201.
15. Söderberg J, Wallin O, Grankvist K, Brulin C. Is the test result correct? A questionnaire study of blood collection practices in primary health care. J Eval Clin Pract. 2010;16(4):707-711.
16. Favaloro EJ, Adcock DM, Lippi G. Preanalytical variables in coagulation testing associated with diagnostic errors in hemostasis. Lab Med. 2012;43(2):1-10.
17. Validation and verification of tubes for venous and capillary blood specimen collection; approved guideline. CLSI document GP34-A. Wayne, PA: Clinical and Laboratory Standards Institute; 2012.
18. Lippi G, Fostini R, Guidi GC. Quality improvement in laboratory medicine: extra-analytical issues. Clin Chem Lab Med. 2008;28(2):285-294.
19. Arneson W, Brickell J. Clinical Chemistry: A Laboratory Perspective. Philadelphia, PA: FA Davis Company; 2007:418.
20. Lippi G, Becan-McBride K, Behúlová D, Bowen RA, Church S, Delanghe J, et al. Preanalytical quality improvement: in quality we trust. Clin Chem Lab Med. 2013;51(1):229-241.