Answering Your Questions

Sept. 1, 2011

Solution for unexpected elevated potassium results

A

In response to the question “Could elevated potassium results be due to dehydration?” in the July 2011 edition of Tips (p. 34), I have experienced a similar issue and have found a solution. Until recently, I also had a major, ongoing problem with unexpected, sporadic elevations in patient potassium results. I performed a lengthy investigation of our pre-analytical and analytical processes, which included testing the blood collection tubes and needles for contamination, trying different needle gauges, and checking instrument calibration and reagents. My centrifuge was set according to the manufacturer's directions, and biannual tachometer checks were acceptable. I also performed numerous correlation studies with other laboratories, and the results were comparable.

I even performed patient chart reviews to see whether these patients were receiving an ACE inhibitor for hypertension. The investigation pointed toward the pre-analytical phase, and after much research into centrifugation, I discovered that this was my issue. There is a difference in the g-force generated by fixed-angle and horizontal swing-bucket centrifuges, and the rotational radius of the centrifuge is an important factor in determining the appropriate speed (RPM) for a given relative centrifugal force (RCF). You will find this information in “BD Tech Talk,” Volume 6, March 2008.
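
The speed-versus-force relationship the writer refers to can be checked with the standard conversion between rotor radius, RPM, and RCF. The sketch below is illustrative only; the radii and speeds are hypothetical examples, not values taken from the BD document or from any particular centrifuge.

```python
# Illustrative only: standard conversion between spin speed (RPM) and
# relative centrifugal force (RCF, in multiples of g) for a given radius.
# RCF = 1.118e-5 * radius_cm * rpm**2, with the radius in centimeters.

def rcf_from_rpm(rpm: float, radius_cm: float) -> float:
    """Relative centrifugal force (x g) at a given speed and rotor radius."""
    return 1.118e-5 * radius_cm * rpm ** 2

def rpm_for_rcf(target_rcf: float, radius_cm: float) -> float:
    """Speed (RPM) needed to reach a target RCF at a given rotor radius."""
    return (target_rcf / (1.118e-5 * radius_cm)) ** 0.5

# Hypothetical example: the same 3,000 RPM setting yields very different
# g-forces on rotors with different effective radii.
for radius in (10.0, 15.0):  # cm; swing-bucket rotors often have a longer radius
    print(f"radius {radius} cm: 3000 RPM ~ {rcf_from_rpm(3000, radius):.0f} x g")

# Likewise, the speed needed to hit a tube manufacturer's recommended RCF
# (e.g., 1300 x g) differs with the radius.
for radius in (10.0, 15.0):
    print(f"radius {radius} cm: 1300 x g needs ~ {rpm_for_rcf(1300, radius):.0f} RPM")
```

This is why a speed setting validated on one rotor type cannot simply be transferred to another; the RCF, not the RPM, is what the tube manufacturer specifies.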

—Susan M. Beahlen, MT(ASCP)
Director of Ancillary Services
Kaukauna Clinic, Kaukauna, WI

Minimal volume for EDTA tubes

A

For the question “Should we accept half-full tubes?” in Tips from the Clinical Expert in the August 2011 issue of MLO (p. 38), I agree with Dr. Karon that the lab should validate a minimum volume for acceptance of underfilled EDTA tubes. At a pediatric hospital laboratory, the issue of underfilled tubes is more prominent. We did a comparison study to look at the effect of different blood volumes on the CBC, reticulocyte count, and WBC differential. We found that collecting 1 mL of blood in a 4-mL EDTA tube has essentially no effect on the CBC and WBC differential compared to a completely filled tube. The study was published in the International Journal of Laboratory Hematology. We now accept 1 mL as our minimum volume in a 4-mL EDTA tube.
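
For labs that want to perform their own validation rather than adopt another site's cutoff, one simple approach is to draw paired full-fill and underfilled tubes and examine the bias for each analyte. The sketch below is a generic illustration with invented hemoglobin values; it is not the protocol or the data from the published study.

```python
# Illustrative sketch only: paired full-fill vs. 1-mL-fill hemoglobin results
# (g/dL) from hypothetical patients, used to estimate bias before setting a
# minimum acceptable fill volume.
full_fill = [13.1, 9.8, 11.4, 15.2, 12.0, 10.6]
one_ml    = [13.0, 9.9, 11.3, 15.1, 12.0, 10.5]

diffs = [u - f for u, f in zip(one_ml, full_fill)]
pcts  = [100.0 * d / f for d, f in zip(diffs, full_fill)]

mean_bias = sum(diffs) / len(diffs)
max_pct   = max(abs(p) for p in pcts)
print(f"mean bias: {mean_bias:+.2f} g/dL, largest % difference: {max_pct:.1f}%")
# The medical director would compare these figures against predefined
# allowable-error limits before accepting 1 mL as the minimum volume.
```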

—Min Xu, MD, PhD
Medical Director of Core Laboratory
Department of Laboratories
Seattle Children's Hospital
Seattle, WA 98105

Further reading

  1. Xu M, Robbe VA, Jack RM, Rutledge JC. Int J Lab Hematol. 2010 Oct;32(5):491-7.

Conflicting CBC results

Q

Our system includes two hospital labs. We have been tracking a line-draw phenomenon at both facilities in which the same patient is drawn twice within a short period of time (<4 hours) and the complete blood count (CBC) results are significantly different, specifically the red blood cell count (RBC) and the hemoglobin and hematocrit (H&H). In one draw, the white blood cell count (WBC) and platelet count (Plt) will be higher than in the other, while the RBC and H&H are much lower, typically at critical levels. In most cases, the draw with the critical RBC and H&H is the accurate one. We have the same analyzers at both labs, so we have ruled out a methodology issue. What sort of issue with line draws could be causing this false increase in red cells in one sample while the other cellular elements are only mildly affected, and in the opposite direction?

A

My first impression is that the critically low results were erroneous, since line draws are notorious for hemolysis. But you state with certainty that the “actual” results are the critically low ones. I assume your investigation included centrifuging the tube that produced the critically low results to check for hemolysis and that none was found. Assuming your premise is correct, I would suspect hemoconcentration is responsible for the elevated results. If the staff who drew the questionable samples left the tourniquet on longer than one minute, the cellular composition of the sample could have changed significantly.

One study found that tourniquet constriction caused falsely elevated RBC, hemoglobin, and hematocrit levels after two minutes, and decreased white blood cell and platelet counts after three minutes.1 I would also investigate whether the questionable sample was diluted with IV fluids. If an adequate discard volume was not removed prior to sampling, the results would be falsely lowered by dilution. According to the Clinical and Laboratory Standards Institute's venipuncture standard, twice the dead-space volume of the vascular access device should be discarded prior to drawing routine non-coagulation samples.2
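
Both mechanisms can be reasoned about numerically. The sketch below is purely illustrative: the discard rule in the first function restates the CLSI guidance cited above, while the dead-space and contamination volumes and the hemoglobin value are hypothetical numbers chosen for demonstration.

```python
# Illustrative arithmetic only; the volumes and hemoglobin value are hypothetical.

def discard_volume(dead_space_ml: float) -> float:
    """CLSI H3-A6 guidance: discard twice the dead-space volume of the
    vascular access device before drawing routine (non-coagulation) samples."""
    return 2.0 * dead_space_ml

def diluted_result(true_value: float, sample_ml: float, contaminant_ml: float) -> float:
    """Value measured when residual IV fluid mixes into the sample."""
    return true_value * sample_ml / (sample_ml + contaminant_ml)

print(f"Discard for a 0.5 mL dead space: {discard_volume(0.5):.1f} mL")

# A true hemoglobin of 14.0 g/dL contaminated by 0.5 mL of IV fluid in a 3 mL draw:
print(f"Measured Hgb: {diluted_result(14.0, 3.0, 0.5):.1f} g/dL")
```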

—Dennis Ernst, MT(ASCP), Director
Center for Phlebotomy Education, Corydon, IN

References

  1. Lippi G, Salvagno GL, Montagnana M, Franchini M, Guidi GC. Venous stasis and routine hematologic testing. Clin Lab Haematol. 2006;28(5):332-337.
  2. CLSI. Procedures for the Collection of Diagnostic Blood Specimens by Venipuncture; Approved Standard-6th Edition. CLSI document H3-A6. Wayne, PA: Clinical and Laboratory Standards Institute; 2007.

Do diluted samples produce results?

Q

Is it safe to keep diluting a sample with a high creatine kinase (CK) value until you get a final result without any flags from the chemistry analyzer? I was told that enzymes should be diluted only up to a certain point, since they will lose their activity if overdiluted.

A

Enzymes require cofactors and, in some cases, will have very different activity in matrices other than human blood. Thus, there will be a limit beyond which dilution will cause inaccurate estimation of enzyme activity, unless you are able to dilute into human plasma or blood with no enzyme activity. Rather than take this approach, a better option is for each lab to define the clinical reportable range (CRR) for each test offered, based on the analytic measurement range (AMR) and the clinical relevance of high results.

Assume that the instrument can measure neat plasma up to a CK value of 10,000. A one hundredfold dilution would then allow you to measure CK values up to 1,000,000. Your medical director must then make a judgment about whether values greater than 1,000,000 have clinical relevance, or whether results that still read on your instrument as “>10,000” even after one hundredfold dilution into an appropriate diluent will simply be reported as “>1,000,000.” If the medical director believes that reporting up to 1,000,000 is appropriate for good medical care in your institution, then the lab must verify that one hundredfold dilution of CK samples on that instrument, using an appropriate diluent, produces accurate results. If so, the lab has defined the CRR for the test and validated the dilution protocol. Some labs choose to define a single maximum dilution for all tests on a given platform, while others define maximum dilution on a test-by-test basis. Some regulatory agencies require that the AMR and CRR be defined for every test offered, so this should be considered for every test that can be diluted to produce a result.
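
Using the numbers from the example above (an AMR topping out at 10,000 and a validated one hundredfold maximum dilution, giving a CRR of 1,000,000), the reporting logic might look something like the sketch below. The constants and function name are illustrative, not a prescribed implementation.

```python
# Illustrative sketch using the CK example from the text: AMR upper limit of
# 10,000, a validated maximum dilution of 100-fold, and therefore a CRR of 1,000,000.
AMR_UPPER    = 10_000
MAX_DILUTION = 100
CRR_UPPER    = AMR_UPPER * MAX_DILUTION  # 1,000,000

def report_ck(instrument_reading: float, dilution_factor: int = 1) -> str:
    """Return the reportable result for a reading obtained at a given dilution."""
    if instrument_reading <= AMR_UPPER:
        # Within the AMR: multiply back by the dilution factor and report.
        return str(round(instrument_reading * dilution_factor))
    if dilution_factor < MAX_DILUTION:
        return "repeat at higher dilution"
    # Still above the AMR at the maximum validated dilution:
    return f">{CRR_UPPER:,}"

print(report_ck(8_500))                         # within the AMR neat
print(report_ck(12_000))                        # above the AMR neat -> repeat diluted
print(report_ck(7_500, dilution_factor=100))    # reports 750000
print(report_ck(11_000, dilution_factor=100))   # still >10,000 diluted -> >1,000,000
```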

—Brad S. Karon, MD, PhD, Director
Hospital Clinical Laboratories
Mayo Clinic, Rochester, MN

Reportable ranges for HbA1c test

Q

If my lab's HbA1c method is linear only up to 14%, is it necessary to send the sample to a reference lab when the doctor wants to know the actual value above 14%? If we know the value is 16% or 18%, is that more useful than reporting “>14%”?

A

The HbA1c level reflects the mean glucose concentration over the previous six to 12 weeks and provides a much better indication of long-term glycemic control than blood- and urinary-glucose determinations. The National Glycohemoglobin Standardization Program (NGSP) has standardized more than 99% of the assays used in the United States to the Diabetes Control and Complications Trial (DCCT) standard, effectively decreasing the variability between laboratories.1 Correspondingly, for laboratories performing HbA1c testing, the College of American Pathologists (CAP) has progressively lowered the acceptable limits for accuracy-based grading of the GH2 whole-blood proficiency survey; in 2011, the acceptable limit is +/-7%. In addition, a new reference method has been established that will provide for even more reliable worldwide standardization of all A1c assays.2
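
As a small illustration of what accuracy-based grading at a +/-7% limit means in practice, the sketch below flags whether a reported HbA1c falls within 7% of a survey target value; the target and reported values are hypothetical.

```python
# Illustrative only: accuracy-based grading at a +/-7% acceptance limit.
def within_limit(reported_a1c: float, target_a1c: float, limit: float = 0.07) -> bool:
    """True if the reported value is within +/-limit (as a fraction) of the target."""
    return abs(reported_a1c - target_a1c) <= limit * target_a1c

# Hypothetical survey target of 7.0% HbA1c: the acceptable range is 6.51%-7.49%.
for reported in (6.6, 7.4, 7.6):
    print(reported, "acceptable" if within_limit(reported, 7.0) else "unacceptable")
```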

To establish the upper and lower limits of the analytic measurement range (AMR), laboratories must determine the concentration range over which the analyte can be accurately measured. This typically is done by measuring a range of analyte concentrations in replicate, determining the precision and accuracy at each level, and plotting the linear regression with the goal of a slope of 1 across the AMR (linearity). For HbA1c, an upper AMR limit of 14% is common among laboratories. It is well established that the non-diabetic reference range for HbA1c is between 4% and 6%. The ADA recommends that an HbA1c level of ≥6.5% be used for the diagnosis of diabetes. Correspondingly, the recommended HbA1c target for glycemic control at most organizations is either 6.5% or 7%. Therefore, an HbA1c of 14% is at least twice this clinically optimal level.
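
A minimal sketch of that kind of verification is shown below, using made-up replicate data; a real study would follow the lab's own linearity protocol and acceptance criteria.

```python
# Illustrative only: hypothetical HbA1c linearity data (expected level vs. replicate
# measurements), used to estimate imprecision at each level and the regression slope.
from statistics import mean, stdev

levels = {          # expected %HbA1c : replicate measurements
    4.0:  [4.1, 3.9, 4.0],
    8.0:  [8.1, 8.0, 7.9],
    14.0: [13.8, 14.1, 14.0],
}

xs, ys = [], []
for expected, reps in levels.items():
    cv = 100.0 * stdev(reps) / mean(reps)
    print(f"level {expected}%: mean {mean(reps):.2f}, CV {cv:.1f}%")
    xs.extend([expected] * len(reps))
    ys.extend(reps)

# Least-squares slope of measured vs. expected; a slope near 1 supports
# claiming linearity across the tested range (here, up to 14%).
x_bar, y_bar = mean(xs), mean(ys)
slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
         / sum((x - x_bar) ** 2 for x in xs))
print(f"regression slope: {slope:.3f}")
```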

In addition to the absolute level, the physician will use the patient's HbA1c level to determine whether the patient's glycemic control is stable, improving, or worsening. Many physicians have suggested that a 0.5% change is clinically significant.3 Most diabetic patients will have an HbA1c value within the laboratory's upper limit of 14%. In fact, an HbA1c value above 14% should be repeated, preferably by a different testing methodology, to identify falsely elevated results caused by the presence of a hemoglobin variant. At our laboratory, for example, of the 55,397 HbA1c tests run in 2010, only 44 were reported as over 14%. Therefore, the values needed to make any of these three assessments would be easily obtained in most cases.

Of course, for those patients whose HbA1c level is above 14%, a change in their glycemic control may not be evident unless their value falls below this cutoff. Using the conversion of HbA1c to estimated average glucose, however, HbA1c levels of 16% and 18% would correspond to glucose levels of 412 mg/dL and 470 mg/dL, respectively.4 Given that the International Expert Committee Report on the Role of the A1C Assay in the Diagnosis of Diabetes stated in 2009 that “individuals whose A1c values are close to the 6.5% A1c threshold of diabetes (i.e., 6.0%) should receive demonstrably effective interventions,” both of these values would be exceedingly high and warrant serious medical treatment.5 Finally, if a physician wants to monitor a patient's glycemic control and the HbA1c is above 14%, he or she can always monitor the blood glucose level to determine whether recent dietary changes and/or therapy are working. Therefore, establishing a reportable range above 14% for HbA1c is not clinically necessary, and reporting up to 14% is more than adequate for accurately reporting the vast majority of results.
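
The estimated-average-glucose conversion cited above is the linear relationship published by Nathan et al. (reference 4), eAG (mg/dL) = 28.7 x A1c - 46.7. The short sketch below reproduces the 16% and 18% figures mentioned in the answer.

```python
# Conversion from HbA1c (%) to estimated average glucose (mg/dL) per
# Nathan et al. (reference 4): eAG = 28.7 * A1c - 46.7
def estimated_average_glucose(a1c_percent: float) -> float:
    return 28.7 * a1c_percent - 46.7

for a1c in (6.5, 14.0, 16.0, 18.0):
    print(f"HbA1c {a1c}% -> eAG ~{estimated_average_glucose(a1c):.0f} mg/dL")
```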

—Leslie J. Donato PhD, Clinical Chemistry Fellow
Mayo Clinic, Rochester, MN

References

  1. Steffes MW, Sacks DB. Measurement of circulating glucose concentrations: the time is now for consistency among methods and types of samples. Clin Chem. 2005;51(9):1569-1570.
  2. Hanas R, John G. 2010 consensus statement on the worldwide standardization of the hemoglobin A1C measurement. Diabetes Care. 2010;33(8):1903-1904.
  3. Little RR, Rohlfing CL, Sacks DB. Status of hemoglobin A1c measurement and goals for improvement: from chaos to order for improving diabetes care. Clin Chem. 2011;57(2):205-214.
  4. Nathan DM, et al. Translating the A1C assay into estimated average glucose values. Diabetes Care. 2008;31(8):1473-1478.
  5. Ackermann RT, et al. Identifying adults at high risk for diabetes and cardiovascular disease using hemoglobin A1c: National Health and Nutrition Examination Survey 2005-2006. Am J Prev Med. 2011;40(1):11-17.

MLO's “Tips from the clinical experts” column provides practical, up-to-date solutions to readers' technical and clinical issues from experts in various fields.

Readers may send questions to [email protected].