Quality control: more challenges for molecular diagnostics

May 1, 2012

Laboratorians sleep most comfortably when they have sufficient protocols in place to assure the accuracy of their laboratory test results. Such confidence is found in a solid quality assurance program that includes quality control (QC) samples, a validated laboratory test, QC monitoring, and error prevention protocols. Molecular laboratorians have been somewhat slow to adopt traditional QC practice, but these elements can now be found for some molecular tests, particularly infectious disease tests, where comprehensive quality control materials are commercially available,1 the tests are FDA cleared and quantitative with a limited number of analytes, and directly applicable QC protocols have been published.2,3

This is evidence that historically proven QC practices could become the norm for at least some molecular diagnostic tests.4 However, many categories of molecular diagnostics tests continue to lack evaluation and monitoring by familiar and proven quality control procedures typically used in the clinical chemistry lab. Some progress has been made, but challenges include a persistent lack of QC materials, difficulty of monitoring multiplexed tests, little or no software to track and analyze QC, and a general lack of published QC protocols for molecular tests.

Characterized materials useful for QC monitoring are available for several common molecular singleplex tests that detect genetic disease.1 There are even some multiplex controls commercially available for multiplex tests such as cystic fibrosis panels.5,6 However, of the more than 2,500 disease tests listed on GeneTests,7 quality controls are available for only a small percentage. Recent reports from regulatory agencies8,9 discuss the problem and make recommendations for solutions. The Secretary’s Advisory Committee on Genetics, Health, and Society (SACGHS) suggested that Centers for Medicare & Medicaid Services (CMS) “hire sufficient staff to fulfill CLIA’s statutory responsibilities (i.e., ensure the quality of laboratory tests), and the program should be exempt from any hiring constraints imposed by or on CMS.” SACGHS also proposed that “HHS should ensure funding for the development and characterization of reference materials, methods, and samples—for assay, analyte, and platform validation, for quality control, for performance assessment, and for standardization.” Notably, where quality controls are available, or when a laboratory has the resources to be creative, traditional QC practices can be applied in spite of test complexity.10-12

However, molecular life is about to become further complicated as even more complex tests—Next Generation Sequencing (NGS), RNA and DNA microarrays, and mass spectrometry, to name a few—quickly head for the clinical laboratory. These complex platforms test a variety of human specimen types for sometimes thousands of analytes—potentially the entire human coding genome of approximately 22,000 genes. To apply the federal QC guidelines of CLIA '88, which state that a positive and negative control must be run for every analyte detected, well over 1,000 controls could be required—an impractical approach. Labs don’t have sufficient freezer space to archive patient samples for use as controls, software isn’t available to track QC, and the cost of running so many controls, even on a rotating basis, is too high. However, expectations of accuracy are high, and these tests are used to help the physician make critical treatment decisions. Thoughtful redesign of traditional QC systems and protocols seems preferable to the risk of abandoning basic QC principles based on perceived impracticality.

Regulations may provide some guidance. CLIA '88, the CLIA Final Rule of 2003, and now CLSI’s EP23-A, Laboratory Quality Control Based on Risk Management, all require that labs develop a quality control plan (QCP) for every test implemented, that lab directors justify the plan, and that the plan be designed to have a high probability of detecting errors that exceed the stated medically allowable limit. Given the examples found in the professional literature, it is doubtful that too much QC is currently in place. For example, Harismendy et al, in a study of single nucleotide polymorphism (SNP) discovery by next-generation sequencing technologies, report false positive and false negative rates of 3% to 12% and 1% to 8%, respectively.13
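To put those error rates in perspective, a back-of-the-envelope calculation shows how quickly they translate into wrong calls at multiplex scale. The panel size below is a hypothetical illustration, not a figure from the Harismendy study, and each rate is treated as a simple per-call probability:

```python
def expected_errors(n_calls, fp_rate, fn_rate):
    """Rough expected counts of false positive and false negative
    variant calls, treating each rate as a per-call probability."""
    return n_calls * fp_rate, n_calls * fn_rate

# A hypothetical run reporting 1,000 variant calls, using the low end
# of the published ranges (3% false positive, 1% false negative):
fp, fn = expected_errors(1000, 0.03, 0.01)
print(f"~{fp:.0f} false positives, ~{fn:.0f} false negatives expected")
```

Even at the optimistic end of the published ranges, dozens of erroneous calls per run can be expected—the scale of error a QCP must be designed to detect.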

Complexity and history paint a daunting picture. An effective QCP is possible, but will such a process be too complicated to implement? Traditional practice would require evaluating the test system for the variability and probability of failure of the thousands of components and variables present in multiplex arrays. Incorporating patient data into the analyses by identifying informative patterns can also help assure test quality.2,14
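One widely used way to mine patient data for QC signals is a moving average of patient results, in the spirit of Bull's algorithm from hematology. The sketch below is a minimal, hypothetical version: the window size, target, and action limit are illustrative parameters, not values from any cited protocol:

```python
from collections import deque

def moving_average_monitor(results, window=20, target=100.0, limit=5.0):
    """Return indices where the moving average of the last `window`
    patient results drifts beyond target +/- limit, suggesting a
    systematic shift worth investigating."""
    buf = deque(maxlen=window)
    flags = []
    for i, value in enumerate(results):
        buf.append(value)
        if len(buf) == window and abs(sum(buf) / window - target) > limit:
            flags.append(i)
    return flags

# Stable results raise no flags; a sustained upward shift is flagged
# once shifted values dominate the window.
stable = [100.0] * 40
shifted = [100.0] * 20 + [110.0] * 20
print(moving_average_monitor(stable))
print(moving_average_monitor(shifted))
```

Because patient populations vary, a real implementation would derive the target and limits from the lab's own validated patient distribution rather than fixed constants.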

In an attempt to simplify traditional QC practice, CMS instituted an Equivalent Quality Control (EQC) option in the CLIA Final Rule of 2003 that relies on controls built into some tests by the manufacturer. Just as the first molecular platforms with built-in controls hit the market, CMS announced that EQC will be replaced (date pending) by a new Individualized Quality Control Plan (IQCP) built around CLSI EP23.15 CLSI EP2316 promotes the principle of each lab developing its own IQCP based on a risk assessment of each test in its laboratory. The degree of risk, and therefore the QC required, is determined by committee assessment using a tool such as Failure Mode and Effects Analysis (FMEA). IQCP details are not yet known, so it is unclear how this new rule will be applied to molecular tests.
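Failure Mode and Effects Analysis, the risk tool EP23 points to, typically scores each potential failure mode for severity, likelihood of occurrence, and difficulty of detection, then ranks them by the product of the three scores, the risk priority number (RPN). A minimal sketch; the failure modes and scores here are hypothetical illustrations, not from any published IQCP:

```python
# Each entry: (failure mode, severity, occurrence, detectability),
# scored 1-10, where 10 is most severe, most frequent, or hardest to
# detect. All values are invented for illustration.
failure_modes = [
    ("reagent lot degradation",       8, 4, 6),
    ("sample cross-contamination",    9, 3, 7),
    ("thermal cycler drift",          6, 2, 4),
    ("pipetting/transcription error", 7, 5, 3),
]

def rpn(mode):
    """Risk priority number: severity x occurrence x detectability."""
    _, severity, occurrence, detectability = mode
    return severity * occurrence * detectability

# Highest RPN first: these are the failure modes the QCP should
# monitor most aggressively.
for mode in sorted(failure_modes, key=rpn, reverse=True):
    print(f"{mode[0]:32s} RPN={rpn(mode)}")
```

The ranking, not the absolute RPN values, is what drives the QCP: high-RPN failure modes get dedicated controls or more frequent monitoring, while low-RPN modes may be covered by periodic checks.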

Thought leaders in molecular diagnostics are concerned about how to assure the accuracy of the latest molecular tests. One example of a well-thought-out approach is the article “Quality Assurance of RNA Expression Profiling in Clinical Laboratories” by Tang et al.17 Although the article focuses on RNA profiling using large-scale arrays, its principles for quality assurance can be applied to multiplex DNA sequencing and DNA array testing. While recognizing that including a positive and a negative control in every run for each of a microarray’s dozens to thousands of target analytes is not possible, the authors also state that “Quality control is among the most important of quality assurance measures.”18 They promote running an exogenous control alongside patient specimens to evaluate assay performance in a general manner. A separate exogenous control, representing each of the main outcome groups, could be included periodically throughout the run at a frequency dependent on the number of patient samples and the medical impact of an erroneous result. Keep in mind that, in the event of a QC failure, good QC practice requires investigation of test results back to the last correct QC, and possibly repeat testing. The authors suggest that when the same control material is used in multiple runs, selected numeric results can be tracked over time using Levey-Jennings charts to visualize drift or shift. Use of multiple controls for a multiplex test increases the probability that one or more will frequently fall outside traditional Levey-Jennings control limits, but rather than abandon the multiple-control approach, this group notes that “This high failure rate emphasizes the benefits of a quality control strategy that includes multiple controls for the many critical aspects of the assay and synthesizes multiple data points to interpret overall success or failure of an assay.”
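The high out-of-limit rate the authors describe is exactly what probability predicts: even when every control is in control, the chance that at least one of N independent controls falls outside ±2SD limits by chance alone grows rapidly with N. A sketch under the simplifying assumption of independent, normally distributed controls:

```python
def p_any_outside(n_controls, p_single=0.0455):
    """Probability that at least one of n independent, in-control
    controls falls outside +/-2SD limits purely by chance
    (p_single ~ 4.55% for a normal distribution)."""
    return 1.0 - (1.0 - p_single) ** n_controls

for n in (1, 5, 20, 50):
    print(f"{n:3d} controls: {p_any_outside(n):.1%} chance of a false flag")
```

With 20 controls the chance of at least one false flag approaches 60%, which is why the authors argue for synthesizing multiple data points to judge overall run success rather than rejecting a run on any single out-of-limit control.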

The point taken is that the failure of one control does not necessarily invalidate the entire assay; however, comprehensive monitoring allows data analysis to be done in the context of a thorough understanding of the technical strengths and weaknesses of the test system. The authors have chosen to take the established path toward ensuring the quality of their test. Every step has been evaluated for its failure potential in a manner similar to that recommended by Krouwer et al19,20 and proven quality management principles2 applied.

The Tang paper describes the successful adaptation of traditional QC principles to minimize failure risk in a highly complex test protocol. However, in light of increasing pressure on healthcare budgets and limited personnel, it is understandable that the laboratory may ask if equivalent assurance of quality can be provided with a simpler approach. In view of the risks and costs of delivering poor quality results, careful consideration should be given to the possible effects from QC shortcuts and simplifications.

Clinical laboratory principles such as using statistical process control with QC analysis to quantify and prevent errors and improve processes were first developed and applied in industrial settings. Toyota built a reputation for rigorous quality control and high quality products, yet recent quality-related issues prompted one expert observer to conclude, “One primary reason for the [quality] crisis is that this overwhelming complexity exceeded Toyota’s organizational capability.”21 Toyota management had become complacent, but in the face of flagging quality, the company’s response has been to “meet the challenge of the complexity problem by mobilizing all its employees toward advancements in such areas as design rationalization, refinement of electronic control systems and digital engineering, and quality control.”21 A parallel can be drawn to the laboratory medicine industry today with the advent of molecular testing.

The increased technical complexity of molecular testing does mean that healthcare manufacturers and laboratories are treading new ground in failure mode analysis and quality control plans. Determination of what to monitor, how to monitor, and how often to monitor remains to be worked out. One aspect of laboratory testing that hasn’t changed is the need for constant evaluation of quality. Laboratorians should be proactive and sometimes creative in developing QC plans for the emerging complex molecular tests so we can all sleep at night knowing the best patient care possible continues to be provided.

References

  1. Genetic Testing Reference Materials Coordination Program (GeT-RM). Atlanta, GA. Centers for Disease Control and Prevention, U.S. Dept. of Health and Human Services. http://www.phppo.cdc.gov/dls/genetics/qcmaterials/default.aspx. Accessed April 5, 2012.
  2. Westgard JO. Basic QC Practices. 3rd ed. Madison, WI: Westgard QC, Inc; 2010: 362.
  3. Clinical and Laboratory Standards Institute. Quantitative Methods for Infectious Diseases; Approved Guideline, Second ed. Wayne, PA. 2010. CLSI document MM06-A2, Vol. 30.
  4. Burd EM. Validation of laboratory-developed molecular assays for infectious diseases. Clin Microbiol Rev. 2010;23(3):550-576.
  5. Maine Molecular Quality Controls, Inc. Control is vital. http://www.mmqci.com/index.php. Accessed April 5, 2012.
  6. SeraCare. Quality is in our blood. http://www.seracare.com/Home/tabid/36/language/en-US/Default.aspx. Accessed April 5, 2012.
  7. NCBI. GeneTests database. http://www.ncbi.nlm.nih.gov/sites/GeneTests/?db=GeneTests. Accessed April 5, 2012.
  8. Chen B, et al. Good laboratory practices for molecular genetic testing for heritable diseases and conditions. MMWR. 2009;58(RR-6):1-37; quiz CE-1-4.
  9. Secretary’s Advisory Committee on Genetics, Health, and Society. U.S. system of oversight of genetic testing: a response to the charge of the Secretary of Health and Human Services. Department of Health & Human Services. 2008.
  10. Giese MC, Highsmith WE. Laboratory-developed tests: validation strategies to meet regulatory requirements. J. Gordon, ed. AACC National Meeting and Expo: Anaheim, CA. 2010.
  11. Gullapalli RR, Carter AB, KJ A. Automated data analysis of real-time PCR data (Poster), in Association for Molecular Pathology Annual Meeting 2008. Grapevine, TX. J Mol Diagn. 613.
  12. Liang SL, et al. Application of traditional clinical pathology quality control techniques to molecular pathology. J Mol Diagn. 2008;10(2):142-146.
  13. Harismendy O, et al. Evaluation of next generation sequencing platforms for population targeted sequencing studies. Genome Biol. 2009;10(3):R32.
  14. Westgard S. Failure modes of risk assessment. http://www.westgard.com/failure-modes-risk-assessment.htm. Accessed April 6, 2012.
  15. Westgard S. Pop Quiz: What’s an IQCP? 2012. http://james.westgard.com/the_westgard_rules/2012/03/pop-quiz-whats-an-iqcp.html. Accessed April 6, 2012.
  16. Clinical and Laboratory Standards Institute. Laboratory Quality Control Based on Risk Management. Wayne, PA. 2010. CLSI document EP23-A, Vol. 31.
  17. Tang W, et al. Quality assurance of RNA expression profiling in clinical laboratories. J Mol Diagn. 2012;14(1):1-11.
  18. Clinical and Laboratory Standards Institute. Development and use of quality indicators for process improvement and monitoring of laboratory quality; approved guideline. Wayne, PA. 2010. CLSI document GP35-A, Vol. 30.
  19. Krouwer JS. An improved failure mode effects analysis for hospitals. Arch Pathol Lab Med. 2004;128(6):663-667.
  20. Clinical and Laboratory Standards Institute. Estimation of total analytical error for clinical laboratory methods; approved guideline. Wayne, PA. 2003. CLSI document EP21-A, Vol. 23.
  21. Shook J. Toyota troubles: fighting the demons of complexity: An interview with Professor Takahiro Fujimoto. http://www.lean.org/shook/displayobject.cfm?o=1395. Accessed April 6, 2012.

Clark Rundell, PhD, DABCC and Joan Gordon, MT (ASCP) have more than 30 years of experience in laboratory medicine. In 2000, they co-founded Maine Molecular Quality Controls, Inc. (MMQCI) to focus solely on providing the community with quality controls for molecular diagnostics.