Achieving best QC practices for clinical lab water

May 20, 2020

In today’s laboratories, the availability of pure water is essential, and while consumers consider tap water to be “pure,” laboratory scientists and healthcare professionals regard it as highly contaminated. Analytical and experimental scientists are concerned with elements and compounds at concentrations in the parts per billion (ppb) range or lower, as many of these contaminants can have a negative effect on applications through their interaction with other substances, including the substance under analysis.

For most laboratory and clinical applications, the water that is used is typically purified from drinking water; however, the unique ability of water to dissolve (to some extent) virtually every chemical compound and support practically every form of life means that drinking water supplies contain many substances in solution or suspension, and additional impurities are introduced during the drinking water treatment process. Furthermore, unlike other raw materials, drinking water may vary significantly in purity – both from one geographical region to another and from season to season.

There are five classes of impurities found in natural and drinking water: suspended particles, dissolved inorganic compounds, dissolved organic compounds, microorganisms and biomolecules, and dissolved gases. The overall objective of water purification methods for scientific and medical applications is to remove drinking water impurities while minimizing additional contamination from purification system components and bacterial growth.

Water quality for clinical diagnostics

Water quality is extremely important in clinical diagnostics, and water that falls below accepted standards not only affects the chemistry of the tests but can also affect the general operation of the analyzer. This, in turn, will reduce the reliability of the test results and increase calibration times and reagent costs.

In a clinical analyzer, purified water is used for many different functions, including washing reaction cuvettes; feeding wash stations for probes and stirrer paddles; diluting reagents, samples, and detergents; filling incubator baths; and acting as the interface between syringe and sample.

Conversely, poor water quality can affect analyzer performance in several ways. For example, it can reduce the accuracy of pipetted volumes because of particles and bacteria; cause errors in photometric readings as a result of particles interfering when a water bath is used; cause contamination, carryover, and water marks during cuvette washing; cause contamination and carryover during sample and reagent probe washing; affect sample and reagent dilution, leading to errors and poor reagent stability; and reduce calibration stability and sensitivity when water is used as a zero standard (Ca, Mg, PO4, HCO3, etc.).

In addition, in immunoassay systems, bacterial byproducts (notably alkaline phosphatase) can interfere with some enzyme-based assay results. Perhaps the most important aspect of pure water for automated pathology analyzers, however, is reliability. Laboratories without the budget or space for a “duplex” system require a robust design that incorporates sub-systems which can be used in the event of an emergency or system failure.

Pure water requirements and regulations

Since purified water is required in all science-based organizations, international and national standards authorities have established water quality standards for various applications. The most relevant to the clinical analyzer market is the Clinical and Laboratory Standards Institute (CLSI) – formerly the National Committee for Clinical Laboratory Standards. Earlier guidelines recommended three main types of water (Type I–III), of which Type I was the most relevant to clinical laboratories and feeds to automated instruments. These designations have since been replaced by the terms Clinical Laboratory Reagent Water (CLRW), Special Reagent Water (SRW), and Instrument Feed Water.

When it comes to pure water regulations, in most countries’ public sectors, laboratories are advised or regulated by an accreditation body that establishes working standards and guidelines. While this is not mandatory for private-sector laboratories, the significant credibility and advantages gained have resulted in more of these laboratories registering with an accreditation body. For example, although the College of American Pathologists (CAP) is the accreditation body in the U.S., many laboratories in other countries also apply for CAP registration. CAP recommends that laboratory water should meet the CLSI CLRW grade standard as a minimum.

Clinical analyzer companies are also further regulated through organizations, such as the Food and Drug Administration (FDA). Ultimately, the analyzer companies are responsible for ensuring their chemistries are validated and that purified water of a suitable standard is used so that all results are accurate and reproducible.

Validation and trend monitoring

Validation of water purification systems is increasingly becoming mandatory, and objective evidence must be provided to confirm that a purification system meets the requirements for a specific use or application. Purified water should be validated as fit for its intended purposes, and the purity specifications should be incorporated into the water purification validation procedure. This procedure is used to document the system’s ability to deliver adequate volumes of purified water at the stated specifications, as detailed in the user requirement specification.

After validating the water as fit for its purpose, it is critical to ensure that it continues to meet the required specifications; this is achieved by measuring and documenting defined parameters at established, regular intervals. Furthermore, this approach can detect deterioration of purification components before it impacts the required water quality. Deterioration in a measured parameter, such as a change in resistivity or bacterial count, indicates the need for system maintenance or further investigation to ensure the required water specification is always met.
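As an illustration of such routine checks, the minimal sketch below compares a set of logged readings against acceptance limits in the style of commonly cited CLRW criteria (for example, resistivity above 10 MΩ·cm at 25 °C, TOC below 500 ppb, and bacterial counts below 10 CFU/mL). The exact limit values, parameter names, and function shown are assumptions for illustration only; a laboratory would substitute the limits from its own validated user requirement specification.

```python
# Minimal sketch: verify a day's water-quality readings against acceptance limits.
# The limits below follow commonly cited CLRW-style values and are assumptions;
# substitute the limits from your own validated user requirement specification.

CLRW_LIMITS = {
    "resistivity_megohm_cm": ("min", 10.0),   # resistivity at 25 °C, MΩ·cm
    "toc_ppb":               ("max", 500.0),  # total organic carbon, ppb
    "bacteria_cfu_per_ml":   ("max", 10.0),   # viable bacterial count, CFU/mL
}

def check_against_spec(readings: dict, limits: dict = CLRW_LIMITS) -> list:
    """Return a list of human-readable failures; an empty list means in spec."""
    failures = []
    for parameter, (kind, limit) in limits.items():
        value = readings.get(parameter)
        if value is None:
            failures.append(f"{parameter}: no reading recorded")
        elif kind == "min" and value < limit:
            failures.append(f"{parameter}: {value} below minimum {limit}")
        elif kind == "max" and value > limit:
            failures.append(f"{parameter}: {value} above maximum {limit}")
    return failures

# Example daily log entry (illustrative values)
today = {"resistivity_megohm_cm": 18.2, "toc_ppb": 12.0, "bacteria_cfu_per_ml": 1.0}
for failure in check_against_spec(today):
    print("OUT OF SPEC:", failure)
```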

Additionally, recording critical parameters over a defined time is essential to identify gradual changes in water quality and enable corrective measures to be taken. For example, if ion exchange cartridges are used beyond their intended life, impurities that could interfere with the test-analysis reactions can be eluted into the purified water at levels that may not register on built-in monitoring systems.
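One simple way to make such gradual changes visible is to fit a trend line to the logged readings and project when a parameter will cross its limit. The sketch below does this for resistivity using a least-squares fit; the data, spec limit, and warning window are illustrative assumptions rather than recommended values.

```python
# Minimal sketch: flag gradual deterioration in logged resistivity readings
# by fitting a straight line and projecting when the spec limit will be crossed.
# The example data, limit, and look-ahead window are illustrative assumptions.

import numpy as np

SPEC_LIMIT = 10.0        # MΩ·cm, CLRW-style minimum (assumed for illustration)
LOOKAHEAD_DAYS = 30      # warn if the limit would be crossed within this window

def days_until_limit(days, resistivity):
    """Project days until resistivity falls to SPEC_LIMIT; None if not declining."""
    slope, intercept = np.polyfit(days, resistivity, 1)  # least-squares line
    if slope >= 0:
        return None  # no downward trend detected
    return (SPEC_LIMIT - intercept) / slope - days[-1]

# Example: daily logs showing a slow decline while still comfortably in spec
days = np.arange(60)
readings = 18.2 - 0.1 * days + np.random.normal(0, 0.1, size=days.size)

remaining = days_until_limit(days, readings)
if remaining is not None and remaining < LOOKAHEAD_DAYS:
    print(f"Warning: projected to fall below {SPEC_LIMIT} MΩ·cm in ~{remaining:.0f} days")
```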

Effect of pure water requirements

Advances in analyzer technologies demand a good-quality water feed to maintain high performance and reliability. Since water is used in virtually every process on an analyzer, it is crucial that its quality is monitored and verified to ensure the integrity of test results. The integration of multiple technologies into a single analyzer to perform both chemistry and immunology applications means that higher-purity water is required for the more sensitive immunology testing.

While smaller sample and reagent volumes reduce costs, they also require higher-purity water because of the greater analytical sensitivity that smaller volumes demand.

In many lab tests, the diagnosis or extent of certain diseases is associated with the levels of specific proteins, known as biomarkers, in the blood. Examples include elevated levels of troponin, which signify myocardial injury; B-type natriuretic peptide (BNP), which indicates heart failure; AFP, which indicates hepatocellular carcinoma; CA 19-9, which is correlated with pancreatic cancer; and PSA, which is a marker for prostate cancer. These proteins generally occur at very low concentrations, such as nmol/L or pmol/L, and are detected by techniques that are extremely sensitive. Compared to traditional tests and assays, these detection methods have the advantage of reducing the number of tests that have to be performed.

However, since they are more susceptible to contaminant interference, it is crucial that the water is of the appropriate grade so that it will not contribute to this problem.
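To put such low molar concentrations into perspective, the short sketch below converts a concentration in pmol/L into a mass concentration in pg/mL, given a molar mass. The marker and molar mass used in the example are illustrative assumptions, not reference values.

```python
# Minimal sketch: convert a biomarker concentration from pmol/L to pg/mL.
# The molar mass below is an illustrative assumption, not a reference value.

def pmol_per_l_to_pg_per_ml(concentration_pmol_l: float, molar_mass_g_mol: float) -> float:
    """pmol/L multiplied by g/mol gives pg/L; divide by 1000 to express per mL."""
    return concentration_pmol_l * molar_mass_g_mol / 1000.0

# Example: a peptide marker of roughly 3,500 g/mol at 50 pmol/L
print(pmol_per_l_to_pg_per_ml(50.0, 3500.0))  # ≈ 175 pg/mL
```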

Conclusion

Along with the many uses of pure water in clinical labs, the importance of water quality can also be seen in the life science industry for a wide range of research lab applications, such as molecular biology, electrophoresis, electrophysiology, endotoxin analysis, histology, monoclonal antibody research, immunocytochemistry, radioimmunoassays (RIA), and enzyme-linked immunosorbent assays (ELISAs).

Adding to the life science need for quality control of pure water is the demand for cleaning, sanitizing, and sterilizing reusable medical equipment in healthcare, which presents yet another industry in need of guidelines and international standards as concerns grow over infection control in hospitals and the spread of increasingly resistant pathogens.

As test and research demands continue to increase, scientists and researchers around the world will look to meet these challenges with the installation of point-of-use water purification systems as reliable solutions that can deliver certainty in the quality control of pure and ultrapure water for all aspects of their work.

(This article was created with the permission of ELGA® LabWater, with information taken from their 2019 Pure Labwater Guide.) ELGA® LabWater is a leading global water purification manufacturer. For more information, visit www.elgalabwater.com.