Molecular biology has its roots in the biochemistry laboratories of the 1960s and 1970s; PCR was developed in the mid-1980s and by 1990 was in widespread use in academic laboratory settings. Common clinical use of molecular diagnostic techniques lagged behind academic applications, and in many cases still does today.
There are a number of reasons for this, but a major one was (and remains) the complexity inherent in performing a molecular diagnostic (MDx) test to clinical standards of result reliability. Expertise, equipment, and infrastructure are all needed to handle the steps of nucleic acid extraction, assay setup, and results interpretation within a context of contamination control, to say nothing of the less MDx-specific laboratory demands of sample tracking, results reporting, and quality assurance. While the situation has improved immensely from the days of laborious manual crude-lysate sample preparations followed by slow thermocycling on a water-cooled instrument, agarose gel electrophoresis, and squinting at a UV light table for dim, fuzzy bands in order to call a given molecular marker positive or negative, molecular testing still mostly requires infrastructure and specialized technical expertise not available at smaller medical facilities. Note the “mostly” in that last sentence, however; the topic of this installment of The Primer is the instruments designed to be the exceptions to that statement.
The future is…now?
Consider the list of steps described above: extraction, amplification, detection, and interpretation/reporting, all while maintaining contamination control. Now imagine an instrument capable of performing all of these steps automatically, with little or no user interaction from the time raw sample material is put in until a result, validated with appropriate automated internal controls, is pushed out to the laboratory information management system (LIMS). The appeal of such a device is obvious, and it inspires futuristic visions of going to your local clinic or individual doctor’s office to have complex molecular tests done on the spot. Gone would be the days of delays (or worse fates) from having to transport a specimen to some remote site specially equipped for its analysis and interpretation; the potential speed of MDx would be fully realized, and a distributed “near point of care” model for many kinds of diagnostics would become practical.
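For readers who like to see the logic spelled out, here is a minimal, purely illustrative sketch in Python of the stages such an instrument automates, and of the principle that a result is reported only when its internal control behaves as expected. Every function name, signal value, and threshold below is hypothetical and does not represent any vendor’s software.

```python
# Illustrative only: a toy model of the stages a sample-to-answer
# instrument automates. All names, signal values, and thresholds are
# hypothetical and do not correspond to any vendor's software.
from dataclasses import dataclass


@dataclass
class Result:
    target_detected: bool
    internal_control_ok: bool

    @property
    def call(self) -> str:
        # Report a result only if the internal control behaved as expected;
        # otherwise flag the run as invalid.
        if not self.internal_control_ok:
            return "INVALID - repeat or refer for conventional testing"
        return "POSITIVE" if self.target_detected else "NEGATIVE"


def extract(raw_sample: str) -> str:
    # Stand-in for on-board lysis and nucleic acid purification.
    return f"nucleic acid from {raw_sample}"


def amplify_and_detect(nucleic_acid: str) -> dict:
    # Stand-in for amplification with fluorescence read-out on
    # target and internal-control channels.
    return {"target": 0.9, "internal_control": 0.8}


def run_cartridge(raw_sample: str) -> Result:
    """Simulate the hands-off workflow: extraction, amplification,
    detection, and interpretation all happen inside the sealed consumable."""
    nucleic_acid = extract(raw_sample)
    signals = amplify_and_detect(nucleic_acid)
    return Result(
        target_detected=signals["target"] > 0.5,
        internal_control_ok=signals["internal_control"] > 0.5,
    )


if __name__ == "__main__":
    result = run_cartridge("nasopharyngeal swab")
    print(result.call)  # a real instrument would push this to the LIMS
```

The point of the sketch is simply that every step, including the decision not to report an uncontrolled result, happens without a laboratorian’s involvement.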
Well, the vision is not so futuristic, because such a brave new world already exists. Numerous instruments of this type, known as “sample-to-answer” MDx devices, are currently on the market, and several have well-established, regulatory-approved assays. A still larger number of such devices are in various stages of development.
On the market
Descriptions of six currently approved systems will help illustrate the general properties of this class of instrument (Note: they are selected for illustrative purposes only, and their inclusion here should not be considered an endorsement; neither should the omission of other sample-to-answer instruments be considered a reflection on them).
The Cepheid GeneXpert system is based around small disposable cartridges, barcoded and preloaded with reagents for particular assays. Raw sample is put into the cartridge, and the cartridge is placed in the instrument, which reads the barcode and initiates an assay-specific protocol. Extraction proceeds by chemical (and, as needed, ultrasonic) means, followed by qPCR with fluorescent monitoring of multiple channels for target and control signals. The computer system interprets these, and in an hour or so from sample addition, with no intervention by a laboratorian, a simple results call is generated. The system also verifies that the test cartridge is within its expiry date, and it can interface with the LIMS. The system is available in a range of processor sizes to handle different numbers of random-access cartridges at any given time, and with a wide range of available assays. Released to the market in 2004, it is the pioneer among sample-to-answer instruments.
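As a purely hypothetical illustration of the kind of pre-run checks described above, barcode-driven protocol selection and expiry verification, consider the following sketch; the barcode prefixes, protocol fields, and channel names are invented for the example and do not reflect Cepheid’s actual software or data formats.

```python
# Illustrative only: barcode-driven protocol selection and an expiry check
# before a run is allowed to start. The barcode prefixes, protocol fields,
# and channel names are invented and do not reflect any vendor's software.
from datetime import date

# Hypothetical registry of assay protocols keyed by a barcode prefix.
ASSAY_PROTOCOLS = {
    "FLU": {"cycles": 45, "channels": ["FluA", "FluB", "internal_control"]},
    "MRSA": {"cycles": 40, "channels": ["mecA", "internal_control"]},
}


def start_run(barcode_prefix: str, expiry: date, today: date) -> dict:
    """Refuse to start if the cartridge is expired or unrecognized."""
    if today > expiry:
        raise ValueError("Cartridge past expiry date; run blocked")
    if barcode_prefix not in ASSAY_PROTOCOLS:
        raise ValueError(f"Unrecognized assay barcode: {barcode_prefix}")
    # In a real instrument, the returned protocol would drive extraction,
    # cycling, and the channels monitored during qPCR.
    return ASSAY_PROTOCOLS[barcode_prefix]


if __name__ == "__main__":
    protocol = start_run("FLU", expiry=date(2026, 1, 1), today=date(2025, 6, 1))
    print(protocol["channels"])
```

In practice such checks, like everything else in these systems, run automatically once the cartridge is loaded.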
The BioFire FilmArray system is a more recent entry to the field. Rather than a disposable cartridge, this platform uses a disposable foil pouch with embedded “blisters” of reagents, and each processor can handle only one pouch at a time; again, however, the only user input is adding sample to the pouch and placing the pouch in the instrument. Rollers within the instrument sequentially move the reaction from one “blister” to the next, with results available in about an hour. For detection, the FilmArray system uses an array within the pouch, scanned optically from outside, to achieve levels of multiplexing (~20 targets) not possible with real-time approaches. This level of multiplexing is particularly well suited to clinical presentations where multiple etiological agents might be suspected, such as acute respiratory infection, sepsis, or gastrointestinal distress, and assays for these applications are among those available on this platform.
Focus Diagnostics offers the 3M Integrated Cycler, a small device that accepts rotary sample discs and can run either classical real-time PCR-based assays, with separate prior extraction, for up to 96 samples, or sample-to-answer reactions for up to eight samples per disc. (The difference arises because extraction microfluidics take up space within the sample-to-answer discs that is available for additional samples in the thermocycle-only discs.) FDA-cleared tests include one for influenza A, influenza B, and RSV, which works directly from a nasal swab to results in about an hour.
Nanosphere’s Verigene system has separate processor and reader modules, which run single cassettes through all steps of extraction, amplification, and detection by hybridization. While nucleic acid amplification is possible with this device, in some assays this step is omitted and sufficient sensitivity is achieved with direct target hybridization. That sensitivity arises from the labeling method, which, unlike the fluorescence used in many other systems, relies on optical detection of light scattering from derivatized gold nanoparticles. Results are available in the same timeframe (1 to 2 hours) as the other example systems, and current FDA-cleared offerings include tests for bloodstream, gastrointestinal, and respiratory infections.
The iQuum (recently purchased by Roche) Liat “Lab In A Tube” system utilizes small disposable linear tubes, each containing a “blister pack”-like series of reagents. Sample is added to the top of the tube, and the tube is inserted into a diminutive single-sample processor, which again uses mechanical rollers outside the tube walls to move the sample sequentially from a lysis step at the top, through nucleic acid purification steps, and finally to a real-time PCR chamber for cycling and conventional real-time analysis (complete with display of the amplification curve on a screen built into the processor). iQuum’s device and assay for detection of the influenza A H1N1 (2009) “swine flu” virus received clearance in 2011.
The BD MAX system is appreciably larger than the other examples given here, as it is based around a liquid-handling robot. As with the other systems, preloaded reagent cartridges and unprocessed sample are placed in the system; the robot then uses three-axis motion of its pipettor heads across a reagent tray with dedicated “stations” to carry out extraction, purification, and subsequent real-time PCR and detection. Unlike the other examples described above, this system can also support user-defined protocols, such as lab-developed tests (LDTs).
Beyond these examples, a number of similar systems are in various stages of development as this issue of MLO goes to press, and some may have received FDA clearance by the time you read these words. The rapid evolution of the sample-to-answer space is evidence of a widespread expectation that this technology will become increasingly popular.
Pros and cons
The use of these instruments offers a number of advantages. For a start, they can be operated by staff with less technical training than a traditional molecular laboratory technologist requires. They bypass many of the specialized infrastructure requirements of a traditional molecular laboratory, particularly those pertaining to extraction and assay-setup areas and equipment and to many aspects of contamination control. In general, they also require only a few minutes of hands-on time per specimen, freeing lab staff for other duties while the instrument handles the whole process. All of these attributes make sample-to-answer instruments attractive for applications “near point of care” (POC), where molecular testing facilities and dedicated staff aren’t readily available but samples are, and where 1-to-2-hour turnaround times on results can be clinically beneficial.
There are downsides to these devices too, however. One, unavoidably, is cost, both per instrument and per test. Costs vary greatly by instrument and by specific test, but in most cases they are appreciably higher on a per-test basis than running the equivalent molecular test in a more traditional assay system. This cost of convenience is not surprising, but it is a drawback where sample throughput is high.
Another disadvantage is that these systems take what may be a limited, precious sample and consume it entirely within the device. This can be problematic when a negative result (or any result requiring follow-up) leaves no material for further testing, whereas with a more traditional full-service molecular laboratory approach, extracted nucleic acid from the sample is likely to remain available for additional testing as desired.
Finally, the simplicity and ease of automated results-calling by these devices comes at a price: it removes the trained eyes and judgment of an experienced clinical laboratory scientist. Readers with qPCR experience, for example, have probably seen samples with slow, creeping, clearly aberrant amplification curves that manage to cross the instrument’s Ct threshold and are called “positive” yet are likely not true positives; or the opposite, the sample that shows signs of positivity at the very end of the run but does not quite reach the cutoff. While both of these situations are rare, and their impact on sensitivity and specificity is inherently addressed in instrument/assay validation and performance claims, they represent the sorts of cases where a full-service molecular laboratory may be able to catch uncommon specimen behavior and perform alternate or repeat testing.
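To illustrate the point, the short sketch below applies a naive fixed-threshold Ct call to two synthetic curves: an exponential “true positive” and a slow, linear “creeper.” The threshold, cutoff, and curve shapes are all invented for this example; real instruments use considerably more sophisticated curve analysis.

```python
# Illustrative only: a naive fixed-threshold Ct call on two synthetic
# amplification curves. Threshold, cutoff, and curve shapes are invented
# to show why an automated call can differ from an experienced reviewer.

THRESHOLD = 100.0  # hypothetical fluorescence threshold
CUTOFF_CT = 40     # hypothetical assay cutoff cycle


def naive_ct(curve):
    """Return the first cycle at which fluorescence crosses the threshold,
    or None if it never does."""
    for cycle, fluorescence in enumerate(curve, start=1):
        if fluorescence >= THRESHOLD:
            return cycle
    return None


# An exponential "true positive" versus a slow, linear "creeper" that
# drifts over the threshold late in the run without exponential shape.
true_positive = [1.9 ** c for c in range(1, 46)]
slow_creeper = [3.0 * c for c in range(1, 46)]

for name, curve in [("true positive", true_positive), ("slow creeper", slow_creeper)]:
    ct = naive_ct(curve)
    call = "positive" if ct is not None and ct <= CUTOFF_CT else "negative"
    print(f"{name}: Ct = {ct}, automated call = {call}")
```

Both curves cross the threshold before the cutoff, so both are reported as positive, even though a trained reviewer would likely flag the second for repeat or alternate testing.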
The state of the art
The overall picture that emerges is that sample-to-answer instruments are currently best suited to clinical venues not readily served by larger traditional molecular laboratory core facilities; however, they also offer potential uses within a full molecular laboratory. For example, they can provide capacity to handle rush “stat” specimens for particular tests without significantly disturbing routine lab workflow, or allow extended testing hours for labs that are not staffed for molecular work 24/7. As test menus on this class of device expand, their use in these applications will also increase and simultaneously help justify the instrument costs.
This potential for application in lower-resource lab and near-POC settings, as well as for cooperative integration with existing full molecular laboratories, makes sample-to-answer instruments an increasingly popular choice. As more instruments and more tests on each platform become available, their popularity is only likely to grow.