Clinical laboratories are always striving to find better ways to test for analytes relevant to the prevention or diagnosis of disease. Today, many tests performed on single analytes provide important information upon which clinical decisions can be made. What is not yet clear in the clinical diagnostics field, however, is the impact that today’s “omics research” will have on the daily work of the clinical lab five to ten years from now.
Ultimately, biomarkers that are discovered and validated through omics research are expected to help in the following ways:
- Identify patients at risk for disease.
- Facilitate prevention and early detection.
- Allow timely management of disease.
- Evaluate the effectiveness of new drug therapies.
- Monitor progression and response to treatment.
The biomedical researchers who are using proteomics, metabolomics, and lipidomics (or even combining these in a “cross-omics” approach) to advance biomarker research believe that the complex diseases they are studying will often require multiple biomarkers to diagnose or stratify patients effectively. That could mean we are moving toward a future focused not solely on individual molecules, as many in the medical community think of biomarkers today, but increasingly on the patterns of protein expression and/or lipid and metabolite abundance associated with disease stage and outcome.
Biomarkers: a changing concept
Over the next decade, the common concept of a “biomarker” is likely to transition from a single analyte to something that involves a panel of three or more analytes, such as proteins. What could also become increasingly important is the relationship between proteins and their related lipids and metabolites, as the integrated network likely plays a key role in understanding the disease stage. There will be a need for definitive multi-analyte tests that capture and interpret all this information.
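To make the idea of a multi-analyte test concrete, one common way to "capture and interpret" a panel is to collapse the individual measurements into a single probability-like score. The sketch below is purely illustrative: the analyte names, weights, and intercept are invented for the example, not real clinical markers or validated coefficients.

```python
import math

# Hypothetical panel: these analyte names and logistic-model weights are
# invented for illustration; a real panel would use validated coefficients.
PANEL_WEIGHTS = {
    "protein_A": 1.8,      # e.g., relative abundance on a log2 scale
    "lipid_B": -0.9,
    "metabolite_C": 1.2,
}
INTERCEPT = -2.5

def panel_risk_score(measurements):
    """Combine several analyte measurements into one probability-like score:
    a weighted sum passed through a logistic (sigmoid) function."""
    z = INTERCEPT + sum(PANEL_WEIGHTS[name] * value
                        for name, value in measurements.items())
    return 1.0 / (1.0 + math.exp(-z))

# One hypothetical patient sample measured for all three analytes.
sample = {"protein_A": 2.1, "lipid_B": 0.4, "metabolite_C": 1.0}
score = panel_risk_score(sample)
print(round(score, 3))
```

The point of the sketch is that the clinical result is a property of the pattern across analytes, not of any single molecule: changing any one measurement shifts the combined score.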
Proteins are important biomarker candidates, as they are often the effectors of disease changes and, therefore, often the targets of drug therapies. Another way biomarkers may be identified for routine clinical use in the future is by finding a particular post-translational modification (PTM) of a protein. For example, phosphorylation is a PTM that serves as a common signaling “flag” indicating the turning-on or turning-off of protein activity within a signaling network. This represents a more subtle way of thinking about “biomarkers”; it isn’t just the presence or absence of a protein but, more specifically, the activation state that may be the key to understanding the clinical question.
From the research lab…
We are truly on the brink of a revolution in understanding biology. Granted, it will take time to play out as there is much complexity that still needs to be worked through. This path that the biomedical research community is on—the shift from looking at a problem with a narrow lens to analyzing the problem very broadly with omics technologies—is not an easy one, but it is one that is well worth traveling.
The field of mass spectrometry is positioned to have a very large impact in omics research, as the technology is uniquely suited to both global, unbiased analysis and sensitive, specific detection and measurement. In the future, thanks to breakthroughs with large-scale studies happening today (primarily in academic research institutions), we will have tests using protein biomarker panels that provide more actionable information than ever before. Indeed, many large clinical labs are already working to bring on more protein-based tests, and many of these will be done using mass spectrometry.
While all omics research is important to understanding biology, the proteomics field has received the most attention during the last decade. In the post-genomic era, proteomics was expected to yield rapid returns; however, the complexity proved much greater than expected. The promise of proteomics was never an illusion; rather, the technology it required wasn’t fully developed yet.
In the last few years, the technology has evolved to the point that researchers with clinical perspectives can now reproducibly analyze large proteomics sample sets, approaching the size that is needed for validation of a biomarker for clinical use.
Figure 1 represents a generalized workflow for biomarker research. Omics researchers today more clearly understand what it takes to go from discovery to clinical utility, and they are gearing up to do more large-scale verification/validation studies. We will start realizing the potential of protein biomarkers of disease as more of these proteomics pipelines deliver results for the various diseases under investigation.
Figure 1. Biomarker research pipeline showing the three phases for advancing a biomarker candidate, ranging from discovery to verification on the path toward clinical utility.
…through large-scale studies
The challenge ahead for researchers will be to perform large-scale verification/validation studies with enough sample numbers to prove that these biomarkers are diagnostic and will yield clinically valuable information. Thousands of samples are needed for solid validation of the markers before we can even think about moving them into a clinical setting.
One example of this type of scalable research is being done by Jenny Van Eyk, PhD, a researcher at Johns Hopkins University. She is an innovator seeking to identify and validate new cardiac biomarkers for potential use in the clinic. Focused on advancing heart and vascular disease research, she is analyzing thousands of samples in her validation program to assess which protein targets could have the most clinical utility. At an industry meeting last year, Dr. Van Eyk also detailed how automated workflows for targeted quantitative proteomics have accelerated her work, both by simplifying and automating sample preparation and by multiplexing her LC-MS workflow.
The entire industry for proteomics and cross-omics research in pursuit of biomarkers has matured. There has been an explosion of larger-scale collaborations as well as more bio-banking initiatives, which are making an enormous difference. Moreover, the quantitative aspects of mass spectrometry for proteins have finally come of age. Quantitative data on proteins and peptides can now be obtained reproducibly across large studies. There needs to be confidence in the results, and it’s happening today at a level never before achieved. (Targeted quantitative proteomics was named the 2012 Nature Method of the Year, highlighting the progress the field has made.)
A recent innovation in mass spectrometry workflows is data-independent acquisition, a technique that provides quantitative accuracy with high multiplexing and high detection reproducibility in proteomics verification studies, accelerating biomarker candidates through the research pipeline. Reproducibility of measurement is just as critical to the verification/validation of a marker panel as detection sensitivity is to finding lower-abundance markers.
…to the clinical lab?
As each of these enabling technologies comes online in the protein biomarker research pipeline, it accelerates the critical verification/validation work being done in clinical research. It is important that validated protein markers emerge from these efforts, so clinical labs can focus on assay development rather than biomarker validation.
Clinical laboratories are already beginning to shift to mass spectrometry. Assays complicated by interferences and high sample complexity are moving to LC/MS/MS (liquid chromatography / tandem mass spectrometry). Thyroglobulin, a known marker of thyroid cancer recurrence, is a good example: immunoassays for it are very sensitive but can have specificity problems due to auto-antibodies and other confounding factors. The assay has recently been translated to a mass spectrometry-based method and is being validated as a laboratory-developed test (LDT) in some of the larger clinical labs.
Omics and personalized medicine
While advances in genomics over the past decade have been impressive, a sequenced genome tells only a small part of the story. The bigger, more complex story is at the systems level. Genomics can provide valuable information about predisposition to disease, for example, as it provides the “blueprint.” However, out on the front lines, the action is happening with proteins, lipids, and metabolites.
Another interesting aspect of the recent trend in using data-independent acquisition strategies is that this provides the ability to go back and re-interrogate samples for different information, without needing to run them through the analyzer again. It can create a “digital record” of the proteomics sample where all the qualitative and quantitative information about the detectable species is stored, allowing researchers to re-interrogate the information over and over again as new hypotheses emerge.
Imagine a scenario in the future in the age of personalized medicine. Patients would have their blood taken at regular intervals. Samples would be analyzed using a data-independent technique and the data archived. Information on wellness markers could be mined from the data at that time. Then, 10 years later, at the indication that a patient has a specific disease, a clinician could go back into the archived data and look for specific disease markers to see the time progression or establish a baseline. This retrospective information would provide remarkable, clinically useful insight.
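The archive-then-re-query idea described above can be sketched in a few lines. Everything here is hypothetical: the storage layout, patient IDs, and marker names are invented for illustration, and a real "digital record" would hold full qualitative and quantitative acquisition data rather than a simple table.

```python
# Hypothetical "digital record" archive: per-patient acquisitions stored as
# analyte -> measured abundance. All names and values are illustrative.
def archive_sample(store, patient_id, date, abundances):
    """Archive one acquisition's quantitative results for later re-use."""
    store.setdefault(patient_id, []).append(
        {"date": date, "abundances": abundances})

def reinterrogate(store, patient_id, marker):
    """Years later: pull the time course of a marker that was never the
    focus of the original analysis, without re-running any samples."""
    return [(rec["date"], rec["abundances"].get(marker))
            for rec in store[patient_id]]

store = {}
archive_sample(store, "patient_42", "2015-01-10",
               {"marker_X": 1.1, "marker_Y": 0.3})
archive_sample(store, "patient_42", "2018-06-02",
               {"marker_X": 1.9, "marker_Y": 0.4})

# A new hypothesis emerges: examine marker_X retrospectively for a baseline
# and time progression, using only the archived data.
print(reinterrogate(store, "patient_42", "marker_X"))
```

A marker absent from an archived record simply comes back as missing, which mirrors the practical limit of the approach: only species detectable at acquisition time can be mined later.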
A much more personalized approach to monitoring an individual’s wellness or specific disease progression is on the horizon. Imagine the day when, thanks to proteomics research, we can identify health changes at an earlier stage, so patients can make changes in their lives to counteract disease and lead healthier lives.
Moving forward, the next big hurdle will be relating conclusions from proteomics to other classes of biologically relevant molecules. But the speed with which biomarker research has accelerated in the last few years suggests that there may be regulatory-cleared tests using markers from omics research in the near future.