Researchers have shown that an automated diagnostic method that pairs cutting-edge ultrasound techniques with artificial intelligence can accurately diagnose thyroid cancer, a disease with more than 40,000 new cases every year.
The method, called high-definition microvasculature imaging, or HDMI, noninvasively captures images of the tiny vessels within tumors and automatically classifies the masses based on features of those vessels. Researchers at the Mayo Clinic College of Medicine and Science, who developed the technique, tested it on 92 patients with thyroid tumors and found that it could determine whether the growths were cancerous with 89% accuracy. In a study published in the journal Cancers, the authors suggest that HDMI could help resolve a long-standing challenge in the clinical assessment of thyroid tumors.
Researchers have shown that adding chemicals called contrast agents, which are easily visualized and routinely used in other medical imaging procedures, allows ultrasound to reveal the details of tumor microvasculature. But these substances must be injected into patients and sometimes cause unwanted side effects.
While newer ultrasound techniques can produce clearer images of thyroid nodules, physicians must still interpret those images subjectively.
Azra Alizad, M.D., along with colleague Mostafa Fatemi, a biomedical engineering professor at Mayo Clinic, sought to develop a low-cost, noninvasive imaging approach that yields measurable results and minimizes errors. To that end, they created HDMI, in which a form of artificial intelligence called machine learning evaluates high-resolution images of tumor microvasculature.
The technique had previously shown promise in accurately classifying breast tumors. In the new study, the authors tested HDMI's mettle in the thyroid by evaluating tumors in 92 patients.
The researchers imaged the tumors with HDMI and measured a dozen features describing the size and shape of the microvasculature in the images, including vessel density and the number of branching points.
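The article does not list the individual measurements, but a minimal sketch of how such per-tumor vessel features might be organized for analysis could look like the following. Every feature name here (vessel density, branch points, and so on) is a hypothetical placeholder, not the study's actual feature set.

```python
import numpy as np

# Hypothetical example: one fixed-order feature vector per imaged tumor.
# The feature names below are illustrative stand-ins, not the study's measurements.
FEATURE_NAMES = [
    "vessel_density",        # fraction of the image occupied by vessels
    "num_branch_points",     # count of vessel branching points
    "mean_vessel_diameter",  # average vessel thickness
    "mean_tortuosity",       # how much the vessels twist and turn
    # ...up to a dozen size- and shape-related measurements
]

def tumor_feature_vector(measurements: dict) -> np.ndarray:
    """Pack one tumor's measurements into a fixed-order feature vector."""
    return np.array([measurements[name] for name in FEATURE_NAMES])

# Example record for a single tumor image (values are made up).
example = tumor_feature_vector({
    "vessel_density": 0.12,
    "num_branch_points": 34,
    "mean_vessel_diameter": 0.08,
    "mean_tortuosity": 1.9,
})
```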
The patients in the study, with their physicians’ input, all elected to have their tumors biopsied to determine malignancy status. Those with tumors that the procedure indicated were cancerous then underwent surgery to have the masses removed.
To teach their machine learning algorithms how to judge whether a feature pointed toward a benign or a cancerous tumor, the researchers fed them 70% of the imaging data from the patient tumors, along with each tumor's malignancy status, essentially allowing the algorithms to study with an answer key in hand.
Through trial and error, the algorithms built predictive models, which the study's authors then put to the test, using them to determine the status of the tumors in the remaining 30% of the data.
HDMI’s classifications were accurate 89% of the time based on the clinical assessments of the biopsies and surgeries.
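As a rough illustration of this train-and-evaluate workflow, the sketch below splits a labeled feature table 70/30, fits a classifier on the larger portion, and scores its accuracy on the held-out tumors. The use of scikit-learn and a support-vector classifier is an assumption made for illustration; the article does not say which algorithms or software the researchers used, and the data here are randomly generated stand-ins, not the study's measurements.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Stand-in data: 92 tumors x 12 vessel features, with labels 0 = benign, 1 = malignant.
# Real inputs would be the HDMI-derived measurements and the biopsy/surgery labels.
X = rng.normal(size=(92, 12))
y = rng.integers(0, 2, size=92)

# Hold out 30% of the tumors; train only on the remaining 70% (the "answer key" set).
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)

# A support-vector classifier is one plausible choice; the article doesn't name the model.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
model.fit(X_train, y_train)

# Compare predictions on the held-out tumors against their clinical labels.
predictions = model.predict(X_test)
print(f"Held-out accuracy: {accuracy_score(y_test, predictions):.0%}")
```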