A new AI algorithm has been trained like a radiologist to show how it diagnoses malignant and benign breast tumors
New algorithm shows if — and why — cancer patients should undergo biopsies
January 26, 2022
by John R. Fischer, Senior Reporter
A new algorithm developed at Duke University is designed not only to help indicate whether a cancer patient should undergo an invasive biopsy but also to show radiologists how it reached its conclusions.
Designed to locate and evaluate potentially cancerous lesions in mammograms, the AI solution is trained the way a radiologist would be to assess tumors, and it can show the evidence it used to reach its findings. Rather than being left to freely develop its own procedures, it was taught to assess the edges of lesions themselves, not the surrounding tissue or the way the specific equipment it is used with functions.
It was trained and validated with a data set of 1,136 images. The platform classifies a breast lesion into a specific category, and radiologists can then assess whether the AI based its decision on viable criteria. If so, the classification can be confidently included in the radiologist’s report. If not, the radiologist can override the AI and understand why it failed.
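To illustrate the review loop the researchers describe, here is a minimal, hypothetical Python sketch of an interpretable finding being accepted or overridden by a radiologist. The data structure, category names, and evidence strings are illustrative assumptions, not the Duke team's actual software.

```python
from dataclasses import dataclass

@dataclass
class AIFinding:
    """Hypothetical output of an interpretable classifier: a category plus
    the evidence (e.g., highlighted margin regions) behind it."""
    category: str        # e.g., a suspicion level assigned to the lesion
    evidence: list[str]  # human-readable description of the image regions used

def review(finding: AIFinding, evidence_is_viable: bool) -> str:
    """Radiologist either signs off on the AI's category or overrides it."""
    if evidence_is_viable:
        return f"Report includes AI classification: {finding.category}"
    # Evidence not viable: the radiologist overrides and can see why it failed
    return f"AI classification overridden; evidence rejected: {finding.evidence}"

finding = AIFinding(category="suspicious", evidence=["spiculated margin at lesion edge"])
print(review(finding, evidence_is_viable=True))
```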
Most independent algorithms for medical imaging are trained on fewer than 1,000 scans or lack demographic information. This, along with recent failures around the use of these solutions, has led many physicians to question their use in high-stakes medical decisions.
“By suggesting an interpretable diagnosis, the algorithm can help radiologists to improve their performance and consistency. Long term, we hope that this approach can avoid many benign biopsies, which would benefit not only the patients but also expedite the time-consuming processes of diagnostic workup and biopsy scheduling,” Joseph Lo, professor of radiology at Duke University School of Medicine, told HCB News.
The images were taken from 484 patients at Duke University Health System. Using them, the researchers taught the algorithm to find suspicious lesions and to ignore the healthy tissue and other irrelevant data surrounding them.
Images were labeled to teach it to focus on the edges of lesions, where potential tumors meet healthy surrounding tissue, and to compare those edges with edges in images with known cancerous and benign outcomes.
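To make that margin-comparison idea concrete, here is a minimal, hypothetical Python sketch of case-based scoring: a lesion-edge patch is reduced to a small feature vector and compared against labeled "prototype" margins from cases with known benign and malignant outcomes. The feature extractor, the similarity measure, and all names are assumptions for illustration only, not the Duke team's actual model.

```python
import numpy as np

def margin_features(edge_patch: np.ndarray) -> np.ndarray:
    """Toy feature vector for a lesion-edge patch (simple gradient statistics)."""
    gy, gx = np.gradient(edge_patch.astype(float))
    grad = np.hypot(gx, gy)
    return np.array([grad.mean(), grad.std(), edge_patch.std()])

def classify_margin(edge_patch, prototypes):
    """Score the patch by similarity to labeled prototype margins."""
    feats = margin_features(edge_patch)
    scores = {}
    for label, proto_feats in prototypes.items():
        # Similarity = inverse distance to the closest prototype with that label
        dists = [np.linalg.norm(feats - p) for p in proto_feats]
        scores[label] = 1.0 / (1.0 + min(dists))
    best = max(scores, key=scores.get)
    return best, scores  # label plus the evidence a reader could inspect

# Example with synthetic patches standing in for labeled mammogram edges
rng = np.random.default_rng(0)
prototypes = {
    "benign": [margin_features(rng.normal(size=(32, 32))) for _ in range(5)],
    "malignant": [margin_features(rng.normal(scale=3.0, size=(32, 32))) for _ in range(5)],
}
label, evidence = classify_margin(rng.normal(size=(32, 32)), prototypes)
print(label, evidence)
```

Returning the per-label similarity scores alongside the prediction mirrors the article's point: the evidence behind a classification stays visible to the radiologist rather than being hidden inside a black box.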
When assessing tumors, radiologists often look first at mass margins, the radiating lines or fuzzy edges around a lesion, as they are the best predictors of cancerous breast lesions. That is because cancerous cells replicate and expand so fast that not all of a developing tumor’s edges are easy to see in mammograms.
While the algorithm did not outperform human radiologists, it did just as well as other black-box computer models. The difference was that when the algorithm was wrong, people using it could recognize the error and see why the mistake was made.
The team is working to make the platform take into account other physical characteristics in its decision-making. One is lesion shape, which is the second feature that radiologists look for. They also are beginning to expand its use to CT imaging.
“This AI platform should be applicable to most imaging tasks, thanks to its explainability,” said co-author Dr. Fides Regina Schwartz, research fellow in radiology at Duke. “There are many valuable targets but ideally, we would next apply it to other entities that have clear classification systems already in place (e.g., prostate cancer with PI-RADS, lung cancer with LUNG-RADS, liver cancer with LI-RADS).”
The university has provided a Duke MEDx High-Risk High-Impact Award to continue development of the algorithm and to conduct a radiologist reader study on its clinical performance and confidence.
Supporting the research are the National Institutes of Health/National Cancer Institute; MIT Lincoln Laboratory; Duke TRIPODS; and the Duke Incubation Fund.
The findings were published in Nature Machine Intelligence.