AI and Radiologists Together: Better Breast Cancer Diagnoses?

Radiologists and data scientists teamed up to compare the effectiveness of artificial intelligence and human observation in detecting breast cancer in mammograms.

A team led by New York University radiologists and data scientists evaluated a deep neural network's ability to interpret mammograms, then compared its conclusions against the human powers of observation.

The findings showed that the combination of artificial intelligence (AI) with the trained eye of the radiologists led to the best results, as presented in a new study in IEEE Transactions on Medical Imaging.

“In our study, a hybrid model including both a neural network and expert radiologists outperformed either individually, suggesting that the use of such a model could improve radiologist sensitivity for breast cancer detection,” the authors concluded. 

The AI tool was trained on 1,001,093 images from 141,473 patients; for most of those patients, the data included the four standard screening mammography views of the breasts, according to the paper. 

The machine first focused on small patches of the full images to create “heat maps” of disease likelihood. The overall structural features were processed afterward, guided by those heat maps, according to the researchers. The machine-learning methodology picked its own identifying features. 
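The patch-to-heat-map step described above can be sketched roughly as follows. The window size, stride, and the stand-in patch scorer here are illustrative assumptions for demonstration, not the study's actual network:

```python
import numpy as np

def patch_heatmap(image, patch_score, patch=32, stride=32):
    """Slide a window over the image and record a disease-likelihood
    score for each patch, producing a coarse heat map."""
    h, w = image.shape
    rows = (h - patch) // stride + 1
    cols = (w - patch) // stride + 1
    heat = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            y, x = i * stride, j * stride
            heat[i, j] = patch_score(image[y:y + patch, x:x + patch])
    return heat

# Placeholder patch classifier: mean intensity stands in for a trained
# patch-level network's probability output.
demo_score = lambda p: float(p.mean())

img = np.random.rand(256, 256)  # stand-in for a mammogram
heat = patch_heatmap(img, demo_score)
print(heat.shape)  # (8, 8)
```

In the study's pipeline, a second, image-level model would then process the full image together with this heat map as guidance.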

The training set included confirmed outcomes, so the machine’s accuracy could be gauged. It was then tested against a reader study of 14 humans (12 attending radiologists, a resident, and a medical student). 

Each human read 740 exams; 368 of them were randomly selected from a biopsied subpopulation, and 372 were randomly chosen from a group not matched with any biopsy, according to the study. Those images were then assessed by the machine as well. 

Put together, the machine and the human observers achieved an area under the curve (AUC) of 0.895, according to the results. 
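A hybrid of this kind can be sketched as a simple average of the model's and the reader's malignancy scores, evaluated with AUC. The equal weighting and the toy scores below are illustrative assumptions, not the paper's actual combination method:

```python
def auc(labels, scores):
    """AUC as the probability that a random positive case outranks a
    random negative case (ties count as 0.5)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy data: 1 = biopsy-confirmed cancer, 0 = benign (illustrative only).
labels = [1, 1, 1, 0, 0, 0]
model  = [0.9, 0.4, 0.8, 0.3, 0.5, 0.2]  # hypothetical AI scores
reader = [0.6, 0.7, 0.3, 0.2, 0.4, 0.5]  # hypothetical radiologist scores
hybrid = [(m + r) / 2 for m, r in zip(model, reader)]

print(round(auc(labels, model), 3),   # 0.889
      round(auc(labels, reader), 3),  # 0.778
      round(auc(labels, hybrid), 3))  # 1.0
```

On this toy data, each reader misses a different case, so the blend outscores either one alone, mirroring the study's finding that the two make complementary errors.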

AI saw some cancers that the radiologists could not, and vice versa, said Krzysztof J. Geras, PhD, assistant professor of radiology at NYU Langone School of Medicine in New York, New York. 

“AI detected pixel-level changes in tissue invisible to the human eye, while humans used forms or reasoning not available to AI,” said Geras. 

“Although our results are promising,” the paper added, “we acknowledge that the test set used in our experiments is relatively small and our results require further clinical validation.”

The objective is to require fewer biopsies, said Geras. The goal is not to have machines make diagnoses on their own; it’s to make them available as a tool to trained human experts, he said. 


“The ultimate goal of our work is to augment, not replace, human radiologists,” Geras added.