Scientists at the University of Washington and the University of California, Los Angeles, have designed an artificial intelligence system that could help pathologists interpret biopsies more accurately and improve the detection and diagnosis of breast cancer.
To diagnose breast cancer, clinicians analyze images of breast tissue biopsies. But the differences between cancerous and benign tissue can be difficult to distinguish by eye. The new algorithm helps interpret these images, and it does so nearly as accurately as, or better than, an experienced pathologist, depending on the task. The team reported its results in the August 9th issue of the journal JAMA Network Open.
This work focused on capturing the characteristics of the different diagnostic classes by analyzing the pattern of the tissue classes surrounding the ducts in whole-slide images of breast biopsies. My doctoral student, Ezgi Mercan, developed a novel descriptor called the structure feature that represents these patterns in a compact form suitable for machine learning.
Linda Shapiro, Study Co-Author and Professor, Paul G. Allen School of Computer Science and Engineering and Electrical and Computer Engineering Department, University of Washington
In 2015, a study from the UW School of Medicine found that pathologists frequently disagree on the interpretation of breast biopsies, which are performed on millions of women annually. The study revealed that diagnostic errors occurred for about one out of every six women who had a non-invasive type of breast cancer called ductal carcinoma in situ. Furthermore, incorrect diagnoses were given in about 50% of biopsy cases involving abnormal cells associated with a higher risk of breast cancer, a condition known as breast atypia.
Medical images of breast biopsies contain a great deal of complex data, and interpreting them can be very subjective. Distinguishing breast atypia from ductal carcinoma in situ is important clinically, but very challenging for pathologists. Sometimes doctors do not even agree with their previous diagnosis when they are shown the same case a year later.
Dr. Joann Elmore, Study Co-Author and Professor of Medicine, David Geffen School of Medicine, UCLA
She was previously a professor of internal medicine at the UW School of Medicine.
The researchers explained that artificial intelligence could reliably deliver more accurate readings: training on a large dataset allows the machine learning system to recognize cancer-associated patterns that are hard for doctors to see. After studying the approaches pathologists use when interpreting breast biopsies, the team developed image analysis techniques tailored to these challenges.
The team fed 240 breast biopsy images into a computer, training it to distinguish patterns associated with several types of breast lesions, ranging from benign and atypia to ductal carcinoma in situ and invasive breast cancer. The correct diagnoses were established by consensus among three experienced pathologists.
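The training setup described above can be sketched in miniature. The following is an illustrative example only, not the authors' actual pipeline: it uses a simple nearest-centroid classifier, random feature vectors standing in for the paper's structure features, and the four diagnostic categories named in the article as class labels.

```python
# Hypothetical sketch of a four-class lesion classifier, NOT the study's
# actual method. Feature vectors are random stand-ins for descriptors
# extracted from biopsy images; labels mimic the pathologist consensus.
import numpy as np

CLASSES = ["benign", "atypia", "DCIS", "invasive"]

rng = np.random.default_rng(0)
n_samples, n_features = 240, 32            # 240 biopsies, as in the study
X = rng.normal(size=(n_samples, n_features))
y = rng.integers(0, len(CLASSES), size=n_samples)  # consensus labels

# "Training": compute one centroid per diagnostic class.
centroids = np.stack([X[y == c].mean(axis=0) for c in range(len(CLASSES))])

def predict(x):
    """Assign a feature vector to the class with the nearest centroid."""
    return int(np.argmin(np.linalg.norm(centroids - x, axis=1)))

pred = np.array([predict(x) for x in X])
accuracy = (pred == y).mean()
```

In the real system, the features would be computed from whole-slide tissue patterns rather than drawn at random, and a far more capable model would replace the nearest-centroid rule; the sketch only shows the overall shape of the supervised-learning task.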
To put the system to the test, the scientists compared its readings with independent diagnoses made by 87 practicing U.S. pathologists who interpreted the same cases. The algorithm was nearly as successful as human doctors at distinguishing cancer from non-cancer. But it outperformed doctors at distinguishing ductal carcinoma in situ from atypia, correctly diagnosing pre-invasive breast cancer biopsies about 89% of the time, compared with 70% for pathologists.
“These results are very encouraging,” Elmore said. “There is low accuracy among practicing pathologists in the U.S. when it comes to the diagnosis of atypia and ductal carcinoma in situ, and the computer-based automated approach shows great promise.”
The researchers are now training the system to diagnose skin cancer.
Ezgi Mercan, a researcher at Seattle Children’s Hospital who carried out this research as a doctoral student in the Allen School, is the paper’s first author. Other authors include Sachin Mehta, a doctoral student in the UW’s electrical and computer engineering department; Dr. Jamen Bartlett at Southern Ohio Pathology Consultants; and Dr. Donald Weaver at the University of Vermont. This study was sponsored by the National Institutes of Health.