
Novel AI System Could Reduce Time Required to Get Expert Radiologist Opinion for Abnormal Chest X-rays

As part of a new study, researchers have found that an innovative artificial intelligence (AI) system can considerably reduce the time taken to obtain an expert radiologist's opinion on abnormal chest X-rays with critical findings, cutting the average delay from 11 days to less than 3 days.

Image: Professor Giovanni Montana, Chair in Data Science, WMG, University of Warwick. (Image credit: University of Warwick)

Chest X-rays are routinely used to diagnose and monitor a broad range of conditions affecting the heart, lungs, soft tissues, and bones.

Scientists from WMG at the University of Warwick worked with Guy’s and St Thomas’ NHS Hospitals to assemble a dataset of half a million anonymised adult chest radiographs (X-rays) and built a computer-vision AI system that can identify radiological abnormalities in the X-rays in real time and suggest how quickly each exam should be reported by a radiologist. To train the system, the researchers developed and validated a Natural Language Processing (NLP) algorithm that reads a radiology report, interprets the findings recorded by the reporting radiologist, and automatically infers the priority level of the exam. Applying this algorithm to the historical exams produced a large volume of labelled training examples, which allowed the AI system to learn the visual patterns in X-rays that predict their urgency level.
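To make that two-stage idea concrete, the toy sketch below shows how free-text reports might be mapped to coarse priority labels that could then supervise an image model. It is an illustrative assumption only: the keyword lists, priority names, and the infer_priority function are invented for clarity and are far simpler than the NLP model described in the study.

```python
# Illustrative sketch only: a toy, keyword-based labeller standing in for the
# study's NLP model, which inferred an exam's priority level from the free-text
# radiology report. Keywords and priority names are assumptions, not the
# categories used in the published work.

CRITICAL_TERMS = {"pneumothorax", "pneumomediastinum", "free air"}
URGENT_TERMS = {"consolidation", "effusion", "mass", "nodule"}


def infer_priority(report_text: str) -> str:
    """Map a free-text radiology report to a coarse priority label."""
    text = report_text.lower()
    if any(term in text for term in CRITICAL_TERMS):
        return "critical"
    if any(term in text for term in URGENT_TERMS):
        return "urgent"
    return "normal"


if __name__ == "__main__":
    reports = [
        "Large right-sided pneumothorax. Urgent clinical review advised.",
        "Patchy consolidation in the left lower zone, likely infective.",
        "Clear lungs. No acute cardiopulmonary abnormality.",
    ]
    # In the approach the article describes, labels derived from reports like
    # these would supervise an image model that predicts priority directly
    # from the pixel data, so unreported X-rays can be queued for the
    # radiologist in order of likely urgency.
    for report in reports:
        print(infer_priority(report), "<-", report)
```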

The team, headed by Professor Giovanni Montana, Chair in Data Science in WMG at the University of Warwick, found that normal chest radiographs were detected with a negative predictive value of 99% and a positive predictive value of 73%, and at a speed suggesting that abnormal radiographs with critical findings could be prioritized for an expert radiologist's opinion much sooner than under usual practice.
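As a quick reminder of what those metrics mean, the short sketch below computes positive and negative predictive values from confusion-matrix counts. The counts are invented purely for illustration and are not the study's data.

```python
# Minimal sketch of the reported metrics: PPV = TP / (TP + FP),
# NPV = TN / (TN + FN). The counts below are invented for illustration.


def predictive_values(tp: int, fp: int, tn: int, fn: int):
    """Return (PPV, NPV) for a binary classifier's confusion-matrix counts."""
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return ppv, npv


if __name__ == "__main__":
    # Here "positive" means the system flags a radiograph as normal,
    # matching the article's figures of 73% PPV and 99% NPV.
    ppv, npv = predictive_values(tp=730, fp=270, tn=990, fn=10)
    print(f"PPV = {ppv:.0%}, NPV = {npv:.0%}")  # PPV = 73%, NPV = 99%
```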

The outcomes of the study were published in the leading journal Radiology on January 22nd, 2019, in a paper titled “Automated triaging and prioritization of adult chest radiographs using deep artificial neural networks.”

Artificial intelligence-led reporting of imaging could be a valuable tool for improving department workflow and workforce efficiency. The increasing clinical demands on radiology departments worldwide have challenged current service delivery models, particularly in publicly funded healthcare systems. It is no longer feasible for many radiology departments, at their current staffing levels, to report all acquired plain radiographs in a timely manner, leading to large backlogs of unreported studies. In the United Kingdom, it is estimated that at any time there are over 300,000 radiographs waiting over 30 days for reporting. The results of this research show that alternative models of care, such as computer vision algorithms, could be used to greatly reduce delays in identifying and acting on abnormal X-rays, particularly for chest radiographs, which account for 40% of all diagnostic imaging performed worldwide. The application of these technologies also extends to many other imaging modalities, including MRI and CT.

Giovanni Montana, Professor and Chair in Data Science, WMG, University of Warwick.
