Researchers Use Machine Learning to Accurately Diagnose Fetal Heart Defects

UC San Francisco researchers have found a way to double doctors' accuracy in detecting the vast majority of complex fetal heart defects in utero, a stage at which interventions can either correct them or greatly improve a child's chance of survival, by combining routine ultrasound imaging with machine-learning computer tools.

The team, led by UCSF cardiologist Rima Arnaout, MD, trained a group of machine-learning models to mimic the tasks that clinicians perform in diagnosing complex congenital heart disease (CHD). Worldwide, clinicians detect as few as 30 to 50 percent of these conditions before birth. The combination of human-performed ultrasound and machine analysis, however, allowed the researchers to detect 95 percent of CHD in their test dataset.

The findings appear in the May issue of Nature Medicine.

Fetal ultrasound screening during the second trimester of pregnancy is universally recommended in the United States and by the World Health Organization. Diagnosis of fetal heart defects, in particular, can improve newborn outcomes and enable further research on in utero therapies, the researchers said.

"Second-trimester screening is a rite of passage in pregnancy to tell if the fetus is a boy or girl, but it is also used to screen for birth defects," said Arnaout, a UCSF assistant professor and lead author of the paper. Typically, the imaging includes five cardiac views that could allow clinicians to diagnosis up to 90 percent of congenital heart disease, but in practice, only about half of those are detected at non-expert centers.

"On the one hand, heart defects are the most common kind of birth defect, and it's very important to diagnose them before birth," Arnaout said. "On the other hand, they are still rare enough that detecting them is difficult even for trained clinicians, unless they are highly sub-specialized. And all too often, in clinics and hospitals worldwide, sensitivity and specificity can be quite low."

The UCSF team, which included fetal cardiologist and senior author Anita Moon-Grady, MD, trained the machine tools to mimic clinicians' work in three steps. First, they used neural networks to find the five views of the heart that are important for diagnosis. Next, a second set of neural networks decided whether each of those views was normal or abnormal. Finally, a third algorithm combined the results of the first two steps into a final assessment of whether the fetal heart was normal or abnormal.
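In pipeline form, those three steps map onto a view classifier, a set of per-view normality classifiers, and a rule that fuses the per-view results. The Python sketch below is a minimal illustration of that structure only, not the authors' published models: the view labels, the stand-in random classifiers, the 0.5 threshold, and the any-view-abnormal fusion rule are all assumptions made for readability.

from __future__ import annotations

import numpy as np

# Assumed labels for the five guideline screening views (illustrative only).
VIEWS = ["3VV", "3VT", "A4C", "LVOT", "abdomen"]

def classify_view(frame: np.ndarray) -> tuple[str, float]:
    """Stage 1 (stand-in): label an ultrasound frame as one of the five
    screening views. A real system would use a trained neural network;
    here, random per-view probabilities stand in for its output."""
    probs = np.random.dirichlet(np.ones(len(VIEWS)))
    i = int(np.argmax(probs))
    return VIEWS[i], float(probs[i])

def score_abnormality(frame: np.ndarray, view: str) -> float:
    """Stage 2 (stand-in): per-view probability that the view looks
    abnormal. The real pipeline uses trained networks for this step."""
    return float(np.random.beta(2, 8))  # placeholder score in [0, 1]

def composite_diagnosis(per_view_scores: dict[str, float],
                        threshold: float = 0.5) -> str:
    """Stage 3 (assumed fusion rule): flag the study as abnormal if any
    view's score crosses the threshold. The paper instead combines the
    per-view results with a third, trained algorithm."""
    return "abnormal" if max(per_view_scores.values()) > threshold else "normal"

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frames = [rng.random((224, 224)) for _ in range(40)]  # fake ultrasound frames

    # Stage 1: keep the highest-confidence frame found for each view.
    best: dict[str, tuple[np.ndarray, float]] = {}
    for f in frames:
        view, conf = classify_view(f)
        if view not in best or conf > best[view][1]:
            best[view] = (f, conf)

    # Stage 2: score each selected view for abnormality.
    scores = {v: score_abnormality(f, v) for v, (f, _) in best.items()}

    # Stage 3: fuse the per-view scores into one screening call.
    print(composite_diagnosis(scores))

Each stand-in function would be replaced by a trained model in a real system; the sketch only shows how the three stages hand results to one another.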

"We hope this work will revolutionize screening for these birth defects," said Arnaout, a member of the UCSF Bakar Computational Health Sciences Institute, the UCSF Center for Intelligent Imaging, and a Chan Zuckerberg Biohub Intercampus Research Award Investigator. "Our goal is to help forge a path toward using machine learning to solve diagnostic challenges for the many diseases where ultrasound is used in screening and diagnosis."

Co-authors include Lara Curran, MBBS; Yili Zhao, PhD; and Erin Chinn, MS, from UCSF; and Jami Levine from Boston Children's Hospital. The project was supported by the UCSF Academic Research Systems, the National Institutes of Health (UL1 TR001872 and R01HL150394), the American Heart Association, and the Department of Defense. Please see the paper for additional authors, funding details, and disclosures.

Source: https://www.ucsf.edu/

