
AI Model Integrates Genomics and Histology to Provide Prognostic Data for Cancer Patients

Although it has long been known that predicting outcomes in cancer patients requires accounting for many factors, including patient history, genetics, and disease pathology, doctors find it difficult to integrate these data when making decisions about patient care.

Image Credit: Shutterstock.com/ CI Photos

In a new study, researchers from the Mahmood Lab at Brigham and Women’s Hospital present a proof-of-concept model that uses artificial intelligence (AI) to aggregate multiple forms of data from different sources and predict patient outcomes for 14 distinct types of cancer. The findings are published in Cancer Cell.

To diagnose cancer and predict how it will progress, experts rely on several sources of information, including patient history, pathology, and genetic sequencing.

While experts can use this information to predict outcomes, manually integrating data from so many sources is difficult, and they frequently find themselves making judgment calls.

Experts analyze many pieces of evidence to predict how well a patient may do. These early examinations become the basis of making decisions about enrolling in a clinical trial or specific treatment regimens. But that means that this multimodal prediction happens at the level of the expert. We are trying to address the problem computationally.

Faisal Mahmood, Assistant Professor, Division of Computational Pathology, Brigham and Women’s Hospital

Mahmood is an associate member of the Cancer Program at the Broad Institute of MIT and Harvard.

Using these new AI models, Mahmood and colleagues developed a way to computationally combine several types of diagnostic data and produce more precise outcome predictions.

The AI models not only support prognostic decision-making but also reveal which variables drive predictions of patient risk, a quality that could be exploited to find new biomarkers.

The researchers built the models using The Cancer Genome Atlas (TCGA), a freely accessible resource containing data on numerous types of cancer. They then developed a multimodal deep learning-based algorithm that can extract prognostic information from several data sources.

By first developing separate models for histology and genomic data, they were able to combine the two into a single integrated model that provides important prognostic information.
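
The study’s code is not reproduced here, but the general idea of training separate encoders for histology and genomics and then fusing them can be illustrated with a short sketch. The PyTorch code below is a minimal, hypothetical example: the layer sizes, the attention pooling, and the simple concatenation-based fusion are illustrative assumptions, not the authors’ published architecture.

```python
# Minimal sketch of late fusion for survival risk prediction.
# NOT the study's published architecture; sizes and fusion choice are assumptions.
import torch
import torch.nn as nn

class HistologyEncoder(nn.Module):
    """Collapses a bag of patch-level features (e.g., from a whole-slide image) into one vector."""
    def __init__(self, in_dim=1024, out_dim=256):
        super().__init__()
        self.proj = nn.Sequential(nn.Linear(in_dim, out_dim), nn.ReLU())
        self.attn = nn.Linear(out_dim, 1)  # simple attention pooling over patches

    def forward(self, patches):             # patches: (num_patches, in_dim)
        h = self.proj(patches)               # (num_patches, out_dim)
        w = torch.softmax(self.attn(h), 0)   # attention weights over patches
        return (w * h).sum(dim=0)            # (out_dim,)

class GenomicEncoder(nn.Module):
    """Maps a molecular profile (e.g., mutation/CNV/RNA features) to a vector."""
    def __init__(self, in_dim=200, out_dim=256):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, out_dim), nn.ReLU(),
                                 nn.Linear(out_dim, out_dim), nn.ReLU())

    def forward(self, x):                    # x: (in_dim,)
        return self.net(x)

class MultimodalRiskModel(nn.Module):
    """Fuses the two embeddings and outputs a scalar risk score."""
    def __init__(self):
        super().__init__()
        self.histology = HistologyEncoder()
        self.genomics = GenomicEncoder()
        self.head = nn.Linear(256 * 2, 1)

    def forward(self, patches, molecular):
        fused = torch.cat([self.histology(patches), self.genomics(molecular)])
        return self.head(fused)              # higher value = higher predicted risk

# Example forward pass with random stand-in data (500 patches, 200 molecular features).
model = MultimodalRiskModel()
risk = model(torch.randn(500, 1024), torch.randn(200))
print(risk.item())
```

In practice, the histology branch would consume features extracted from whole-slide image patches and the genomics branch a patient’s molecular profile; the fused representation feeds a prediction head trained on outcome data.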

Finally, the researchers assessed the models’ performance using data sets from 14 different cancer types, each containing histological and genomic information for every patient. The results showed that the integrated models forecast patient outcomes more accurately than models relying on any single source of data.
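
How such a comparison might be scored can be illustrated with the concordance index (c-index), a standard metric for survival models that measures how well predicted risks order patients by outcome. The snippet below is a hypothetical sketch using the lifelines library and random placeholder data, not values or code from the study.

```python
# Hypothetical evaluation sketch: comparing unimodal vs. multimodal risk scores
# with the concordance index (c-index). All data below are random placeholders.
import numpy as np
from lifelines.utils import concordance_index

rng = np.random.default_rng(0)
n = 100
survival_months = rng.exponential(24, n)    # observed follow-up times
event_observed = rng.integers(0, 2, n)      # 1 = event observed, 0 = censored

# Stand-ins for model outputs (higher score = higher predicted risk).
risk_unimodal = rng.normal(size=n)
risk_multimodal = risk_unimodal + 0.5 * rng.normal(size=n)

# concordance_index expects scores where larger values pair with longer survival,
# so risk scores are negated before scoring.
for name, risk in [("histology only", risk_unimodal), ("multimodal", risk_multimodal)]:
    c = concordance_index(survival_months, -risk, event_observed)
    print(f"{name}: c-index = {c:.3f}")
```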

This study shows that it is possible to predict disease outcomes by using AI to integrate several forms of clinically informed data. According to Mahmood, these models could help researchers identify biomarkers that combine various clinical factors and better understand what kind of data they need to evaluate distinct types of cancer.

The researchers also quantitatively evaluated the value of each diagnostic modality for specific cancer types and the benefit of combining multiple modalities.

The AI models can also reveal which pathologic and genetic characteristics drive prognostic predictions. The scientists found that the models used patient immune responses as a predictive marker without being instructed to do so.
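
One generic way a trained model can be probed for such influences is gradient-based attribution, which ranks input features by how strongly they affect the predicted risk. The sketch below uses a stand-in model and random data; it is not the interpretability analysis used in the study.

```python
# Sketch of gradient-based attribution for genomic inputs (hypothetical example).
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Sequential(nn.Linear(200, 64), nn.ReLU(), nn.Linear(64, 1))  # stand-in risk model
molecular = torch.randn(200, requires_grad=True)   # one patient's molecular profile

risk = model(molecular).squeeze()
risk.backward()                                    # gradient of risk w.r.t. each input feature

# Features with the largest |gradient| are the ones this model leans on most for
# this patient; aggregating such attributions across patients is one way models
# of this kind can surface candidate biomarkers (e.g., immune-related signatures).
top = molecular.grad.abs().topk(5).indices
print("Most influential feature indices:", top.tolist())
```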

This is an important discovery since prior research has shown that patients with malignancies that evoke higher immune responses typically have better outcomes.

While this proof-of-concept model demonstrates a new use for AI in cancer care, the research represents only the beginning of the clinical application of these models.

Before the models can be used in clinical settings, they must incorporate larger data sets and be validated on multiple independent test cohorts. In the future, Mahmood plans to add more types of patient data, including radiological scans, family histories, and electronic medical records, and ultimately to apply the model in clinical trials.

Mahmood added, “This work sets the stage for larger health care AI studies that combine data from multiple sources. In a broader sense, our findings emphasize a need for building computational pathology prognostic models with much larger datasets and downstream clinical trials to establish utility.”

Disclosures

Mahmood and co-author Richard Chen are the inventors of a patent application for deep learning-based multimodal data fusion.

Funding

The National Science Foundation (NSF) Graduate Fellowship, the National Library of Medicine (NLM) Biomedical Informatics and Data Science Research Training Program (T15LM00709), the National Human Genome Research Institute (NHGRI) Ruth L. Kirschstein National Research Service Award Bioinformatics Training Grant (T32HG002295), the NIGMS (R35GM138216), the BWH President's Fund, MGH Pathology, Google Cloud Research Grant, Nvidia GPU Grant Program, and the NIH National Cancer Institute (NCI) Ruth L. Kirschstein National Service Award (T32CA251062) funded the study.

Journal Reference:

Chen, R. J., et al. (2022) Pan-Cancer Integrative Histology-Genomic Analysis via Multimodal Deep Learning. Cancer Cell. doi:10.1016/j.ccell.2022.07.004.

Source: https://www.brighamandwomens.org/
