Despite remarkable progress in medicine and healthcare, cancer remains difficult to treat. On the bright side, we have made considerable progress in detecting many cancers at earlier stages, allowing doctors to provide treatments that improve long-term survival. Much of the credit goes to “integrated diagnostics,” an approach to patient care that combines molecular information with medical imaging data to diagnose cancer types and, ultimately, predict treatment outcomes.
However, there are many intricacies involved. Correlating molecular patterns, such as gene expression and mutation, with image characteristics (for example, how a tumor appears on a CT scan) is commonly referred to as “radiogenomics.” The field is constrained by its frequent use of high-dimensional data, in which the number of features far exceeds the number of observations. Radiogenomics also suffers from simplistic model assumptions and a lack of validation datasets. While machine learning techniques such as deep neural networks can alleviate this situation by accurately predicting image features from gene expression patterns, a new problem arises: we do not know what the model has learned.
“The ability to interpret models is critical to understanding and validating learned radiogenomic associations,” explains William Hsu, associate professor of radiological sciences at the University of California, Los Angeles, and director of the Integrated Diagnostics Shared Resource. Hsu’s lab works on problems related to data integration, machine learning, and imaging informatics. In an earlier study, Hsu and his colleagues used a method of interpreting neural networks called “gene masking” to interrogate trained networks and understand the associations they had learned between genes and imaging phenotypes. They demonstrated that the radiogenomic associations discovered by the model were consistent with prior knowledge. However, that study used only a single dataset of brain tumors, so the generalizability of the approach remained to be determined.
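The idea behind gene masking can be illustrated with a minimal sketch: zero out a subset of input genes, re-run the trained model, and measure how much the prediction score drops. The linear “model,” the scoring function, and the gene sets below are hypothetical stand-ins for illustration, not the study’s actual network or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a trained model: its predictions depend only on
# the first three "genes" (columns 0-2).
W = np.zeros((10, 1))
W[:3] = 1.0
predict = lambda X: X @ W

X = rng.normal(size=(50, 10))   # 50 samples, 10 genes
y = predict(X)                  # targets this toy model fits perfectly

def score(y_true, y_pred):
    """Negative mean squared error; higher is better."""
    return -float(np.mean((y_true - y_pred) ** 2))

def gene_masking(predict, X, y, gene_sets):
    """Importance of a gene set = score drop when its genes are zeroed."""
    baseline = score(y, predict(X))
    importance = {}
    for name, idx in gene_sets.items():
        X_masked = X.copy()
        X_masked[:, idx] = 0.0      # silence this gene set
        importance[name] = baseline - score(y, predict(X_masked))
    return importance

imp = gene_masking(predict, X, y,
                   {"relevant": [0, 1, 2], "irrelevant": [7, 8, 9]})
```

Masking the genes the model actually relies on produces a large score drop, while masking unused genes changes nothing, which is how the method attributes predictions to gene subsets.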
Against this backdrop, Hsu and his colleagues, Nova Smedley, a former graduate student and the lead author, and Denise Aberle, a thoracic radiologist, investigated whether deep neural networks can represent associations between gene expression, histology (microscopic features of tissue), and CT-derived image features. They found that the networks could not only reproduce previously reported associations but also identify new ones. The results of the study are published in the Journal of Medical Imaging.
The researchers used a dataset of 262 patients to train their neural networks to predict 101 imaging and histology features from a collection of 21,766 gene expression values. They then tested the networks’ predictive capacity on an independent dataset of 89 patients, benchmarking them against other models within the training dataset. Finally, they applied gene masking to determine the learned associations between subsets of genes and lung cancer types.
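The shape of this pipeline, far more genes than patients, can be sketched in a few lines. As a stand-in for the paper’s neural network, the sketch below uses ridge regression solved in dual form, which stays tractable when samples are far fewer than genes; the cohort sizes and feature counts come from the article, but the data are synthetic placeholders with an assumed low-dimensional latent structure (gene expression is highly correlated in practice).

```python
import numpy as np

rng = np.random.default_rng(1)

# Cohort sizes and feature counts from the study; everything else is
# a synthetic assumption for illustration.
n_train, n_test, n_genes, n_out, k = 262, 89, 21766, 101, 10

B = rng.normal(size=(k, n_genes))   # latent factors -> gene expression
C = rng.normal(size=(k, n_out))     # latent factors -> imaging/histology features
Z_tr = rng.normal(size=(n_train, k))
Z_te = rng.normal(size=(n_test, k))
X_tr, Y_tr = Z_tr @ B, Z_tr @ C     # training cohort
X_te, Y_te = Z_te @ B, Z_te @ C     # independent test cohort

# Dual-form ridge regression: invert a 262 x 262 matrix instead of a
# 21766 x 21766 one.
lam = 1.0
G = X_tr @ X_tr.T + lam * np.eye(n_train)
W_hat = X_tr.T @ np.linalg.solve(G, Y_tr)

def r2(y_true, y_pred):
    """Coefficient of determination pooled across all output features."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean(axis=0)) ** 2)
    return 1.0 - ss_res / ss_tot

test_r2 = r2(Y_te, X_te @ W_hat)    # generalization to the held-out cohort
```

Because the synthetic expression data lie in a low-dimensional subspace, even this simple model generalizes to the held-out cohort; with real, noisy expression data the model choice and validation strategy matter far more, which is exactly the gap the study addresses.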
They found that the neural network outperformed the other models on these datasets and generalized to a dataset drawn from another population. Additionally, the gene-masking results suggested that the prediction of each imaging feature was driven by a unique gene expression profile governed by biological processes.
The researchers are encouraged by their findings. “While radiogenomic associations have previously been shown to accurately risk-stratify patients, we are excited by the possibility that our model can better identify and understand the significance of these associations. We hope this approach increases the radiologist’s confidence in assessing the type of lung cancer seen on a CT scan. This information would be highly beneficial in informing individualized treatment planning,” says Hsu.
Nova F. Smedley et al., “Using deep neural networks and interpretability methods to identify gene expression patterns that predict radiomic features and histology in non-small cell lung cancer,” Journal of Medical Imaging (2021). DOI: 10.1117/1.JMI.8.3.031906
Citation: Integrating medical imaging and cancer biology with deep neural networks (2021, May 10), retrieved 11 May 2021 from https://techxplore.com/news/2021-05-medical-imaging-cancer-biology-deep.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.