
Implementation of a Novel Deep Learning Network for Predicting Isocitrate Dehydrogenase (IDH) Mutation in Patients with Gliomas
Haonan Xiao1 and Zheng Chang2

1Medical Physics Program, Duke Kunshan University, Kunshan, China, 2Radiation Oncology, Duke University Medical Center, Durham, NC, United States

Synopsis

This study investigates the feasibility of using Inception-ResNet to reduce image pre-processing and improve prediction accuracy for the IDH status of gliomas. The T1w post-contrast, T2w, and FLAIR images of 91 glioma patients are intensity-normalized and fed to the network as training and validation sets, and another group of 12 patients is randomly selected as the test set. The prediction accuracies of two repeated experiments are consistent, both greater than 90%. The results show that, with Inception-ResNet, IDH status can be predicted with high accuracy and minimal image pre-processing.

Introduction

In 2016, the World Health Organization divided gliomas into two subgroups: IDH-mutant and IDH wild-type gliomas.1 IDH-mutant gliomas are more sensitive to radiotherapy and chemotherapy and are generally associated with better prognosis.2 Determination of IDH status is therefore valuable for clinical decision-making. MRI is the most commonly used diagnostic imaging modality for gliomas, and recent studies have shown that convolutional neural networks (CNNs) can determine IDH status from MR images.3 However, these models require extensive image preprocessing, such as isotropic resampling, whole-brain extraction, segmentation, and registration, which is time-consuming and limits clinical application. Recently, a new deep learning architecture called Inception-ResNet was proposed and showed great potential in image classification.4 The purpose of this study is to investigate the feasibility of this new architecture for reducing image preprocessing and improving classification accuracy in determining the IDH status of gliomas.

Methods

A total of 103 patients were selected from the TCGA-LGG and TCGA-GBM collections5 based on the following criteria: (1) IDH status is documented; (2) the MRI sequences include T1w post-contrast, FLAIR, and T2w; (3) slice location is documented in the DICOM files; (4) tumor location is known. Each patient contributes several tumor slices depending on tumor size. Intensity normalization is applied to each individual slice by subtracting its mean pixel intensity and dividing by its standard deviation; no other preprocessing is required. The T1w post-contrast, FLAIR, and T2w images at the same slice location are grouped together as one sample, yielding 209 IDH-mutant and 356 IDH wild-type glioma samples. The test set consists of 24 samples from 6 randomly selected IDH-mutant patients and 25 samples from 6 randomly selected IDH wild-type patients; the remaining data are randomly assigned to the training and validation sets. To reduce overfitting, data augmentation by random image rotation within 10 degrees is applied to both the training and validation sets: IDH-mutant slices are augmented 20 times and IDH wild-type slices 12 times to balance the sample numbers. The images from one sample are fed to separate input channels of the Inception-ResNet, and the extracted features are concatenated at the fully connected layer together with the patient's age at diagnosis, on which the prediction is based (Figure 1). Training uses Keras with a TensorFlow backend; the Adam optimizer minimizes the binary cross-entropy objective function with a mini-batch size of 8. Because the learning rate determines the convergence of the objective function, it is decreased in several stages so that the optimizer can approach the global minimum, as shown in Figure 2. Training ends at the 70th epoch, or stops early once the training loss reaches 10⁻⁵. The network is trained on a GTX 1080 GPU. To test robustness, the above process is repeated with another randomly selected pair of training and test sets drawn from the 103 patients. A minimal code sketch of this pipeline is given below.
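The following sketch illustrates the described pipeline, assuming the stock Keras InceptionResNetV2 stands in for the modified Inception-ResNet of Figure 1. The input size, learning-rate boundaries and values, and helper names (normalize_slice, augment_rotation, StopAtLoss) are illustrative assumptions, not details taken from the abstract.

```python
import numpy as np
import tensorflow as tf
from scipy.ndimage import rotate
from tensorflow.keras import layers, Model

def normalize_slice(img):
    """Per-slice z-score: subtract the mean, divide by the standard deviation."""
    return (img - img.mean()) / (img.std() + 1e-8)

def augment_rotation(stack, max_deg=10.0):
    """Random in-plane rotation within +/-10 degrees of an (H, W, 3) stack."""
    angle = np.random.uniform(-max_deg, max_deg)
    return rotate(stack, angle, axes=(0, 1), reshape=False, order=1)

def build_model(input_shape=(256, 256, 3)):
    """T1w-post, FLAIR, and T2w slices stacked as the 3 input channels;
    pooled CNN features are concatenated with age before the output."""
    img_in = layers.Input(shape=input_shape, name="mri_stack")
    age_in = layers.Input(shape=(1,), name="age_at_diagnosis")
    backbone = tf.keras.applications.InceptionResNetV2(
        include_top=False, weights=None, input_tensor=img_in, pooling="avg")
    x = layers.Concatenate()([backbone.output, age_in])
    out = layers.Dense(1, activation="sigmoid", name="idh_mutant_prob")(x)
    return Model(inputs=[img_in, age_in], outputs=out)

class StopAtLoss(tf.keras.callbacks.Callback):
    """Stop early once the training loss reaches the stated threshold."""
    def __init__(self, threshold=1e-5):
        super().__init__()
        self.threshold = threshold
    def on_epoch_end(self, epoch, logs=None):
        if logs and logs.get("loss", 1.0) <= self.threshold:
            self.model.stop_training = True

steps_per_epoch = 100  # placeholder; depends on the augmented set size
lr_schedule = tf.keras.optimizers.schedules.PiecewiseConstantDecay(
    boundaries=[20 * steps_per_epoch, 50 * steps_per_epoch],
    values=[1e-3, 1e-4, 1e-5])  # step decay as in Figure 2 (values assumed)

model = build_model()
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=lr_schedule),
              loss="binary_crossentropy", metrics=["accuracy"])
# model.fit([x_train, age_train], y_train, batch_size=8, epochs=70,
#           validation_data=([x_val, age_val], y_val),
#           callbacks=[StopAtLoss()])
```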

Results

Each training epoch takes about 15 minutes, and the two training processes stop at the 30th and 26th epochs, respectively, for the two pairs of training and test data sets. The test results, including sensitivity and specificity, are shown in Table 1. The predictions are consistent across the two different training/test splits, with high accuracies of 0.918 and 0.902, respectively.
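For reference, the metrics reported in Table 1 follow the standard definitions derived from the confusion matrix of the binary predictions. The sketch below is illustrative only (not the authors' code); y_true and y_pred are placeholder label arrays, with 1 denoting IDH mutant.

```python
from sklearn.metrics import confusion_matrix

def test_metrics(y_true, y_pred):
    """Accuracy, sensitivity, and specificity for binary labels."""
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)  # true positive rate
    specificity = tn / (tn + fp)  # true negative rate
    return accuracy, sensitivity, specificity
```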

Discussion

With Inception-ResNet, IDH status can be predicted with high accuracy and minimal image pre-processing. While the feasibility of Inception-ResNet has been successfully demonstrated with glioma data, its robustness should be further tested on larger clinical datasets. Although demonstrated here only with brain data, the approach could potentially be extended to other clinical sites such as the prostate and breast.

Acknowledgements

No acknowledgement found.

References

1. Louis, D. N., Perry, A., et al. (2016). The 2016 World Health Organization classification of tumors of the central nervous system: a summary. Acta Neuropathologica, 131(6), 803-820.

2. Molenaar, R. J., Maciejewski, J. P., et al. (2018). Wild-type and mutated IDH1/2 enzymes and therapy responses. Oncogene, 37(15), 1949-1960.

3. Chang, K., Bai, H. X., et al. (2018). Residual Convolutional Neural Network for the Determination of IDH Status in Low- and High-Grade Gliomas from MR Imaging. Clinical Cancer Research, 24(5), 1073-1081.

4. Szegedy, C., Ioffe, S., et al. (2016). Inception-v4, Inception-ResNet and the impact of residual connections on learning. arXiv preprint arXiv:1602.07261.

5. Clark, K., Vendt, B., et al. (2013). The Cancer Imaging Archive (TCIA): maintaining and operating a public information repository. Journal of Digital Imaging, 26(6), 1045-1057.

Figures

Figure 1: The modified Inception-ResNet architecture

Figure 2: The learning rate schedule

Table 1: Results on test sets

Proc. Intl. Soc. Mag. Reson. Med. 27 (2019) 2318