2744

Early Prediction of Total Knee Replacement using Structural MRI and 3D Deep Convolutional Neural Networks
Kevin Leung1, Gregory Chang2, Kyunghyun Cho3, and Cem Deniz4,5

1Courant Institute of Mathematical Sciences and Leonard N. Stern School of Business, New York University, New York, NY, United States, 2Department of Radiology, Center for Musculoskeletal Care, New York University Langone Medical Center, New York, NY, United States, 3Courant Institute of Mathematical Sciences and Center for Data Science, New York University, New York, NY, United States, 4Department of Radiology, Center for Advanced Imaging Innovation and Research (CAI2R) and Bernard and Irene Schwartz Center for Biomedical Imaging, New York University Langone Medical Center, New York, NY, United States, 5Sackler Institute of Graduate Biomedical Sciences, New York University School of Medicine, New York, NY, United States

Synopsis

The early prediction of individuals who will eventually require total knee replacement (TKR) remains a challenging problem. In this project, we propose to use 3D deep convolutional neural networks (CNNs) to predict the likelihood of a patient receiving a TKR within nine years, using 718 subjects from the Osteoarthritis Initiative1 (OAI) dataset. We found that our model results in better performance compared to a logistic regression model using clinical risk factors2 (AUC: 0.8480±0.0173 vs. 0.7716±0.0229; accuracy: 77.15±1.88% vs. 71.16±2.70%).

Introduction

Osteoarthritis (OA) is the most common form of arthritis and the major cause of physical disability in older people. The discovery of imaging-based biomarkers could lead to identification of new treatment targets and mechanisms for shorter, more efficient clinical trials of possible disease-modifying agents. The OAI dataset has been used for exploring hand-crafted imaging biomarkers3-4. Recently, machine learning based approaches for detecting OA patients from segmented cartilage T2 relaxation maps have emerged5. In this study, we used deep learning methods to identify imaging biomarkers for knee OA using the MRI dataset collected for the OAI. Our aim is to automatically predict the outcome of OA as a subject’s probability of TKR within nine years directly from structural MR images. Figure 1 outlines our approach to model the probability of TKR using 3D CNN and structural MRI.

Methods

We created case-control pairs by matching individuals on the baseline variables age, BMI, gender, and race; individuals were matched on a nearest-neighbor criterion using propensity score matching6 with these variables. Cases were defined as individuals who received a medically confirmed TKR after baseline; individuals who received a partial knee replacement or had already received a TKR at baseline were excluded from this study. For individuals who received a TKR in both knees, we used only MRIs of the knee that was replaced first. Controls were defined as individuals who had not had a TKR in either knee by the 108-month visit. After matching, our study included 359 cases and 359 controls. Figure 2 provides a summary of the demographic characteristics of the patients with TKR and the control subjects.
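The abstract does not include code, so the pairing step can only be sketched. Below is a minimal NumPy illustration of propensity score matching6 with greedy 1:1 nearest-neighbor pairing on the fitted scores; the function name `propensity_match` and the plain gradient-descent logistic fit are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def propensity_match(X_case, X_ctrl, n_iter=2000, lr=0.1):
    """Greedy 1:1 nearest-neighbor matching on propensity scores.

    Fits a logistic regression (case vs. control) on the baseline
    covariates by gradient descent, then pairs each case with the
    unmatched control whose propensity score is closest.
    """
    X = np.vstack([X_case, X_ctrl])
    y = np.concatenate([np.ones(len(X_case)), np.zeros(len(X_ctrl))])
    # Standardize covariates and add an intercept column.
    Xs = (X - X.mean(0)) / X.std(0)
    Xs = np.hstack([np.ones((len(Xs), 1)), Xs])
    w = np.zeros(Xs.shape[1])
    for _ in range(n_iter):                       # plain gradient descent
        p = 1.0 / (1.0 + np.exp(-Xs @ w))
        w -= lr * Xs.T @ (p - y) / len(y)
    scores = 1.0 / (1.0 + np.exp(-Xs @ w))
    s_case, s_ctrl = scores[:len(X_case)], scores[len(X_case):]
    pairs, used = [], set()
    for i in np.argsort(s_case)[::-1]:            # greedy nearest neighbor
        j = min((j for j in range(len(s_ctrl)) if j not in used),
                key=lambda j: abs(s_case[i] - s_ctrl[j]))
        used.add(j)
        pairs.append((i, j))
    return pairs
```

A production analysis would typically use a dedicated matching package with a caliper; the sketch only conveys the two-stage structure (score model, then nearest-neighbor pairing).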

Using our matched pairs, we developed 3D CNNs with a binary target variable (1 for cases, 0 for controls) to predict TKR from baseline MRIs. We used the central 36 slices (each with 444x444 voxels) of the intermediate-weighted fat-saturated Turbo Spin Echo sequence acquired on the Siemens 3T Trio systems1. This sequence was chosen because it captures information about cartilage loss, meniscal tears, subchondral bone marrow edema, and ligament integrity.
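The input preparation (central 36 slices, plus the per-image zero-mean unit-variance normalization described in Figure 1) can be illustrated with a short NumPy helper; the name `preprocess` and the slice-first axis convention are assumptions, not part of the original pipeline description.

```python
import numpy as np

def preprocess(volume, n_slices=36):
    """Keep the central n_slices of a (slices, H, W) MR volume and
    normalize the result to zero mean, unit variance per image."""
    start = (volume.shape[0] - n_slices) // 2
    vol = volume[start:start + n_slices].astype(np.float32)
    return (vol - vol.mean()) / vol.std()
```

For the sequence used here, a full-size input would be a 444x444x36 volume, as noted in Figure 1.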

Figure 3 shows the CNN used in this study. Kernel weights were initialized using the Glorot uniform method7. We used the rectified linear unit (ReLU)8 to apply non-linear transformations after our convolutional layers, with batch normalization (BN)9 over the filter axis before every ReLU. In the first two convolution blocks, we applied max-pooling immediately after each convolution-BN-ReLU operation to downsize the convolution outputs. In later layers, we used two convolution-BN-ReLU blocks before each max-pooling; we repeated these operations twice before applying global average pooling10 to obtain the average value of each channel after the 7th convolution layer. The global average pooling layer is followed by a fully-connected classification layer with a softmax activation. The network was trained with a binary cross-entropy loss function and the Adam11 optimizer with a learning rate of 1e-4. We used a batch size of 8 and early stopping that terminates training when the validation loss does not improve for 15 epochs. The network was trained using stratified four-fold cross-validation.
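The classification head described above (global average pooling10 followed by a fully connected softmax layer) can be sketched as a NumPy forward pass; this is an illustrative computation on a single feature map, not the trained model, and the function name is hypothetical.

```python
import numpy as np

def gap_softmax_head(features, W, b):
    """Global average pooling over the spatial axes of a (C, D, H, W)
    feature map, followed by a fully connected softmax classifier."""
    pooled = features.mean(axis=(1, 2, 3))   # one scalar per channel
    logits = pooled @ W + b                  # fully connected layer
    e = np.exp(logits - logits.max())        # numerically stable softmax
    return e / e.sum()
```

Global average pooling removes all spatial dimensions before classification, which keeps the parameter count of the final layer small regardless of the input volume size.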

The proposed method was compared against Hochberg’s logistic regression model2 that uses baseline age, gender, BMI, race, KOOS Quality of Life score, and WOMAC pain score. In the logistic regression model, we deliberately excluded the KL-grade variable as it directly influences the physician’s decision for TKR, resulting in over-matching.

Results

The proposed 3D CNN model achieved a validation accuracy across the four folds of 77.15±1.88%. ROC analysis of the proposed method and the logistic regression method is shown in Figure 4. The AUC (accuracy) for the CNN model and the logistic regression model was 0.8480±0.0173 (77.15±1.88%) and 0.7716±0.0229 (71.16±2.70%), respectively. Using the DeLong test12, we found the difference between the two AUCs to be statistically significant (p = 0.000212).
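The DeLong test12 compares two correlated AUCs through the structural components of the Mann-Whitney statistic. A minimal NumPy sketch, assuming per-subject prediction scores from both models on the same cases and controls (the function name and signature are illustrative), might look like:

```python
import math
import numpy as np

def delong_test(cases1, ctrls1, cases2, ctrls2):
    """Two-sided DeLong test for the difference between two correlated
    AUCs. cases*/ctrls* are prediction scores for the same subjects."""
    def structural(cases, ctrls):
        # psi = 1 if case outranks control, 0.5 on ties, 0 otherwise
        psi = (cases[:, None] > ctrls[None, :]).astype(float)
        psi += 0.5 * (cases[:, None] == ctrls[None, :])
        return psi.mean(), psi.mean(1), psi.mean(0)  # AUC, V10, V01
    auc1, v10_1, v01_1 = structural(cases1, ctrls1)
    auc2, v10_2, v01_2 = structural(cases2, ctrls2)
    m, n = len(cases1), len(ctrls1)
    s10 = np.cov(np.vstack([v10_1, v10_2]))          # case components
    s01 = np.cov(np.vstack([v01_1, v01_2]))          # control components
    var = (s10[0, 0] + s10[1, 1] - 2 * s10[0, 1]) / m \
        + (s01[0, 0] + s01[1, 1] - 2 * s01[0, 1]) / n
    z = (auc1 - auc2) / math.sqrt(var)
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return auc1, auc2, p
```

The covariance terms are what distinguish this from a naive comparison of two independent AUCs: scores from the same subjects are correlated, which the paired structural components account for.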

Discussion and Conclusion

We presented an automatic method for early TKR prediction using deep learning and structural knee MR images. Our preliminary results indicate the potential of CNNs for predicting TKR by extracting an ensemble of features directly from MR images. We found that MRI-based features extracted by the 3D CNN provide better predictive power for TKR than baseline clinical variables.

In future studies, we plan to incorporate other MRI protocols from the OAI dataset into our CNN-based imaging feature identification model. We also plan to combine CNN-based imaging features with clinical risk factors to improve the accuracy of TKR prediction. Additionally, we plan to investigate the underlying learning behavior of the CNN using visualization tools13-15; this will allow us to determine where the extracted features emerge and how they are related.

Acknowledgements

This work was performed under the rubric of the Center for Advanced Imaging Innovation and Research (CAI2R, www.cai2r.net), an NIBIB Biomedical Technology Resource Center (NIH P41 EB017183).

References

1. Peterfy CG, Schneider E, Nevitt M. The osteoarthritis initiative: report on the design rationale for the magnetic resonance imaging protocol for the knee. Osteoarthritis Cartilage (2008) 16:1433–1441.

2. Hochberg, M.C. et al. Quality of life and radiographic severity of knee osteoarthritis predict total knee arthroplasty: data from the osteoarthritis initiative. Osteoarthritis and Cartilage (2013) Volume 21 , S11.

3. Barr A, Dube B, Hensor EMA, Kingsbury S, Peat G, Bowes MA, Sharples LD, Conaghan PG. The relationship between three-dimensional knee MRI bone shape and total knee replacement - a case control study: data from the Osteoarthritis Initiative. Rheumatology (Oxford) (2016) 55:1585-1593. doi:10.1093/rheumatology/kew191

4. Kumm J, Roemer F, Guermazi A, Turkiewicz A, Englund M. Natural History of Intrameniscal Signal Intensity on Knee MR Images: Six Years of Data from the Osteoarthritis Initiative. Radiology (2016) 278. doi:10.1148/radiol.2015142905

5. Pedoia, Valentina, et al. Predicting Osteoarthritis Radiographic Incidence by Coupling Quantitative Compositional MRI and Deep Learning. ISMRM (2016).

6. Rosenbaum PR, Rubin DB. The central role of the propensity score in observational studies for causal effects. Biometrika (1983) 70:4155.

7. Glorot, Xavier & Bengio, Yoshua. Understanding the difficulty of training deep feedforward neural networks. AISTATS (2010).

8. Nair, Vinod & Hinton, Geoffrey E. Rectified Linear Units Improve Restricted Boltzmann Machines. ICML (2010) 807-814.

9. Ioffe, Sergey & Szegedy, Christian. Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift. arXiv:1502.03167v3 (2015).

10. Lin, Min & Chen, Qiang & Yan, Shuicheng. Network in Network. arXiv:1312.4400v3 (2014).

11. D. P. Kingma and J. Ba. Adam: A Method for Stochastic Optimization. CoRR, vol. abs/1412.6 (2014).

12. DeLong ER, DeLong DM, Clarke-Pearson DL. Comparing the areas under two or more correlated receiver operating characteristic curves: a nonparametric approach. Biometrics (1988) 44:837-845.

13. Yosinski J, Clune J, Nguyen A, Fuchs T, Lipson H. Understanding Neural Networks Through Deep Visualization. ICML Deep Learning Workshop (2015).

14. Nguyen, A., Yosinski, J. & Clune, J. Multifaceted Feature Visualization: Uncovering the Different Types of Features Learned By Each Neuron in Deep Neural Networks. arXiv: 1602.03616 23 (2016).

15. Zeiler, M. & Fergus, R. Visualizing and understanding convolutional networks. Comput. Vision–ECCV 2014 8689, 818–833 (2014).

Figures

Figure 1: The above figure outlines our approach to the problem of predicting total knee replacement. We first take MR images of size 444x444x36 and normalize them to zero mean and unit variance per image prior to CNN training and inference. We train our 3D deep CNN with the binary cross-entropy loss function using the Adam11 optimizer.

Figure 2: This table provides summary statistics of the confounding variables for the matched case-control pairs in our study.

Figure 3: The architecture of the 3D CNN used in this work. Each convolutional layer is followed by batch normalization along the filter axis, and ReLU is used as the activation.

Figure 4: The ROC curves of the proposed CNN model and the logistic regression model2. ROC curves from both models indicate the improved predictive power for TKR obtained using the 3D CNN with structural MRIs.

Proc. Intl. Soc. Mag. Reson. Med. 26 (2018)