Automatic segmentation of both the whole prostate gland and the peripheral zone is meaningful work, because the Prostate Imaging Reporting and Data System recommends different evaluation criteria for different regions. Here we present a new method based on deep learning that obtains the prostate outer contour and the peripheral zone contour quickly and accurately without any manual intervention. The mean segmentation accuracies for 262 images are 94.87% (whole prostate gland) and 85.66% (peripheral zone). Even in some extreme cases, such as hyperplasia and cancer, our method shows relatively good performance.
Image Acquisition
Clinical routine MRI was performed on a cohort of 163 subjects, including healthy subjects, prostate cancer patients and prostate hyperplasia patients. Imaging was performed on a 3.0 T Ingenia system (Philips Healthcare, the Netherlands) with a standard protocol, acquiring T2-weighted, diffusion-weighted and dynamic contrast-enhanced images. For each subject, 5 or 6 typical slices were selected for the data set (862 typical slices in total), and 70% of the total was randomly selected as the training set.
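The slice-level split can be written in a few lines; the sketch below is only illustrative (the file names and the fixed seed are hypothetical), with just the slice count and the 70% ratio taken from the text.

```python
# Minimal sketch of a random 70/30 slice-level split (illustrative file names).
import random

random.seed(0)                                        # reproducible split
slices = [f"slice_{i:04d}.npy" for i in range(862)]   # 862 typical slices
random.shuffle(slices)

n_train = int(0.7 * len(slices))                      # 70% for training
train_set, test_set = slices[:n_train], slices[n_train:]
print(len(train_set), len(test_set))                  # 603 / 259
```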
Algorithm description
U-Net is a U-shaped fully convolutional network proposed by Ronneberger et al. in 2015 [3]. It consists of a contracting path that captures context and a symmetric expanding path that, through skip connections, retains the details lost during max pooling. As shown in Fig. 1, we cascade two U-Nets: the first network segments the whole prostate gland, and the second network segments the peripheral zone.
The concrete steps are as follows: first, a region of interest (ROI) is cropped based on the DWI; next, the first U-Net segments the whole prostate gland within the ROI; finally, the second U-Net segments the peripheral zone using the whole-gland result. A code sketch of this pipeline is given below.
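The following PyTorch sketch illustrates the idea of the cascade under stated assumptions: the network depth and channel counts, the DWI-threshold ROI crop in crop_roi, and feeding the whole-gland mask as a second input channel to the second U-Net are illustrative choices of ours, not details given in this abstract.

```python
# A minimal sketch of a cascaded two-U-Net segmentation pipeline (assumptions noted above).
import torch
import torch.nn as nn
import torch.nn.functional as F


def conv_block(in_ch, out_ch):
    """Two 3x3 convolution + ReLU layers, the basic U-Net building block."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )


class SmallUNet(nn.Module):
    """A shallow U-Net: contracting path, bottleneck, expanding path with skip connections."""

    def __init__(self, in_ch=1, base=16):
        super().__init__()
        self.enc1 = conv_block(in_ch, base)
        self.enc2 = conv_block(base, base * 2)
        self.bott = conv_block(base * 2, base * 4)
        self.up2 = nn.ConvTranspose2d(base * 4, base * 2, 2, stride=2)
        self.dec2 = conv_block(base * 4, base * 2)
        self.up1 = nn.ConvTranspose2d(base * 2, base, 2, stride=2)
        self.dec1 = conv_block(base * 2, base)
        self.out = nn.Conv2d(base, 1, 1)               # one-channel probability map

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(F.max_pool2d(e1, 2))
        b = self.bott(F.max_pool2d(e2, 2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return torch.sigmoid(self.out(d1))


def crop_roi(t2w, dwi, margin=8):
    """Crop a rectangular ROI around the prostate using a simple DWI intensity
    threshold (a stand-in for the actual ROI-detection rule)."""
    hot = dwi[0, 0] > dwi.mean() + dwi.std()
    ys, xs = torch.nonzero(hot, as_tuple=True)
    y0, x0 = max(int(ys.min()) - margin, 0), max(int(xs.min()) - margin, 0)
    y1, x1 = int(ys.max()) + margin + 1, int(xs.max()) + margin + 1
    roi = t2w[:, :, y0:y1, x0:x1]
    h, w = roi.shape[-2:]
    roi = F.pad(roi, (0, (-w) % 4, 0, (-h) % 4))       # make H, W divisible by 4
    return roi, (y0, y1, x0, x1)


@torch.no_grad()
def cascade_segment(t2w, dwi, net_gland, net_pz):
    """Step 1: crop the ROI from DWI; step 2: segment the whole gland;
    step 3: segment the peripheral zone given the whole-gland mask."""
    roi, box = crop_roi(t2w, dwi)
    gland = (net_gland(roi) > 0.5).float()
    pz_in = torch.cat([roi, gland], dim=1)             # image + gland mask, 2 channels
    pz = (net_pz(pz_in) > 0.5).float() * gland         # keep the PZ inside the gland
    return gland, pz, box


if __name__ == "__main__":
    net_gland, net_pz = SmallUNet(in_ch=1).eval(), SmallUNet(in_ch=2).eval()
    t2w = torch.rand(1, 1, 256, 256)                   # dummy T2W slice
    dwi = torch.rand(1, 1, 256, 256)                   # dummy DWI slice
    gland, pz, box = cascade_segment(t2w, dwi, net_gland, net_pz)
    print(gland.shape, pz.shape, box)
```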
Accuracy in Our In-House Images
Fig. 2 shows some examples of the segmentation results on T2W images. All manual segmentations in our data set were outlined by three experts, each with more than five years of experience. We evaluated our algorithm against the manual outlines using the Dice coefficient, the over-segmentation coefficient and the under-segmentation coefficient. Two alternatives were compared with the proposed method in the experiments: a single U-Net applied to the original images, and a single U-Net applied to the cropped ROI. (Since the whole prostate gland must be segmented first before the cascade in our method, its whole-gland result is the same as that of the second alternative in Table 1.) The results shown in Fig. 2 and Table 1 indicate that the proposed method produces accurate and detailed segmentations of both the whole prostate gland and the peripheral zone.
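For reference, here is a minimal NumPy sketch of the three metrics; the abstract does not give their exact formulas, so the common definitions (overlap-based Dice, with over- and under-segmentation measured relative to the manual reference) are assumed.

```python
# Sketch of the evaluation metrics under the assumed definitions above.
import numpy as np


def dice(pred, ref):
    """Dice coefficient: 2|P∩R| / (|P| + |R|)."""
    pred, ref = pred.astype(bool), ref.astype(bool)
    inter = np.logical_and(pred, ref).sum()
    return 2.0 * inter / (pred.sum() + ref.sum())


def over_segmentation(pred, ref):
    """Pixels labelled by the algorithm but not by the expert, relative to |R|."""
    pred, ref = pred.astype(bool), ref.astype(bool)
    return np.logical_and(pred, ~ref).sum() / ref.sum()


def under_segmentation(pred, ref):
    """Pixels labelled by the expert but missed by the algorithm, relative to |R|."""
    pred, ref = pred.astype(bool), ref.astype(bool)
    return np.logical_and(~pred, ref).sum() / ref.sum()


if __name__ == "__main__":
    ref = np.zeros((128, 128), dtype=np.uint8)
    ref[40:90, 40:90] = 1                      # toy "manual" mask
    pred = np.zeros_like(ref)
    pred[42:92, 38:88] = 1                     # toy "automatic" mask
    print(dice(pred, ref), over_segmentation(pred, ref), under_segmentation(pred, ref))
```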
Computational Time
The mean segmentation time for one slice is less than 0.1 second on a PC with an Intel Core i5 CPU and an NVIDIA GTX 1060 GPU.
Discussion
It is well known that the prostate gland occupies only a small part of the whole T2W image, and that the peripheral zone is much smaller still and has different features. Cropping the ROI based on DWI and cascading two U-Nets therefore reduces the search space and saves a great deal of training time. As a result, the segmentation accuracy improves for both the whole prostate gland and the peripheral zone. Even in some extreme cases, such as Fig. 2(c) to (e), where hypointense shadows with hazy borders appear in the peripheral zone, our method still obtains relatively good results. Similarly, Table 1 shows that the minimum Dice coefficient of our method is higher, which indicates good robustness. On the other hand, relatively decreased performance is consistently observed in patients with severe lesions whose prostate shape is greatly deformed; therefore, expanding the proportion of cancer patients in the training set will be necessary in future work.
[1]. Siegel R, Ma J, Zou Z, et al. Cancer statistics, 2014[J]. CA: A Cancer Journal for Clinicians, 2014, 64(1): 9.
[2]. Villers A, Lemaitre L, Haffner J, et al. Current status of MRI for the diagnosis, staging and prognosis of prostate cancer: implications for focal therapy and active surveillance[J]. Current opinion in urology, 2009, 19(3): 274-282.
[3]. Ronneberger O, Fischer P, Brox T. U-Net: Convolutional Networks for Biomedical Image Segmentation[C]// International Conference on Medical Image Computing and Computer-Assisted Intervention. Springer, Cham, 2015:234-241.