Three-dimensional (3D) prostate image segmentation is useful for cancer diagnosis and therapy guidance, but manual segmentation is time-consuming, and its difficulty and interoperator variability differ across the prostatic base, midgland (MG), and apex. In this study, the authors measured accuracy and interobserver variability in the segmentation of the prostate on T2-weighted endorectal magnetic resonance (MR) imaging within the whole gland (WG), and separately within the apex, midgland, and base regions.
The authors collected MR images from 42 prostate cancer patients. Prostate border delineation was performed manually by one observer on all images and by two other observers on a subset of ten images. The authors used complementary boundary-, region-, and volume-based metrics [mean absolute distance (MAD), Dice similarity coefficient (DSC), recall rate, precision rate, and volume difference (ΔV)] to elucidate the different types of observed segmentation errors. Both expert manual and semiautomatic segmentation approaches were evaluated. Compared to manual segmentation, the authors' semiautomatic approach reduces the necessary user interaction, requiring only an indication of the anteroposterior orientation of the prostate and the selection of prostate center points on the apex, base, and midgland slices. Based on these inputs, the algorithm identifies candidate prostate boundary points using learned boundary appearance characteristics and performs regularization based on learned prostate shape information.
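The region- and volume-based subset of the metrics above (DSC, recall, precision, and ΔV) follows directly from voxel-wise overlap counts between an algorithm mask and a reference mask. The sketch below illustrates these standard definitions; the function name, NumPy representation, and voxel-volume parameter are illustrative choices, not details from the study (MAD, a boundary-based metric, requires surface distance computation and is omitted here).

```python
import numpy as np

def segmentation_metrics(auto_mask, ref_mask, voxel_volume_cm3=1e-3):
    """Overlap metrics between two binary segmentation masks.

    auto_mask, ref_mask: boolean arrays of identical shape
        (algorithm output vs. reference segmentation).
    voxel_volume_cm3: volume of one voxel in cm^3 (illustrative value).
    Returns (DSC, recall, precision, delta_V).
    """
    a = np.asarray(auto_mask, dtype=bool)
    r = np.asarray(ref_mask, dtype=bool)
    tp = np.logical_and(a, r).sum()               # voxels labeled prostate by both
    dsc = 2.0 * tp / (a.sum() + r.sum())          # Dice similarity coefficient
    recall = tp / r.sum()                         # fraction of reference recovered
    precision = tp / a.sum()                      # fraction of auto mask that is correct
    delta_v = (a.sum() - r.sum()) * voxel_volume_cm3  # signed volume difference
    return dsc, recall, precision, delta_v
```

For example, two equal-sized masks that overlap in half their voxels yield DSC = recall = precision = 0.5 and ΔV = 0; a negative ΔV, as reported in the study, indicates that the algorithm's segmentation is smaller than the reference.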
The semiautomated algorithm required an average of 30 s of user interaction time (measured for nine operators) for each 3D prostate segmentation. The authors compared the segmentations from this method to manual segmentations in a single-operator (mean whole gland MAD = 2.0 mm, DSC = 82%, recall = 77%, precision = 88%, and ΔV = − 4.6 cm3) and multioperator study (mean whole gland MAD = 2.2 mm, DSC = 77%, recall = 72%, precision = 86%, and ΔV = − 4.0 cm3). These results compared favorably with observed differences between manual segmentations and a simultaneous truth and performance level estimation reference for this data set (whole gland differences as high as MAD = 3.1 mm, DSC = 78%, recall = 66%, precision = 77%, and ΔV = 15.5 cm3). The authors found that overall, midgland segmentation was more accurate and repeatable than the segmentation of the apex and base, with the base posing the greatest challenge.
The main conclusions of this study were that (1) the semiautomated approach reduced interobserver segmentation variability; (2) the segmentation accuracy of the semiautomated approach, like that of recently published methods from other groups, was within the range of observed expert variability in manual prostate segmentation; and (3) further efforts in the development of computer-assisted segmentation would be most productive if focused on improving segmentation accuracy and reducing variability within the prostatic apex and base.
This work was supported by the Ontario Institute for Cancer Research and the Ontario Research Fund. A. Fenster holds a Canada Research Chair in Biomedical Engineering and acknowledges the support of the Canada Research Chair Program. A. D. Ward holds a Cancer Care Ontario Research Chair in Cancer Imaging.
1. INTRODUCTION
2. MATERIALS AND METHODS
   2.A. Materials
   2.B. Semiautomated segmentation
      2.B.1. Training
      2.B.2. Segmentation
   2.C. Validation metrics
      2.C.1. Mean absolute distance
      2.C.2. Dice similarity coefficient
      2.C.3. Recall rate
      2.C.4. Precision rate
      2.C.5. Volume difference (ΔV)
3. EXPERIMENTS
   3.A. Interoperator variability: Manual segmentation
   3.B. Accuracy and interoperator variability: Semiautomatic segmentation
   3.C. Sensitivity to initialization: Semiautomatic segmentation
   3.D. Sources of variability: Semiautomatic segmentation
4. RESULTS
   4.A. Interoperator variability: Manual segmentation
   4.B. Accuracy and interoperator variability: Semiautomatic segmentation
   4.C. Sensitivity to initialization: Semiautomatic segmentation
      4.C.1. Sensitivity to center point selection
      4.C.2. Sensitivity to anteroposterior symmetry axes selection
   4.D. Sources of variability: Semiautomatic segmentation
5. DISCUSSION
   5.A. Interoperator variability: Manual segmentation
   5.B. Accuracy and interoperator variability: Semiautomatic segmentation
   5.C. Sensitivity to initialization: Semiautomatic segmentation
   5.D. Sources of variability: Semiautomatic segmentation
   5.E. Limitations
   5.F. Conclusions