Positron emission tomography (PET) is playing an increasing role in radiotherapy treatment planning. However, despite progress, robust algorithms for PET and multimodal image segmentation are still lacking, especially when extended to image-guided and adaptive radiotherapy (IGART). This work presents a novel multimodality segmentation algorithm that uses the Jensen-Rényi divergence (JRD) to evolve a geometric level set contour. The algorithm offers improved noise tolerance, which is particularly relevant to segmenting regions found in PET and cone-beam computed tomography images.
A steepest gradient ascent optimization method is used in conjunction with the JRD and a level set active contour to iteratively evolve a contour that partitions an image based on the statistical divergence of its intensity histograms. The algorithm is evaluated using PET scans of pharyngolaryngeal squamous cell carcinoma with corresponding histological references. The multimodality extension of the algorithm is evaluated using 22 PET/CT scans of patients with lung carcinoma and a physical phantom scanned under varying image quality conditions.
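The divergence driving the contour evolution can be illustrated with a minimal sketch: the JRD of a set of histograms is the Rényi entropy of their weighted mixture minus the weighted sum of their individual Rényi entropies. This is not the authors' implementation; the histogram values, uniform weights, and entropy order α = 2 below are illustrative assumptions.

```python
import numpy as np

def renyi_entropy(p, alpha=2.0):
    """Renyi entropy of order alpha (alpha != 1) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()  # normalize a raw histogram to a probability mass function
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def jensen_renyi_divergence(hists, weights=None, alpha=2.0):
    """JRD: entropy of the weighted mixture minus the weighted sum
    of the individual histogram entropies."""
    hists = [np.asarray(h, dtype=float) / np.sum(h) for h in hists]
    n = len(hists)
    w = np.full(n, 1.0 / n) if weights is None else np.asarray(weights, dtype=float)
    mixture = sum(wi * hi for wi, hi in zip(w, hists))
    return renyi_entropy(mixture, alpha) - sum(
        wi * renyi_entropy(hi, alpha) for wi, hi in zip(w, hists)
    )

# Identical histograms give zero divergence; well-separated ones give a
# large divergence, which is what drives the region partition.
inside = np.array([0.0, 1.0, 9.0, 10.0])    # hypothetical foreground histogram
outside = np.array([10.0, 9.0, 1.0, 0.0])   # hypothetical background histogram
print(jensen_renyi_divergence([inside, inside]))   # ~0
print(jensen_renyi_divergence([inside, outside]))  # > 0
```

In a segmentation setting, the two histograms would be recomputed from the pixels inside and outside the current level set contour at each iteration, and the contour is moved in the direction that increases the divergence.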
The average concordance index (CI) of the JRD segmentation of the PET images was 0.56, with an average classification error of 65%. The segmentation of the lung carcinoma images had a maximum diameter relative error of 63%, 19.5%, and 14.8% when using CT, PET, and combined PET/CT images, respectively. The estimated maximal diameters of the gross tumor volume (GTV) showed a high correlation with the macroscopically determined maximal diameters, with an R² value of 0.85 and 0.88 using the PET and PET/CT images, respectively. Results from the physical phantom show that the JRD is more robust to image noise than mutual information and region growing.
The JRD has shown improved noise tolerance compared to mutual information for the purpose of PET image segmentation. The presented framework for multimodal image segmentation is flexible and can efficiently incorporate a large number of inputs for IGART.
The authors would like to acknowledge Greg Twork for his assistance in collecting the CBCT data and John Kildea for assisting with data collection. The authors also thank Dr. Dekker and Dr. De Ruysscher (MAASTRO clinic, Maastricht, The Netherlands) and Dr. Lee (Université Catholique de Louvain, Brussels, Belgium) for providing the clinical PET datasets. Funding was provided by the Natural Sciences and Engineering Research Council of Canada (NSERC-RGPIN 397711-11) and the Research Institute of the McGill University Health Centre. Dr. Zaidi is also supported by the Swiss National Science Foundation under Grant No. SNSF 31003A-135576 and the Geneva Cancer League. The authors of this paper have no direct financial connection to any commercial entities mentioned in the paper and no competing interests to disclose.
II. MATERIALS AND METHODS
II.A.1. Level sets
II.A.2. The Jensen-Rényi divergence
II.B. Datasets and validation
II.B.1. Experimental phantom studies
II.C. Clinical studies
II.C.1. Louvain database
II.C.2. MAASTRO database
II.C.3. Validation metrics
II.C.4. Hardware and implementation
III.A. Phantom studies
III.B. Clinical PET evaluation: Louvain database
III.C. Clinical PET/CT evaluation: MAASTRO database
- Positron emission tomography
- Medical imaging
- Computed tomography
- Medical image noise