Volume 42, Issue 7, July 2015
- medical physics letter
- radiation therapy physics
- radiation imaging physics
- radiation measurement physics
- magnetic resonance physics
- nuclear medicine physics
- ultrasound physics
- thermotherapy physics
- tissue measurements
- anatomy and physiology
- radiation protection physics
Index of content:
Pulmonary positron emission tomography (PET) imaging is confounded by blurring artifacts caused by respiratory motion. These artifacts degrade both image quality and quantitative accuracy. In this paper, the authors present a complete data acquisition and processing framework for respiratory motion compensated image reconstruction (MCIR) using simultaneous whole body PET/magnetic resonance (MR) and validate it through simulation and clinical patient studies.

Methods:
The authors have developed an MCIR framework based on maximum a posteriori (MAP) estimation. For fast acquisition of high quality 4D MR images, the authors developed a novel Golden-angle RAdial Navigated Gradient Echo (GRANGE) pulse sequence and used it in conjunction with sparsity-enforcing k-t FOCUSS reconstruction. A 1D slice-projection navigator signal encapsulated within this pulse sequence, together with a histogram-based gate assignment technique, is used to retrospectively sort the MR and PET data into individual gates. The authors compute deformation fields for each gate via nonrigid registration. The deformation fields are incorporated into the PET data model and are also utilized for generating dynamic attenuation maps. The framework was validated using simulation studies on the 4D XCAT phantom and three clinical patient studies performed on the Biograph mMR, a simultaneous whole body PET/MR scanner.

Results:
The authors compared MCIR (MC) results with ungated (UG) and one-gate (OG) reconstruction results. The XCAT study revealed contrast-to-noise ratio (CNR) improvements for MC relative to UG in the range of 21%–107% for 14 mm diameter lung lesions and 39%–120% for 10 mm diameter lung lesions. A strategy for regularization parameter selection was proposed, validated using XCAT simulations, and applied to the clinical studies. The authors’ results show that the MC image yields a 19%–190% increase in the CNR of high-intensity features of interest affected by respiratory motion relative to UG and a 6%–51% increase relative to OG.

Conclusions:
Standalone MR is not the traditional choice for lung scans due to the low proton density, high magnetic susceptibility, and short relaxation times in the lungs. By developing and validating this PET/MR pulmonary imaging framework, the authors show that simultaneous PET/MR, unique in its capability of combining structural information from MR with functional information from PET, shows promise in pulmonary imaging.
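As a point of reference for the CNR figures quoted above, the contrast-to-noise ratio is conventionally computed from a lesion region of interest and a background region of interest. A minimal sketch (the ROI values below are made up for illustration and are not the paper's data):

```python
import numpy as np

def cnr(lesion: np.ndarray, background: np.ndarray) -> float:
    """Contrast-to-noise ratio: (mean lesion - mean background) / background SD.
    Motion blur lowers the lesion mean and smears counts into the background,
    which is why motion compensation improves CNR."""
    return (lesion.mean() - background.mean()) / background.std()

# Hypothetical voxel values from a lesion ROI and a background ROI.
lesion = np.array([10.0, 11.0, 9.0, 10.0])
background = np.array([2.0, 3.0, 2.0, 3.0])
print(cnr(lesion, background))  # (10.0 - 2.5) / 0.5 = 15.0
```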
42 (2015); http://dx.doi.org/10.1118/1.4919281
- MEDICAL PHYSICS LETTER
42 (2015); http://dx.doi.org/10.1118/1.4922206

Purpose:
Developed herein is a three-dimensional (3D) flow contrast imaging system leveraging advancements in the extension of laser speckle contrast imaging theories to deep tissues, along with the authors’ recently developed finite-element diffuse correlation tomography (DCT) reconstruction scheme. This technique, termed speckle contrast diffuse correlation tomography (scDCT), enables incorporation of complex optical property heterogeneities and sample boundaries. When combined with a reflectance-based design, this system facilitates a rapid segue into flow contrast imaging of larger in vivo subjects such as humans.

Methods:
A highly sensitive CCD camera was integrated into a reflectance-based optical system. Four long-coherence laser source positions were coupled to an optical switch, sequencing the tomographic data acquisition to provide multiple projections through the sample. The system was investigated using liquid and solid tissue-like phantoms exhibiting optical properties and flow characteristics typical of human tissues. Computer simulations were also performed for comparison. A smear correction algorithm, necessitated by this particular combination of frame-transfer CCD and reflectance setup, was employed to correct for point-source illumination contributions during image capture.

Results:
Measurements with scDCT on a homogeneous liquid phantom showed that speckle contrast-based deep flow indices were within 12% of those from standard DCT. Submerging a solid phantom below the liquid phantom surface allowed heterogeneity detection to be validated. The heterogeneity was identified successfully by the reconstructed 3D flow contrast tomography with scDCT: its recovered center, dimensions, and localization agreed with the actual geometry, and its average relative flow agreed with computer simulations to within 3%.

Conclusions:
A custom cost-effective CCD-based reflectance 3D flow imaging system demonstrated rapid acquisition of dense boundary data and, pending further studies, a high potential for translation to real tissues with arbitrary boundaries. A requisite smear correction was also identified for scDCT-style measurements to recover accurate speckle contrast from deep tissues.
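The speckle contrast underlying scDCT is the local ratio of intensity standard deviation to mean intensity: faster flow blurs the speckle pattern during the camera exposure and lowers the contrast. A toy illustration with synthetic intensities (not the system's reconstruction):

```python
import numpy as np

def speckle_contrast(intensity) -> float:
    """Speckle contrast K = sigma / mean of the detected intensity.
    Fully developed static speckle has K near 1; motion blurring
    during the exposure drives K toward 0."""
    intensity = np.asarray(intensity, dtype=float)
    return intensity.std() / intensity.mean()

rng = np.random.default_rng(0)
static = rng.exponential(scale=1.0, size=10_000)    # static speckle, K ~ 1
blurred = 1.0 + 0.1 * rng.standard_normal(10_000)   # motion-blurred, K << 1
print(speckle_contrast(static) > speckle_contrast(blurred))
```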
- RADIATION THERAPY PHYSICS
42 (2015); http://dx.doi.org/10.1118/1.4921615

Purpose:
This work presents a method for fast volumetric modulated arc therapy (VMAT) adaptation in response to interfraction anatomical variations. Additionally, plan parameters extracted from the adapted plans are used to verify the quality of these plans. The methods were tested as a prostate class solution and compared to replanning and to the authors’ current clinical practice.

Methods:
The proposed VMAT adaptation is an extension of the authors’ previous intensity modulated radiotherapy (IMRT) adaptation. It follows a direct (forward) planning approach: the multileaf collimator (MLC) apertures are corrected in the beam’s eye view (BEV) and the monitor units (MUs) are corrected using point dose calculations. All MLC and MU corrections are driven by the positions of four fiducial points only, without the need for a full contour set. Quality assurance (QA) of the adapted plans is performed using plan parameters that can be calculated online and that are related to the delivered dose or the plan quality. Five potential parameters are studied for this purpose: the number of MU, the equivalent field size (EqFS), the modulation complexity score (MCS), and the components of the MCS: the aperture area variability (AAV) and the leaf sequence variability (LSV). The full adaptation and its separate steps were evaluated in simulation experiments involving a prostate phantom subjected to various interfraction transformations. The efficacy of the current VMAT adaptation was scored by target mean dose (CTVmean), conformity (CI95%), tumor control probability (TCP), and normal tissue complication probability (NTCP). The impact of the adaptation on the plan parameters (QA) was assessed by comparison with prediction intervals (PI) derived from a statistical model of the typical variation of these parameters in a population of VMAT prostate plans (n = 63). These prediction intervals are the adaptation equivalent of the tolerance tables for couch shifts in the current clinical practice.

Results:
The proposed adaptation of a two-arc VMAT plan resulted in the intended CTVmean (Δ ≤ 3%) and TCP (ΔTCP ≤ 0.001). Moreover, the method assures the intended CI95% (Δ ≤ 11%), resulting in lowered rectal NTCP for all cases. Compared to replanning, the authors’ adaptation is faster (13 s vs 10 min) and more intuitive. Compared to the current clinical practice, it provides better protection of the healthy tissue. Compared to IMRT, VMAT is more robust to anatomical variations, but it is also less sensitive to the different correction steps. The observed variations of the plan parameters in the authors’ database included a linear dependence on the date of treatment planning and on the target radius. The MCS is not retained as a QA metric due to the contrasting behavior of its components (LSV and AAV). If three out of four plan parameters (MU, EqFS, AAV, and LSV) need to lie inside a 50% prediction interval (3/4—50%PI), all adapted plans are accepted. In contrast, none of the replanned plans meets this loose criterion, mainly because they have no connection to the initially optimized and verified plan.

Conclusions:
A direct (forward) VMAT adaptation performs as well as (inverse) replanning but is faster and can be extended to real-time adaptation. The prediction intervals for the machine parameters are equivalent to the tolerance tables for couch shifts in the current clinical practice. A 3/4—50%PI QA criterion accepts all the adapted plans but rejects all the replanned plans.
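The 3/4—50%PI acceptance rule described above reduces to a simple count of parameters inside their prediction intervals. A sketch with hypothetical parameter values and intervals (for illustration only; the real intervals come from the authors' statistical model of 63 plans):

```python
def plan_accepted(values, intervals, required=3):
    """Accept an adapted plan if at least `required` of the online plan
    parameters (here: MU, EqFS, AAV, LSV) fall inside their 50%
    prediction intervals derived from a population of prior plans."""
    inside = sum(lo <= v <= hi for v, (lo, hi) in zip(values, intervals))
    return inside >= required

# Hypothetical adapted-plan values and 50% prediction intervals.
values = (540.0, 9.8, 0.42, 0.81)   # MU, EqFS (cm), AAV, LSV
intervals = [(500, 560), (9.0, 11.0), (0.50, 0.60), (0.75, 0.90)]
print(plan_accepted(values, intervals))  # 3 of 4 inside -> True
```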
Development and evaluation of aperture-based complexity metrics using film and EPID measurements of static MLC openings
42 (2015); http://dx.doi.org/10.1118/1.4921733

Purpose:
Complexity metrics have been suggested as a complement to measurement-based quality assurance for intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT). However, these metrics have not yet been sufficiently validated. This study develops and evaluates new aperture-based complexity metrics in the context of static multileaf collimator (MLC) openings and compares them to previously published metrics.

Methods:
This study develops the converted aperture metric and the edge area metric. The converted aperture metric is based on small and irregular parts within the MLC opening, quantified as measured distances between MLC leaves. The edge area metric is based on the relative size of the region around the edges defined by the MLC. Another metric suggested in this study is the circumference/area ratio. Previously defined aperture-based complexity metrics—the modulation complexity score, the edge metric, the ratio of monitor units (MU) to Gy, the aperture area, and the aperture irregularity—are compared to the newly proposed metrics. A set of small and irregular static MLC openings is created to simulate individual IMRT/VMAT control points of various complexities. These are measured with both an amorphous silicon electronic portal imaging device and EBT3 film. The differences between calculated and measured dose distributions are evaluated using a pixel-by-pixel comparison with two global dose difference criteria of 3% and 5%. The extent of the dose differences, expressed in terms of pass rate, is used as a measure of the complexity of the MLC openings and for the evaluation of the metrics compared in this study. The different complexity scores are calculated for each created static MLC opening. The correlation between the calculated complexity scores and the extent of the dose differences (pass rate) is analyzed in scatter plots and using Pearson’s r-values.

Results:
The complexity scores calculated by the edge area metric, converted aperture metric, circumference/area ratio, edge metric, and MU/Gy ratio show good linear correlation to the complexity of the MLC openings, expressed as the 5% dose difference pass rate, with Pearson’s r-values of −0.94, −0.88, −0.84, −0.89, and −0.82, respectively. The overall trends for the 3% and 5% dose difference evaluations are similar.

Conclusions:
New complexity metrics are developed. The calculated scores correlate to the complexity of the created static MLC openings. The complexity of the MLC opening depends on the size of the penumbra region relative to the area of the opening. The aperture-based complexity metrics that combine either the distances between the MLC leaves or the MLC opening circumference with the aperture area show the best correlation with the complexity of the static MLC openings.
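Two of the aperture-shape quantities discussed here are straightforward to compute: the circumference/area ratio proposed in the study, and aperture irregularity in its commonly used perimeter²/(4π·area) form from the literature. A sketch (illustrative geometry, not the paper's MLC openings):

```python
import math

def circumference_area_ratio(perimeter_mm: float, area_mm2: float) -> float:
    """Circumference/area ratio: finger-like MLC openings have a large
    perimeter relative to their enclosed area."""
    return perimeter_mm / area_mm2

def aperture_irregularity(perimeter_mm: float, area_mm2: float) -> float:
    """perimeter^2 / (4*pi*area): equals 1 for a circle and grows with
    shape complexity (a common literature definition)."""
    return perimeter_mm ** 2 / (4 * math.pi * area_mm2)

# A circular opening of radius 10 mm vs a 2 mm x 157 mm slit of similar area.
circle = aperture_irregularity(2 * math.pi * 10, math.pi * 10 ** 2)
slit = aperture_irregularity(2 * (2 + 157), 2 * 157)
print(circle < slit)  # the elongated slit scores as far more complex
```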
Investigating the limits of PET/CT imaging at very low true count rates and high random fractions in ion-beam therapy monitoring
42 (2015); http://dx.doi.org/10.1118/1.4921995

Purpose:
External beam radiotherapy with protons and heavier ions enables a tighter conformation of the applied dose to arbitrarily shaped tumor volumes with respect to photons, but is more sensitive to uncertainties in the radiotherapeutic treatment chain. Consequently, an independent verification of the applied treatment is highly desirable. For this purpose, the irradiation-induced β+-emitter distribution within the patient is detected shortly after irradiation by a commercial full-ring positron emission tomography/x-ray computed tomography (PET/CT) scanner installed next to the treatment rooms at the Heidelberg Ion-Beam Therapy Center (HIT). A major challenge to this approach is posed by the small number of detected coincidences. This contribution aims at characterizing the performance of the PET/CT device used and at identifying the best-performing reconstruction algorithm under the particular statistical conditions of PET-based treatment monitoring. Moreover, this study addresses the impact of radiation background from the intrinsically radioactive lutetium-oxyorthosilicate (LSO)-based detectors at low counts.

Methods:
The authors have acquired 30 subsequent PET scans of a cylindrical phantom emulating a patientlike activity pattern and spanning the entire patient counting regime in terms of true coincidences and random fractions (RFs). Accuracy and precision of activity quantification, image noise, and geometrical fidelity of the scanner have been investigated for various reconstruction algorithms and settings in order to identify a practical, well-suited reconstruction scheme for PET-based treatment verification. Truncated listmode data have been utilized for separating the effects of small true count numbers and high RFs on the reconstructed images. A corresponding simulation study enabled extending the results to an even wider range of counting statistics and to additionally investigate the impact of scatter coincidences. Eventually, the recommended reconstruction scheme has been applied to exemplary postirradiation patient data-sets.

Results:
Among the investigated reconstruction options, the overall best results in terms of image noise, activity quantification, and accurate geometrical recovery were achieved using the ordered subset expectation maximization reconstruction algorithm with time-of-flight (TOF) and point-spread function (PSF) information. For this algorithm, reasonably accurate (better than 5%) and precise (uncertainty of the mean activity below 10%) imaging can be provided down to 80 000 true coincidences at 96% RF. Image noise and geometrical fidelity are generally improved for fewer iterations. The main limitation for PET-based treatment monitoring has been identified as the small number of true coincidences, rather than the high intrinsic random background. Application of the optimized reconstruction scheme to patient data-sets results in 25%–50% lower image noise at comparable activity quantification accuracy and in improved geometrical performance with respect to the reconstruction scheme formerly used at HIT, adopted from nuclear medicine applications.

Conclusions:
Under the poor statistical conditions of PET-based treatment monitoring, improved results can be achieved by considering PSF and TOF information during image reconstruction and by applying fewer iterations than in conventional nuclear medicine imaging. Geometrical fidelity and image noise are mainly limited by the low number of true coincidences, not by the high LSO-related random background. The retrieved results might also impact other emerging PET applications at low counting statistics.
Maximizing the probability of satisfying the clinical goals in radiation therapy treatment planning under setup uncertainty
42 (2015); http://dx.doi.org/10.1118/1.4921998

Purpose:
This paper introduces a method that maximizes the probability of satisfying the clinical goals in intensity-modulated radiation therapy treatments subject to setup uncertainty.

Methods:
The authors perform robust optimization in which the clinical goals are constrained to be satisfied whenever the setup error falls within an uncertainty set. The shape of the uncertainty set is included as a variable in the optimization. The goal of the optimization is to modify the shape of the uncertainty set in order to maximize the probability that the setup error will fall within the modified set. Because the constraints enforce the clinical goals to be satisfied under all setup errors within the uncertainty set, this is equivalent to maximizing the probability of satisfying the clinical goals. This type of robust optimization is studied with respect to photon and proton therapy applied to a prostate case and compared to robust optimization using an a priori defined uncertainty set.

Results:
Slight reductions of the uncertainty sets resulted in plans that satisfied a larger number of clinical goals than optimization with respect to a priori defined uncertainty sets, both within the reduced uncertainty sets and within the a priori, nonreduced, uncertainty sets. For the prostate case, the plans taking reduced uncertainty sets into account satisfied 1.4 (photons) and 1.5 (protons) times as many clinical goals over the scenarios as the method taking a priori uncertainty sets into account.

Conclusions:
Reducing the uncertainty sets enabled the optimization to find better solutions with respect to the errors within the reduced as well as the nonreduced uncertainty sets and thereby achieve higher probability of satisfying the clinical goals. This shows that asking for a little less in the optimization sometimes leads to better overall plan quality.
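The quantity being maximized, the probability that the setup error falls inside the (possibly reduced) uncertainty set, can be estimated by plain Monte Carlo. The Gaussian error model and box-shaped set below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def coverage_probability(half_widths, sigma=2.0, n=200_000, seed=1):
    """Monte Carlo estimate of the probability that a 3D Gaussian setup
    error (SD `sigma` mm per axis; an illustrative model) falls inside a
    box-shaped uncertainty set with the given half-widths in mm."""
    rng = np.random.default_rng(seed)
    errors = sigma * rng.standard_normal((n, 3))
    inside = np.all(np.abs(errors) <= np.asarray(half_widths), axis=1)
    return float(inside.mean())

# Shrinking the set lowers the probability that the error is covered;
# the optimization trades this against satisfying the goals inside the set.
print(coverage_probability([4, 4, 4]) < coverage_probability([5, 5, 5]))
```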
Comparison of breathing gated CT images generated using a 5DCT technique and a commercial clinical protocol in a porcine model
42 (2015); http://dx.doi.org/10.1118/1.4922201

Purpose:
To demonstrate that a “5DCT” technique, which utilizes fast helical acquisition, yields the same respiratory-gated images as a commercial technique for regular, mechanically produced breathing cycles.

Methods:
Respiratory-gated images of an anesthetized, mechanically ventilated pig were generated using a Siemens low-pitch helical protocol and 5DCT for a range of breathing rates and amplitudes and with standard and low dose imaging protocols. 5DCT reconstructions were independently evaluated by measuring the distances between tissue positions predicted by a 5D motion model and those measured using deformable registration, as well as by reconstructing the originally acquired scans. Discrepancies between the 5DCT and commercial reconstructions were measured using landmark correspondences.

Results:
The mean distance between model predicted tissue positions and deformably registered tissue positions over the nine datasets was 0.65 ± 0.28 mm. Reconstructions of the original scans were on average accurate to 0.78 ± 0.57 mm. Mean landmark displacement between the commercial and 5DCT images was 1.76 ± 1.25 mm, while the maximum lung tissue motion over the breathing cycle had a mean value of 27.2 ± 4.6 mm. An image composed of the average of 30 deformably registered images acquired with a low dose protocol had 6 HU image noise (single standard deviation) in the heart versus 31 HU for the commercial images.

Conclusions:
An end-to-end evaluation of the 5DCT technique was conducted through landmark-based comparison to breathing gated images acquired with a commercial protocol under highly regular ventilation. The techniques were found to agree to within 2 mm for most respiratory phases and most points in the lung.
Toward a clinical application of ex situ boron neutron capture therapy for lung tumors at the RA-3 reactor in Argentina
42 (2015); http://dx.doi.org/10.1118/1.4922158

Purpose:
Many types of lung tumors have a very poor prognosis due to their spread throughout the whole organ volume. The fact that boron neutron capture therapy (BNCT) would allow for selective targeting of all the nodules regardless of their position prompted a preclinical feasibility study of ex situ BNCT at the thermal neutron facility of the RA-3 reactor in the province of Buenos Aires, Argentina. Biodistribution studies of the l-4-dihydroxyborylphenylalanine–fructose complex (BPA-F) in an adult sheep model and computational dosimetry for a human explanted lung were performed to evaluate the feasibility and the therapeutic potential of ex situ BNCT.

Methods:
Two kinds of boron biodistribution studies were carried out in the healthy sheep: a set of pharmacokinetic studies without lung excision, and a set that consisted of evaluation of boron concentration in the explanted and perfused lung. In order to assess the feasibility of the clinical application of ex situ BNCT at RA-3, a case of multiple lung metastases was analyzed. A detailed computational representation of the geometry of the lung was built based on a real collapsed human lung. Dosimetric calculations and dose limiting considerations were based on the experimental results from the adult sheep and on the most suitable information published in the literature. In addition, a workable treatment plan was considered to assess the clinical application in a realistic scenario.

Results:
Concentration-time profiles for the normal sheep showed that the boron kinetics in blood, lung, and skin would adequately represent the boron behavior and absolute uptake expected in human tissues. Results strongly suggest that the distribution of the boron compound is spatially homogeneous in the lung. A constant lung-to-blood ratio of 1.3 ± 0.1 was observed from 80 min after the end of BPA-F infusion. Because this ratio remains constant over time, the blood boron concentration can be used as a surrogate, indirect quantification of the expected value in the explanted healthy lung. The proposed preclinical animal model allowed for the study of the explanted lung. As expected, the boron concentration values fell as a result of the application of the preservation protocol required to preserve lung function. The distribution of the boron concentration retention factor was obtained for healthy lung, with a mean value of 0.46 ± 0.14, consistent with that reported for a metastatic colon carcinoma model in rat perfused lung. Considering the human lung model and a suitable tumor control probability for lung cancer, a promising average fraction of controlled lesions higher than 85% was obtained even for a low tumor-to-normal boron concentration ratio of 2.

Conclusions:
This work reports for the first time data supporting the validity of the ovine model as an adequate human surrogate in terms of boron kinetics and uptake in clinically relevant tissues. Collectively, the results and analysis presented strongly suggest that ex situ whole lung BNCT irradiation is a feasible and highly promising technique that could greatly contribute to the treatment of metastatic lung disease in patients without extrapulmonary spread, increasing not only the expected overall survival but also the resulting quality of life.
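The "fraction of controlled lesions" quoted above rests on a tumor control probability model. A minimal Poisson TCP sketch with a linear cell-kill term, using purely illustrative parameter values (not the paper's dosimetry or its published TCP data):

```python
import math

def tcp_linear_poisson(n_clonogens: float, alpha: float, dose_gy: float) -> float:
    """Poisson TCP with a simple linear cell-kill model (illustrative only):
    TCP = exp(-N0 * exp(-alpha * D)), the probability that no clonogenic
    cell survives the dose D."""
    return math.exp(-n_clonogens * math.exp(-alpha * dose_gy))

# A higher tumor-to-normal boron ratio raises the tumor dose achievable at a
# fixed normal-lung dose limit, driving the fraction of controlled lesions.
low = tcp_linear_poisson(1e7, 0.35, 40.0)
high = tcp_linear_poisson(1e7, 0.35, 60.0)
print(low < high)
```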
Development of an accurate EPID-based output measurement and dosimetric verification tool for electron beam therapy
42 (2015); http://dx.doi.org/10.1118/1.4922400

Purpose:
To develop an efficient and robust tool for output measurement and absolute dose verification of electron beam therapy by using a high spatial-resolution and high frame-rate amorphous silicon flat panel electronic portal imaging device (EPID).

Methods:
The dosimetric characteristics of the EPID, including saturation, linearity, and ghosting effect, were first investigated on a Varian Clinac 21EX accelerator. The response kernels of the individual pixels of the EPID to all available electron energies (6, 9, 12, 16, and 20 MeV) were calculated using Monte Carlo (MC) simulations, forming the basis for deconvolving raw EPID images into the incident electron fluence map. The two-dimensional (2D) dose distribution at reference depths in water was obtained from the constructed fluence map with an MC simulated pencil beam kernel, taking into account the geometric and structural information of the EPID. Output factor measurements were carried out with the EPID at a nominal source–surface distance of 100 cm for 2 × 2, 3 × 3, 6 × 6, 10 × 10, and 15 × 15 cm2 fields for all available electron energies, and the results were compared with those measured in a solid water phantom using film and a Farmer-type ion chamber. The dose distributions at a reference depth specific to each energy and the flatness and symmetry of the 10 × 10 cm2 electron beam were also measured using the EPID, and the results were compared with ion chamber array and water scan measurements. Finally, three patient cases with various field sizes and irregular cutout shapes were also investigated.

Results:
EPID-measured dose changed linearly with the monitor units and showed little ghosting effect for dose rates up to 600 MU/min. The flatness and symmetry measured with the EPID were found to be consistent with ion chamber array and water scan measurements. The EPID-measured output factors for standard square fields of 2 × 2, 3 × 3, 6 × 6, 10 × 10, and 15 × 15 cm2 agreed with film and ion chamber measurements. The average discrepancy between EPID and ion chamber/film measurements was 0.81% ± 0.60% (SD) and 1.34% ± 0.75%, respectively. For the three clinical cases, the difference in output between the EPID- and ion chamber array measured values was found to be 1.13% ± 0.11%, 0.54% ± 0.10%, and 0.74% ± 0.11%, respectively. Furthermore, the γ-index analysis showed excellent agreement between the EPID- and ion chamber array measured dose distributions: 100% of the pixels passed the criteria of 3%/3 mm. When the criteria were tightened to 2%/2 mm, the pass rate was found to be 99.0% ± 0.07%, 98.2% ± 0.14%, and 100% for the three cases.

Conclusions:
The EPID dosimetry system developed in this work provides an accurate and reliable tool for routine output measurement and dosimetric verification of electron beam therapy. Coupled with its portability and ease of use, the proposed system promises to replace the current film-based approach for fast and reliable assessment of small and irregular electron field dosimetry.
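The γ-index analysis used for these comparisons combines a dose-difference and a distance-to-agreement criterion. A simplified 1D version with global normalization and toy profiles (the real analysis is 2D) conveys the idea:

```python
import numpy as np

def gamma_pass_rate(measured, calculated, positions_mm, dose_tol=0.03, dist_mm=3.0):
    """Simplified 1D global gamma analysis (3%/3 mm by default): a measured
    point passes if some calculated point is simultaneously close in dose
    (relative to the global maximum) and in position."""
    d_max = calculated.max()
    pass_flags = []
    for xm, dm in zip(positions_mm, measured):
        dose_term = (dm - calculated) / (dose_tol * d_max)
        dist_term = (positions_mm - xm) / dist_mm
        gamma = np.sqrt(dose_term**2 + dist_term**2).min()
        pass_flags.append(gamma <= 1.0)
    return float(np.mean(pass_flags))

x = np.linspace(-20, 20, 81)                     # 0.5 mm grid
calculated = np.exp(-x**2 / (2 * 8.0**2))        # toy beam profile
measured = 1.01 * calculated                     # 1% global offset
print(gamma_pass_rate(measured, calculated, x))  # well within 3%/3 mm -> 1.0
```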
42 (2015); http://dx.doi.org/10.1118/1.4922687

Purpose:
The authors studied the respiratory motion effect in single-fraction step-and-shoot intensity modulated radiotherapy (IMRT) to assess the basic properties of the uncertainty in the delivered dose due to the unknown starting phase of the motion.

Methods:
Using computer simulations, the motion-averaged dose for open beams with various field sizes was calculated for two one-dimensional trajectories with different motion amplitudes at 20 equally spaced starting phases. The properties of the standard deviation (SD) of the delivered dose were analyzed. The dependence of the SD on the field size, motion amplitude, and delivery time was investigated and experimentally validated. To study the effect of the number of small monitor unit (MU) segments on the dose uncertainty, the authors generated 1000 pairs of multisegment beams such that each pair consists of two beams with the same total MU but different segment MU. The SD at the central axis point was compared for each pair.

Results:
The authors proved that the direct time-dependent dose accumulation can be calculated using a convolution formula for a single-fraction step-and-shoot IMRT treatment. Single segment simulation showed that the maximum dose uncertainty occurred symmetrically at the beam penumbra for a sinusoidal motion. For other sinusoidal motions (sin^2n, n > 1), the maximum dose uncertainty occurred at asymmetrical locations and may lie beyond the penumbra region. The SD of relative dose varied periodically with delivery time, with a decreasing envelope for both motion trajectories. The SD of absolute dose was a periodic function of the delivery time for a given field size and motion amplitude, and this was proved to hold for any periodic motion. The SD reduced to zero when the delivery time was an integer multiple of the motion period. An analytical function was found to fit the delivery time dependence of the SD for the motions studied in this paper and was verified with experimental data and an irregular motion. The dose uncertainty increased with motion amplitude and decreased slowly with field size. Simulations for 1000 beam pairs with multiple segments demonstrated that the probability that a beam with more small MU segments did not introduce larger dose uncertainty at the central axis point was 55.8%/51.9%/43.4% and 54.6%/54.4%/45% for three small-MU cutoffs (2.5/5/10), respectively, for a sin and a sin^4 motion in a conventional treatment. These probabilities became 53.6%/50.9%/47.2% and 51.0%/50.2%/47.8%, respectively, for a hypofractionated treatment.

Conclusions:
The periodic dependence of the dose uncertainty on the delivery time can be modeled with an analytic function, and the functional form is independent of the motion trajectories studied in this paper. The relation of dose uncertainty between different dose schemes can be obtained using this function. The dose uncertainty at the central axis point for a beam with more small MU segments may not be greater than that for a beam with fewer small MU segments. By varying the dose rate of each segment such that the delivery time is close to an integer multiple of the motion period, the interplay effect can be reduced.
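The convolution result mentioned above, that the dose averaged over many breathing cycles equals the static dose profile convolved with the probability density of the target displacement, can be demonstrated numerically. The flat 1D field and sinusoidal motion below are toy inputs, not the paper's beam models:

```python
import numpy as np

def motion_averaged_dose(static_dose, motion_pdf):
    """Dose averaged over many motion cycles: the static dose profile
    convolved with the (normalized) displacement probability density."""
    return np.convolve(static_dose, motion_pdf, mode="same") / motion_pdf.sum()

# Toy 1 mm grid: a flat 40 mm field blurred by the displacement PDF of a
# 10 mm amplitude sinusoidal motion (illustrative numbers only).
x = np.arange(-40, 41)
static = ((x >= -20) & (x <= 20)).astype(float)
amplitude = 10
phase = np.linspace(0.0, 2.0 * np.pi, 10_000, endpoint=False)
disp = np.round(amplitude * np.sin(phase)).astype(int)
pdf = np.bincount(disp + amplitude, minlength=2 * amplitude + 1).astype(float)
blurred = motion_averaged_dose(static, pdf)
print(blurred[40] > 0.99, blurred[60] < 0.75)  # center intact, edge blurred
```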
A data-mining framework for large scale analysis of dose-outcome relationships in a database of irradiated head and neck cancer patients
42 (2015); http://dx.doi.org/10.1118/1.4922686

Purpose:
To develop a hypothesis-generating framework for automatic extraction of dose-outcome relationships from an in-house, analytic oncology database.

Methods:
Dose–volume histograms (DVH) and clinical outcomes have been routinely stored to the authors’ database for 684 head and neck cancer patients treated from 2007 to 2014. Database queries were developed to extract outcomes that had been assessed for at least 100 patients, as well as DVH curves for organs-at-risk (OAR) that were contoured for at least 100 patients. DVH curves for paired OAR (e.g., left and right parotids) were automatically combined and included as additional structures for analysis. For each OAR-outcome combination, only patients with both OAR and outcome records were analyzed. DVH dose points at a given normalized volume threshold Vt were stratified into two groups based on severity of toxicity outcomes after treatment completion. The probability of an outcome was modeled at each Vt = [0%, 1%, …, 100%] by logistic regression. Notable OAR-outcome combinations were defined as having statistically significant regression parameters (p < 0.05) and an odds ratio of at least 1.05 (5% increase in odds per Gy).

Results:
A total of 57 individual and combined structures and 97 outcomes were queried from the database. Of all possible OAR-outcome combinations, 17% resulted in significant logistic regression fits (p < 0.05) having an odds ratio of at least 1.05. Further manual inspection revealed a number of reasonable models based on either reported literature or proximity between neighboring OARs. The data-mining algorithm confirmed the following well-known OAR-dose/outcome relationships: dysphagia/larynx, voice changes/larynx, esophagitis/esophagus, xerostomia/parotid glands, and mucositis/oral mucosa. Several surrogate relationships, defined as OAR not directly attributed to an outcome, were also observed, including esophagitis/larynx, mucositis/mandible, and xerostomia/mandible.

Conclusions:
Prospective collection of clinical data has enabled large-scale analysis of dose-outcome relationships. The current data-mining framework revealed both known and novel dosimetric and clinical relationships, underscoring the potential utility of this analytic approach in hypothesis generation. Multivariate models and advanced, 3D dosimetric features may be necessary to further evaluate the complex relationship between neighboring OAR and observed outcomes.
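The per-threshold modeling described above can be sketched as follows: for one volume threshold Vt, regress the binary outcome on the dose and report the odds ratio per Gy as exp(slope). The cohort below is synthetic, and the fitting routine is a plain gradient-descent stand-in, not the authors' statistics pipeline:

```python
import numpy as np

def fit_logistic(dose_gy, outcome, lr=0.5, iters=5_000):
    """Univariate logistic regression of a binary toxicity outcome on the
    DVH dose at one volume threshold, fit by gradient ascent on the
    log-likelihood using standardized dose. Returns (intercept, slope/Gy)."""
    x = np.asarray(dose_gy, dtype=float)
    y = np.asarray(outcome, dtype=float)
    mu, sd = x.mean(), x.std()
    z = (x - mu) / sd                      # standardize for stable steps
    a0 = a1 = 0.0
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(a0 + a1 * z)))
        a0 += lr * np.mean(y - p)
        a1 += lr * np.mean((y - p) * z)
    return a0 - a1 * mu / sd, a1 / sd      # back-transform to the Gy scale

# Synthetic cohort: toxicity odds increase with dose (true OR ~ exp(0.08)/Gy).
rng = np.random.default_rng(42)
dose = rng.uniform(0.0, 70.0, 400)
p_true = 1.0 / (1.0 + np.exp(-0.08 * (dose - 35.0)))
tox = (rng.random(400) < p_true).astype(float)
_, slope = fit_logistic(dose, tox)
print(round(np.exp(slope), 3))  # odds ratio per Gy, screened against 1.05
```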
- RADIATION IMAGING PHYSICS
Multiatlas whole heart segmentation of CT data using conditional entropy for atlas ranking and selection. 42(2015); http://dx.doi.org/10.1118/1.4921366. Purpose:
Cardiac computed tomography (CT) is widely used in clinical diagnosis of cardiovascular diseases. Whole heart segmentation (WHS) plays a vital role in developing new clinical applications of cardiac CT. However, the shape and appearance of the heart can vary greatly across different scans, making automatic segmentation particularly challenging. The objective of this work is to develop and evaluate a multiatlas segmentation (MAS) scheme using a new atlas ranking and selection algorithm for automatic WHS of CT data. Research on different MAS strategies and their influence on WHS performance is limited. This work provides a detailed comparison study evaluating the impacts of label fusion, atlas ranking, and sizes of the atlas database on the segmentation performance.Methods:
Atlases in a database were registered to the target image using a hierarchical registration scheme specifically designed for cardiac images. A subset of the atlases was then selected for label fusion according to the authors’ proposed atlas ranking criterion, which evaluated the performance of each atlas by computing the conditional entropy of the target image given the propagated atlas labeling. Joint label fusion was used to combine multiple label estimates to obtain the final segmentation. The authors used 30 clinical cardiac CT angiography (CTA) images to evaluate the proposed MAS scheme and to investigate different segmentation strategies.Results:
The mean WHS Dice score of the proposed MAS method was 0.918 ± 0.021, and the mean runtime for one case was 13.2 min on a workstation. This MAS scheme using joint label fusion generated significantly better Dice scores than the other label fusion strategies, including majority voting (0.901 ± 0.0276, p < 0.01), locally weighted voting (0.905 ± 0.0247, p < 0.01), and probabilistic patch-based fusion (0.909 ± 0.0249, p < 0.01). In the atlas ranking study, the proposed criterion based on conditional entropy yielded a performance curve with higher WHS Dice scores compared to the conventional schemes (p < 0.03). In the atlas database study, the authors showed that the MAS using larger atlas databases generated better performance curves than the MAS using smaller ones, indicating larger atlas databases could produce more accurate segmentation.Conclusions:
The authors have developed a new MAS framework for automatic WHS of CTA and investigated alternative implementations of MAS. With the proposed atlas ranking algorithm and joint label fusion, the MAS scheme is able to generate accurate segmentation within practically acceptable computation time. This method can be useful for the development of new clinical applications of cardiac CT.
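The conditional-entropy ranking criterion can be illustrated with a small sketch, assuming a simple joint-histogram estimate of H(target intensity | propagated label); the toy images below stand in for a target CT and two propagated atlas labelings:

```python
import numpy as np

def conditional_entropy(target, labels, bins=32):
    """H(target intensity | propagated atlas label), in bits. A lower value
    means the atlas labeling explains the target image better, so that
    atlas ranks higher for selection."""
    t = np.digitize(target.ravel(),
                    np.linspace(target.min(), target.max(), bins))
    l = labels.ravel().astype(int)
    joint = np.zeros((bins + 2, l.max() + 1))
    np.add.at(joint, (t, l), 1)
    joint /= joint.sum()
    p_label = joint.sum(axis=0)                       # marginal P(label)
    with np.errstate(divide="ignore", invalid="ignore"):
        cond = joint / p_label                        # P(intensity | label)
        terms = np.where(cond > 0, np.log2(cond), 0.0)
    return float(-np.sum(joint * terms))

# Toy target: two tissue classes split down the middle (plus mild noise).
rng = np.random.default_rng(0)
target = 0.05 * rng.standard_normal((32, 32))
target[:, 16:] += 1.0

aligned = np.zeros((32, 32), int); aligned[:, 16:] = 1   # good propagation
shifted = np.zeros((32, 32), int); shifted[:, 8:24] = 1  # poor propagation

ce_good = conditional_entropy(target, aligned)
ce_bad = conditional_entropy(target, shifted)
```

Atlases would be ranked by ascending conditional entropy, and the best-ranked subset passed on to joint label fusion.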
42(2015); http://dx.doi.org/10.1118/1.4921417. Purpose:
The automatic exposure control (AEC) modes of most full field digital mammography (FFDM) systems are set up to hold pixel value (PV) constant as breast thickness changes. This paper proposes an alternative AEC mode, set up to maintain some minimum detectability level, with the ultimate goal of improving object detectability at larger breast thicknesses.Methods:
The default “opdose” AEC mode of a Siemens MAMMOMAT Inspiration FFDM system was assessed using poly(methyl methacrylate) (PMMA) of thickness 20, 30, 40, 50, 60, and 70 mm to find the tube voltage and anode/filter combination programmed for each thickness; these beam quality settings were used for the modified AEC mode. Detectability index (d′), in terms of a non-prewhitened model observer with eye filter, was then calculated as a function of tube current-time product (mAs) for each thickness. A modified AEC could then be designed in which detectability never fell below some minimum setting for any thickness in the operating range. In this study, the value was chosen such that the system met the achievable threshold gold thickness (Tt) in the European guidelines for the 0.1 mm diameter disc (i.e., Tt ≤ 1.10 μm gold). The default and modified AEC modes were compared in terms of contrast-detail performance (Tt), calculated detectability (d′), signal-difference-to-noise ratio (SDNR), and mean glandular dose (MGD). The influence of a structured background on object detectability for both AEC modes was examined using a CIRS BR3D phantom. Computer-based CDMAM reading was used for the homogeneous case, while the images with the BR3D background were scored by human observers.Results:
The default opdose AEC mode maintained PV constant as PMMA thickness increased, leading to reductions of 39% in SDNR and 37% in d′ for the homogeneous background in going from 20 to 70 mm; introduction of the structured BR3D plate changed these figures to 22% (SDNR) and 6% (d′), respectively. Threshold gold thickness (0.1 mm diameter disc) for the default AEC mode in the homogeneous background increased by 62% in going from 20 to 70 mm PMMA thickness; in the structured background, the increase was 39%. Implementation of the modified mode entailed an increase in mAs at PMMA thicknesses >40 mm; the modified AEC held threshold gold thickness constant above 40 mm PMMA with a maximum deviation of 5% in the homogeneous background and 3% in the structured background. SDNR was also held constant, with a maximum deviation of 4% and 2% for the homogeneous and the structured background, respectively. These results were obtained with an increase in MGD of between 15% and 73% in going from 40 to 70 mm PMMA thickness.Conclusions:
This work has proposed and implemented a modified AEC mode, tailored toward constant detectability at larger breast thickness, i.e., above 40 mm PMMA equivalent. The desired improvement in object detectability could be obtained while maintaining MGD within the European guidelines achievable dose limit. (A study designed to verify the performance of the modified mode using more clinically realistic data is currently underway.)
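The detectability index driving the modified AEC can be sketched with the frequency-domain NPWE form, d′² = (Σ|ΔS|²E²)² / Σ|ΔS|²E⁴·NPS. The disc-task spectrum, eye-filter shape, and quantum-limited NPS model below are illustrative assumptions, not the paper's measured quantities:

```python
import numpy as np

def dprime_npwe(task2, eye, nps):
    """d' for a non-prewhitened observer with eye filter, computed from
    the squared task function |dS|^2, eye filter E, and noise power NPS,
    all sampled on the same spatial-frequency grid."""
    num = np.sum(task2 * eye ** 2) ** 2
    den = np.sum(task2 * eye ** 4 * nps)
    return np.sqrt(num / den)

f = np.linspace(0.01, 5.0, 200)              # spatial frequency, 1/mm
task2 = np.exp(-(f / 0.8) ** 2)              # illustrative |dS(f)|^2 of a small disc
eye = f ** 1.3 * np.exp(-0.7 * f ** 2)       # simple eye-filter model

def dprime_at_mas(mas):
    nps = (1.0 / mas) * np.ones_like(f)      # quantum-limited: NPS ~ 1/mAs
    return dprime_npwe(task2, eye, nps)

ratio = dprime_at_mas(100.0) / dprime_at_mas(50.0)   # ~ sqrt(2)
```

Under the quantum-limited noise assumption, d′ grows as the square root of mAs, which is why holding d′ constant at larger thicknesses requires the mAs increases reported above.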
A novel computer aided breast mass detection scheme based on morphological enhancement and SLIC superpixel segmentation. 42(2015); http://dx.doi.org/10.1118/1.4921612. Purpose:
To develop a computer-aided detection (CAD) scheme for mass detection on digitized mammograms that achieves a high sensitivity while maintaining a low false positive (FP) rate using morphological enhancement and simple linear iterative clustering (SLIC) method.Methods:
The authors developed a multiple stage method for breast mass detection. The proposed CAD scheme consists of five major components: (1) preprocessing based on morphological enhancement, which enhances mass-like patterns while removing unrelated background clutters, (2) segmentation of mass candidates based on the SLIC method, which groups mass and background tissue into different regions, (3) prescreening of suspicious regions using rule-based classification that eliminates regions unlikely to represent masses, (4) potential lesion contour refinement based on distance regularized level set evolution, and (5) FP reduction based on feature extraction and an ensemble of undersampled support vector machines. Two datasets were built to design and evaluate the system: a mass dataset containing 187 cases (386 mammograms) and a nonmass dataset containing 88 mammograms. All cases were acquired from the digital database for screening mammography (DDSM). Approximately two thirds of the available masses were used for training the system, and the remaining masses and nonmass dataset were used for testing.Results:
Testing of the proposed CAD system on the mass dataset yielded mass-based sensitivities of 98.55%, 97.10%, and 92.75% at 0.84, 0.63, and 0.55 FP marks/image, respectively. Tested on the nonmass dataset, the scheme showed FP rates of 0.55, 0.34, and 0.30 marks/image.Conclusions:
The results indicate that the system is promising in improving the performance of current CAD systems by reducing FP rate while achieving relatively high sensitivity.
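The morphological-enhancement preprocessing stage can be sketched with a white top-hat transform, a common choice for enhancing mass-like bright blobs while removing slowly varying background; the kernel size and synthetic image below are illustrative, not the paper's exact operator:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def grey_erode(img, k):
    p = np.pad(img, k // 2, mode="edge")
    return sliding_window_view(p, (k, k)).min(axis=(-1, -2))

def grey_dilate(img, k):
    p = np.pad(img, k // 2, mode="edge")
    return sliding_window_view(p, (k, k)).max(axis=(-1, -2))

def white_tophat(img, k=15):
    """Image minus its grey opening: keeps bright blobs smaller than the
    k x k structuring element, suppresses the smooth background."""
    return img - grey_dilate(grey_erode(img, k), k)

# Synthetic test image: a small bright blob on a linear background ramp.
y, x = np.mgrid[0:64, 0:64]
background = 0.5 * x / 64.0
blob = np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / (2 * 3.0 ** 2))
enhanced = white_tophat(background + blob)
```

The enhanced image retains the mass-like blob while the background ramp is flattened, which is the behavior the prescreening and SLIC stages rely on.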
42(2015); http://dx.doi.org/10.1118/1.4921734. Purpose:
To determine inter-related factors that contribute substantially to measurement error of pulmonary nodule measurements with CT by assessing a large-scale dataset of phantom scans and to quantitatively validate the repeatability and reproducibility of a subset containing nodules and CT acquisitions consistent with the Quantitative Imaging Biomarker Alliance (QIBA) metrology recommendations.Methods:
The dataset has about 40 000 volume measurements of 48 nodules (5–20 mm, four shapes, three radiodensities) estimated by a matched-filter estimator from CT images involving 72 imaging protocols. Technical assessment was performed under a framework suggested by QIBA, which aimed to minimize the inconsistency of terminologies and techniques used in the literature. Accuracy and precision of lung nodule volume measurements were examined by analyzing the linearity, bias, variance, root mean square error (RMSE), repeatability, reproducibility, and significant and substantial factors that contribute to the measurement error. Statistical methodologies including linear regression, analysis of variance, and restricted maximum likelihood were applied to estimate the aforementioned metrics. The analysis was performed on both the whole dataset and a subset meeting the criteria proposed in the QIBA Profile document.Results:
Strong linearity was observed for all data. Size, slice thickness × collimation, and randomness in attachment to vessels or chest wall were the main sources of measurement error. Grouping the data by nodule size and slice thickness × collimation, the standard deviation (3.9%–28%) and RMSE (4.4%–68%) tended to increase with smaller nodule size and larger slice thickness. For 5, 8, 10, and 20 mm nodules with reconstruction slice thickness ≤0.8, 3, 3, and 5 mm, respectively, the measurements were almost unbiased (−3.0% to 3.0%). Repeatability coefficients (RCs) were from 6.2% to 40%. A pitch of 0.9, a detail kernel, and smaller slice thicknesses yielded better (smaller) RCs than a pitch of 1.2, a medium kernel, and larger slice thicknesses. Exposure showed no impact on RC. The overall reproducibility coefficient (RDC) was 45%, decreasing to about 20%–30% when the slice thickness and collimation were fixed. For nodules and CT imaging complying with the QIBA Profile (QIBA Profile subset), the measurements were highly repeatable and reproducible in spite of variations in nodule characteristics and imaging protocols. The overall measurement error was small and mostly due to the randomness in attachment. The bias, standard deviation, and RMSE grouped by nodule size and slice thickness × collimation in the QIBA Profile subset were within ±3%, 4%, and 5%, respectively. RCs were within 11%, and the overall RDC was equal to 11%.Conclusions:
The authors have performed a comprehensive technical assessment of lung nodule volumetry with a matched-filter estimator from CT scans of synthetic nodules and identified the main sources of measurement error among various nodule characteristics and imaging parameters. The results confirm that the QIBA Profile set is highly repeatable and reproducible. These phantom study results can serve as a bound on the clinical performance achievable with volumetric CT measurements of pulmonary nodules.
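The repeatability coefficient used throughout this assessment can be sketched directly from replicate measurements; the cohort size and within-nodule spread below are hypothetical, not the study's data:

```python
import numpy as np

def repeatability_coefficient(errors):
    """QIBA-style repeatability coefficient: RC = 1.96 * sqrt(2) * wSD
    (about 2.77 * wSD), where wSD is the within-case standard deviation
    pooled over repeat measurements acquired under identical conditions."""
    within_var = errors.var(axis=1, ddof=1).mean()
    return 1.96 * np.sqrt(2.0) * np.sqrt(within_var)

# Illustrative replicate percent-error data: 48 nodules x 10 repeat scans,
# within-nodule SD of 4%.
rng = np.random.default_rng(0)
errors = rng.normal(loc=0.0, scale=4.0, size=(48, 10))
rc = repeatability_coefficient(errors)   # roughly 2.77 * 4 ~ 11%
```

Two repeat measurements of the same nodule under the same protocol are then expected to differ by less than RC about 95% of the time.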
42(2015); http://dx.doi.org/10.1118/1.4921805. Purpose:
The aim of this study is to generate spatially varying half value layers (HVLs) that can be used to construct virtual equivalent source models of computed tomography (CT) x-ray sources for use in Monte Carlo CT dose computations.Methods:
To measure the spatially varying HVLs, the authors combined a cylindrical HVL measurement technique with the characterization of bowtie filter relative attenuation (COBRA) geometry. An apparatus given the name “HVL Jig” was fabricated to accurately position a real-time dosimeter off-isocenter while surrounded by concentric cylindrical aluminum filters (CAFs). In this geometry, each projection of the rotating x-ray tube is filtered by an identical amount of high-purity (type 1100 H-14) aluminum while the stationary radiation dose probe records an air kerma rate versus time waveform. The CAFs were progressively nested to acquire exposure data at increasing filtrations to calculate the HVL. Using this dose waveform and known setup geometry, each timestamp was related to its corresponding fan angle. Data were acquired using axial CT protocols (i.e., rotating tube and stationary patient table) at energies of 80, 100, and 120 kVp on a single CT scanner. These measurements were validated against the more laborious conventional step-and-shoot approach (stationary x-ray tube).Results:
At each energy, HVL data points from the COBRA-cylinder technique were fit to a trendline and compared with the conventional approach. The average relative difference in HVL between the two techniques was 1.3%. There was a systematic overestimation in HVL due to scatter contamination.Conclusions:
The described method is a novel, rapid, accurate, and noninvasive approach that allows one to acquire the spatially varying fluence and HVL data using a single experimental setup in a minimum of three scans. These measurements can be used to characterize the CT beam in terms of the angle-dependent fluence and energy spectra along the bowtie filter direction, which can serve as input for accurate CT dose computations.
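The HVL computation underlying both the COBRA-cylinder and step-and-shoot techniques reduces to fitting exponential attenuation versus added aluminum; a minimal sketch with synthetic, illustrative kerma data:

```python
import numpy as np

def hvl_from_attenuation(t_mm, kerma):
    """Fit ln(kerma) = ln(K0) - mu * t and return HVL = ln(2) / mu.
    Assumes near-exponential attenuation over the measured range; beam
    hardening in a real polyenergetic beam makes the curve slightly
    concave, which is one reason scatter and spectrum effects matter."""
    slope, _ = np.polyfit(t_mm, np.log(kerma), 1)
    return np.log(2.0) / -slope

# Illustrative data: progressively nested aluminum thicknesses (mm) and the
# relative air kerma recorded at each step (synthetic beam, 2.8 mm Al HVL).
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
k = 100.0 * 0.5 ** (t / 2.8)
hvl = hvl_from_attenuation(t, k)      # ~2.8 mm Al
```

In the COBRA-cylinder geometry, this fit would be repeated per fan angle after mapping each waveform timestamp to its angle, yielding the spatially varying HVL along the bowtie direction.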
42(2015); http://dx.doi.org/10.1118/1.4922204. Purpose:
The authors analyze a recently proposed polyenergetic version of the simultaneous algebraic reconstruction technique (SART). This algorithm, denoted polyenergetic SART (pSART), replaces the monoenergetic forward projection operation used by SART with a postlog, polyenergetic forward projection, while leaving the rest of the algorithm unchanged. While the proposed algorithm provides good results empirically, convergence of the algorithm was not established mathematically in the original paper.Methods:
The authors analyze pSART as a nonlinear fixed point iteration by explicitly computing the Jacobian of the iteration. A necessary condition for convergence is that the spectral radius of the Jacobian, evaluated at the fixed point, is less than one. A short proof of convergence for SART is also provided as a basis for comparison.Results:
The authors show that the pSART algorithm is not guaranteed to converge, in general. The Jacobian of the iteration depends on several factors, including the system matrix and how one models the energy dependence of the linear attenuation coefficient. The authors provide a simple numerical example that shows that the spectral radius of the Jacobian matrix is not guaranteed to be less than one. A second set of numerical experiments using realistic CT system matrices, however, indicates that conditions for convergence are likely to be satisfied in practice.Conclusions:
Although pSART is not mathematically guaranteed to converge, the authors’ numerical experiments indicate that it will tend to converge at roughly the same rate as SART for system matrices of the type encountered in CT imaging. Thus, the authors conclude that the algorithm is still a useful method for reconstruction of polyenergetic CT data.
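The spectral-radius test described above can be reproduced numerically for plain (linear) SART; this sketch computes the iteration Jacobian for a random nonnegative system matrix, an illustration rather than the authors' CT matrices:

```python
import numpy as np

def sart_jacobian(A, lam=1.0):
    """Jacobian of the linear SART iteration
    x+ = x + lam * D A^T M (b - A x), i.e. G = I - lam * D A^T M A,
    where M = diag(1 / row sums) and D = diag(1 / column sums)."""
    M = np.diag(1.0 / A.sum(axis=1))
    D = np.diag(1.0 / A.sum(axis=0))
    return np.eye(A.shape[1]) - lam * D @ A.T @ M @ A

# Random nonnegative "system matrix" standing in for a CT projector.
rng = np.random.default_rng(1)
A = rng.uniform(0.1, 1.0, size=(30, 20))
rho = np.max(np.abs(np.linalg.eigvals(sart_jacobian(A))))   # at most 1
```

For nonnegative A, the matrix D AᵀM A is row stochastic and similar to a symmetric positive semidefinite matrix, so its eigenvalues lie in [0, 1] and the spectral radius of G stays at or below one. For pSART the forward projection is nonlinear, the Jacobian must be evaluated at the fixed point, and this bound need not hold, which is consistent with the counterexample reported above.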
42(2015); http://dx.doi.org/10.1118/1.4922203. Purpose:
This work aims to develop a robust and efficient method to track the fuzzy borders between the liver and abutting organs, where automatic liver segmentation usually suffers, and to investigate its applications in automatic liver segmentation on noncontrast-enhanced planning computed tomography (CT) images.Methods:
In order to track the fuzzy liver–chestwall and liver–heart borders where oversegmentation is often found, a starting point and an ending point were first identified on the coronal view images; the fuzzy border was then determined as a geodesic curve constructed by minimizing the gradient-weighted path length between these two points near the fuzzy border. The minimization of path length was numerically solved by fast-marching method. The resultant fuzzy borders were incorporated into the authors’ automatic segmentation scheme, in which the liver was initially estimated by a patient-specific adaptive thresholding and then refined by a geodesic active contour model. By using planning CT images of 15 liver patients treated with stereotactic body radiation therapy, the liver contours extracted by the proposed computerized scheme were compared with those manually delineated by a radiation oncologist.Results:
The proposed automatic liver segmentation method yielded an average Dice similarity coefficient of 0.930 ± 0.015, whereas it was 0.912 ± 0.020 if the fuzzy border tracking was not used. The application of fuzzy border tracking was found to significantly improve the segmentation performance. The mean liver volume obtained by the proposed method was 1727 cm3, whereas it was 1719 cm3 for manual-outlined volumes. The computer-generated liver volumes achieved excellent agreement with manual-outlined volumes with correlation coefficient of 0.98.Conclusions:
The proposed method was shown to provide accurate segmentation for liver in the planning CT images where contrast agent is not applied. The authors’ results also clearly demonstrated that the application of tracking the fuzzy borders could significantly reduce contour leakage during active contour evolution.
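The gradient-weighted geodesic between the two border points can be sketched with Dijkstra's algorithm on a pixel grid, a simple discrete stand-in for the fast-marching solver used in the paper; the cost map below is synthetic and illustrative:

```python
import heapq
import numpy as np

def min_cost_path(cost, start, end):
    """Dijkstra shortest path on a 4-connected grid; the edge weight is the
    cost of the pixel being entered, so low-cost pixels (low gradient
    magnitude in the geodesic formulation) attract the path."""
    h, w = cost.shape
    dist = np.full((h, w), np.inf)
    prev = {}
    dist[start] = 0.0
    pq = [(0.0, start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == end:
            break
        if d > dist[r, c]:
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and d + cost[nr, nc] < dist[nr, nc]:
                dist[nr, nc] = d + cost[nr, nc]
                prev[nr, nc] = (r, c)
                heapq.heappush(pq, (dist[nr, nc], (nr, nc)))
    path, node = [end], end
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]

# Synthetic gradient-magnitude map with a low-cost "border" along one column.
cost = np.full((20, 20), 5.0)
cost[:, 10] = 0.1
path = min_cost_path(cost, (0, 10), (19, 10))
```

The recovered path hugs the low-cost column, mimicking how the minimized gradient-weighted path length keeps the tracked curve on the fuzzy liver border.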
42(2015); http://dx.doi.org/10.1118/1.4922200. Purpose:
The goal of this paper is to address three problems in vessel extraction of the murine hindlimb. First, bone can hardly be separated from blood vessels because the intensity of contrast-enhanced blood vessels is similar to that of bone. Second, as an automatic blood vessel segmentation method, the vesselness method is sensitive to sharp boundaries, resulting in a false positive effect in nonvascular regions. Finally, thin blood vessels are often broken after segmentation because of the low signal-to-noise ratio.Methods:
The proposed automatic segmentation method for bone and blood vessels includes three important modules. (1) To eliminate the interference of bones with the segmentation of blood vessels, the authors employ the split Bregman method to segment the bones first. (2) The authors propose an edge extension strategy to cope with the false positive effect of the vesselness method on the sharp boundaries of the hindlimb after the removal of bones. Then, the authors segment the blood vessels using the vesselness method combined with multiscale bi-Gaussian filtering. (3) The authors reconnect the broken blood vessels after segmentation based on centerline and morphological dilation.Results:
Segmentation of bones from the murine hindlimbs was performed using the split Bregman, manual, and thresholding methods. Compared with the thresholding method, the split Bregman method finely segmented the bones from the blood vessels, with results comparable to those of manual segmentation. After removing bones, the vesselness method combined with bi-Gaussian filtering was applied with and without edge extension. The vesselness results with the edge extension strategy effectively eliminated the false positive effect on sharp boundaries in nonvascular regions. Some of the blood vessels segmented by thresholding from the vesselness results were disconnected. Thus, the authors employed the vascular connection method based on centerline and morphological dilation to connect the broken blood vessels. Compared with vascular connection utilizing the spatial-variant and spatial-invariant morphological closing methods, the proposed vascular connection method reconnected the broken blood vessels while leaving the nonbroken ones unchanged.Conclusions:
The proposed method is suitable for the segmentation of bones and blood vessels in murine hindlimbs. For the segmentation of bones, the split Bregman method improves the distinguishability between bones and blood vessels, since both the intensity information and the geometrical size are exploited. For the segmentation of blood vessels, the vesselness method with the edge extension strategy eliminates the false positive effect on nonvascular sharp boundaries. After segmentation, the proposed vascular connection method based on centerline and morphological dilation can reconnect the broken blood vessels without affecting the nonbroken ones.
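The Hessian-based vesselness at the heart of the pipeline can be sketched at a single scale; this is standard Frangi-style 2D vesselness, with illustrative parameter values, not the paper's multiscale bi-Gaussian variant:

```python
import numpy as np

def gaussian_smooth(img, sigma):
    """Separable Gaussian smoothing via two 1D convolutions."""
    r = int(3 * sigma)
    g = np.exp(-np.arange(-r, r + 1) ** 2 / (2.0 * sigma ** 2))
    g /= g.sum()
    sm = np.apply_along_axis(lambda m: np.convolve(m, g, mode="same"), 0, img)
    return np.apply_along_axis(lambda m: np.convolve(m, g, mode="same"), 1, sm)

def vesselness2d(img, sigma=2.0, beta=0.5, c=0.02):
    """Frangi-style vesselness from Hessian eigenvalues: large where one
    eigenvalue is strongly negative (bright ridge) and the other is small."""
    s = gaussian_smooth(img, sigma)
    gy, gx = np.gradient(s)
    hyy, _ = np.gradient(gy)
    hxy, hxx = np.gradient(gx)
    disc = np.sqrt(((hxx - hyy) / 2.0) ** 2 + hxy ** 2)
    mean = (hxx + hyy) / 2.0
    l1, l2 = mean + disc, mean - disc            # l1 >= l2 algebraically
    rb2 = (l1 / np.where(l2 == 0.0, 1e-12, l2)) ** 2
    s2 = l1 ** 2 + l2 ** 2
    v = np.exp(-rb2 / (2.0 * beta ** 2)) * (1.0 - np.exp(-s2 / (2.0 * c ** 2)))
    return np.where(l2 < 0.0, v, 0.0)            # keep bright-ridge response

# A thin bright line should respond strongly; flat background should not.
img = np.zeros((64, 64))
img[32, :] = 1.0
v = vesselness2d(img)
```

Because the response relies on second derivatives, sharp nonvascular boundaries also trigger it, which is exactly the false positive effect the edge extension strategy is designed to suppress.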
Technical Note: Relation between dual-energy subtraction of CT images for electron density calibration and virtual monochromatic imaging. 42(2015); http://dx.doi.org/10.1118/1.4921999. Purpose:
For accurate tissue inhomogeneity correction in radiotherapy treatment planning, the author previously proposed a simple conversion of the energy-subtracted computed tomography (CT) number to an electron density (ΔHU–ρe conversion), which provides a single linear relationship between ΔHU and ρe over a wide ρe range. The purpose of the present study was to reveal the relation between the ΔHU image for ρe calibration and a virtually monochromatic CT image by performing numerical analyses based on the basis material decomposition in dual-energy CT.Methods:
The author determined the weighting factor, α0, of the ΔHU–ρe conversion through numerical analyses of the International Commission on Radiation Units and Measurements Report-46 human body tissues using their attenuation coefficients and given ρe values. Another weighting factor, α(E), for synthesizing a virtual monochromatic CT image from high- and low-kV CT images, was also calculated in the energy range of 0.03 < E < 5 MeV, assuming that cortical bone and water were the basis materials. The mass attenuation coefficients for these materials were obtained using the XCOM photon cross sections database. The effective x-ray energies used to calculate the attenuation were chosen to imitate a dual-source CT scanner operated at 80–140 and 100–140 kV/Sn.Results:
The determined α0 values were 0.455 for 80–140 kV/Sn and 0.743 for 100–140 kV/Sn. These values coincided almost perfectly with the respective maximal points of the calculated α(E) curves located at approximately 1 MeV, in which the photon-matter interaction in human body tissues is exclusively the incoherent (Compton) scattering.Conclusions:
The ΔHU image could be regarded substantially as a CT image acquired with monoenergetic 1-MeV photons, which provides a linear relationship between CT numbers and electron densities.
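A minimal sketch of the ΔHU–ρe conversion. The weighted-subtraction form and unit slope/intercept below are stated as assumptions about the author's earlier linear conversion, not as its confirmed definition; only the α0 value for 100–140 kV/Sn is taken from the abstract:

```python
def delta_hu(hu_high, hu_low, alpha):
    """Energy-subtracted CT number, assumed here to take the weighted form
    dHU = (1 + alpha) * HU_high - alpha * HU_low."""
    return (1.0 + alpha) * hu_high - alpha * hu_low

def electron_density(d_hu):
    """Single linear dHU -> rho_e mapping (relative to water); the unit
    slope and intercept are illustrative calibration values."""
    return d_hu / 1000.0 + 1.0

# Sanity check for water: HU = 0 at both kV settings, so dHU = 0 and
# rho_e = 1 by construction.
alpha0 = 0.743                      # reported value for 100-140 kV/Sn
rho_e_water = electron_density(delta_hu(0.0, 0.0, alpha0))
```

Under this form, the ΔHU image behaves like a CT image at an effective energy near 1 MeV, where attenuation is essentially proportional to electron density, which is the relation the conclusion above describes.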