Index of content:
Volume 41, Issue 5, May 2014
A potential side effect of inline MRI-linac systems is electron contamination focusing, which causes a high skin dose. In this work, the authors reexamine this prediction for an open-bore 1 T MRI system being constructed for the Australian MRI-Linac Program. The efficiency of an electron contamination deflector (ECD) in purging electron contamination from the linac head is modeled, as well as the impact of a helium gas region between the deflector and phantom surface for lowering the amount of air-generated contamination.
Methods:
Magnetic modeling of the 1 T MRI was used to generate 3D magnetic field maps both with and without the presence of an ECD located immediately below the MLCs. Forty-seven different ECD designs were modeled, and for each the magnetic field map was imported into Geant4 Monte Carlo simulations including the linac head, ECD, and a 30 × 30 × 30 cm3 water phantom located at isocenter. For the first-generation system, the x-ray source to isocenter distance (SID) will be 160 cm, resulting in an 81.2 cm long air gap from the base of the ECD to the phantom surface. The first 71.2 cm was modeled as air or helium gas, with the latter encased between two windows of 50 μm thick high-density polyethylene. 2D skin doses (at 70 μm depth) were calculated across the phantom surface at 1 × 1 mm2 resolution for 6 MV beams of field sizes 5 × 5, 10 × 10, and 20 × 20 cm2.
Results:
The skin dose was predicted to be of similar magnitude to that of the generic systems modeled in previous work: 230% to 1400% for 5 × 5 to 20 × 20 cm2 fields, respectively. Inclusion of the ECD introduced a nonuniformity to the MRI imaging field that ranged from ∼20 to ∼140 ppm, while the net force acting on the ECD ranged from ∼151 N to ∼1773 N. Various ECD designs were 100% efficient at purging the electron contamination into the ECD magnet banks; however, a small fraction was scattered back into the beam and continued to the phantom surface. Replacing a large portion of the extended air column between the ECD and phantom surface with helium gas is a key element, as it significantly minimized the air-generated contamination. When using an optimal ECD and helium gas region, the 70 μm skin dose is predicted to increase moderately inside a small hot spot over the case with no magnetic field present for the jaw-defined square beams examined here. These increases range from 12% to 40% for 5 × 5 cm2, from 18% to 55% for 10 × 10 cm2, and from 23% to 65% for 20 × 20 cm2.
Conclusions:
Coupling an efficient ECD and helium gas region below the MLCs in the 160 cm isocenter MRI-linac system is predicted to ameliorate the impact of electron contamination focusing on skin dose. An ECD is practical, as its impact on MRI image distortion is correctable and the mechanical forces acting on it are manageable from an engineering point of view.
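The physics behind the deflector can be illustrated with a back-of-envelope estimate (not part of the paper's Geant4 model): a transverse magnetic field bends a contamination electron on its Larmor radius r = p/(qB), and at ~1 T this radius is only a few millimetres for MeV-scale electrons, so a short deflector region suffices to sweep them out of the beam. The sketch below assumes a simple uniform-field picture and a hypothetical helper name:

```python
import math

# Physical constants (SI / MeV units)
E_MASS_MEV = 0.511            # electron rest energy, MeV
MEV_TO_J = 1.602176634e-13    # MeV -> joules
Q_E = 1.602176634e-19         # elementary charge, C
C = 2.99792458e8              # speed of light, m/s

def gyroradius_m(kinetic_mev: float, b_tesla: float) -> float:
    """Larmor radius r = p/(qB) of an electron with the given
    kinetic energy in a uniform transverse field (toy estimate)."""
    total = kinetic_mev + E_MASS_MEV                 # total energy, MeV
    pc_mev = math.sqrt(total**2 - E_MASS_MEV**2)     # momentum * c, MeV
    p = pc_mev * MEV_TO_J / C                        # momentum, kg m/s
    return p / (Q_E * b_tesla)

# A ~1 MeV contamination electron in a 1 T transverse field is bent
# on a radius of roughly half a centimetre.
r = gyroradius_m(1.0, 1.0)
print(f"gyroradius at 1 MeV, 1 T: {r * 1e3:.1f} mm")
```

This is only an order-of-magnitude check on why purging is feasible; the actual deflection efficiency in the paper depends on the full 3D field map and scatter modeled in Geant4.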
- VISION 20/20
Vision 20/20: The role of Raman spectroscopy in early stage cancer detection and feasibility for application in radiation therapy response assessment. 41 (2014); http://dx.doi.org/10.1118/1.4870981
Raman spectroscopy is an optical technique capable of identifying the chemical constituents of a sample by their unique set of molecular vibrations. Research on the applicability of Raman spectroscopy to differentiating cancerous from normal tissues has been ongoing for many years and has yielded successful results in the context of prostate, breast, brain, skin, and head and neck cancers as well as pediatric tumors. Recently, much effort has been invested in developing noninvasive "Raman" probes to provide real-time diagnosis of potentially cancerous tumors. In this regard, it is feasible that the Raman technique might one day be used to provide rapid, minimally invasive, real-time diagnosis of tumors in patients. Raman spectroscopy is relatively new to the field of radiation therapy. Recent work involving cell lines has shown that the Raman technique is able to identify proteins and other markers affected by radiation therapy. Although this work is preliminary, one could ask whether the Raman technique might be used to identify molecular markers that predict radiation response. This paper provides a brief review of Raman spectroscopic investigations in cancer detection, the benefits and limitations of the method, advances in instrument development, and preliminary studies related to the application of this technology in radiation therapy response assessment.
41 (2014); http://dx.doi.org/10.1118/1.4871620
Due to rapid advances in radiation therapy (RT), especially image guidance and treatment adaptation, fast and accurate segmentation of medical images is a very important part of treatment. Manual delineation of target volumes and organs at risk is still the standard routine in most clinics, even though it is time consuming and prone to intra- and interobserver variations. Automated segmentation methods seek to reduce delineation workload and unify organ boundary definition. In this paper, the authors review the current autosegmentation methods particularly relevant for applications in RT. The authors outline the methods' strengths and limitations and propose strategies that could lead to wider acceptance of autosegmentation in routine clinical practice. The authors conclude that, currently, autosegmentation technology in RT planning is an efficient tool providing clinicians with a good starting point for review and adjustment. Modern hardware platforms, including GPUs, allow most autosegmentation tasks to be completed within a few minutes. In the near future, improvements in CT-based autosegmentation tools will be achieved through standardization of imaging and contouring protocols. In the longer term, the authors expect wider use of multimodality approaches and better understanding of the correlation of imaging with biology and pathology.