Volume 34, Issue 6, June 2007
Index of content:
- Imaging Symposium: Room L100F
- Advances in X‐ray Imaging
34(2007); http://dx.doi.org/10.1118/1.2761220
Since Roentgen discovered x‐rays and performed the first x‐ray imaging over 100 years ago, medical x‐ray imaging has been based on differences in x‐ray attenuation between biological tissues. However, x‐ray–tissue interaction causes x‐ray phase shifts as well, and the phase‐shift differences between tissues are about one thousand times larger than their attenuation differences. Phase‐sensitive x‐ray imaging therefore allows not only visualization of tissues with very low attenuation contrast, but also quantification of a tissue's projected electron density by means of phase retrieval.
In this presentation we first introduce the concept of spatial coherence of the x‐ray wavefield and elucidate the mechanism of x‐ray phase‐contrast formation. We then review recent progress in in‐line phase‐sensitive x‐ray imaging.
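For context, the scale of the phase effect quoted above can be made concrete with the standard complex refractive-index decomposition for x‐rays (a textbook relation, not taken from this abstract):

```latex
n(\lambda) = 1 - \delta + i\beta, \qquad
\varphi = -\frac{2\pi}{\lambda}\int \delta \, dz, \qquad
\mu = \frac{4\pi}{\lambda}\,\beta ,
```

where $\varphi$ is the accumulated phase shift along the beam path and $\mu$ is the linear attenuation coefficient. For soft tissue at diagnostic x‐ray energies, $\delta$ is commonly on the order of $10^{-7}$–$10^{-6}$ while $\beta$ is on the order of $10^{-10}$–$10^{-9}$, so $\delta/\beta \sim 10^{3}$, consistent with the "one thousand times" figure above.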
1. Elucidate the mechanism of x‐ray phase‐contrast formation.
2. Review recent progress in the in‐line phase‐sensitive x‐ray imaging.
34(2007); http://dx.doi.org/10.1118/1.2761221
Purpose: To develop a new field emission x‐ray source with multi‐pixel and multiplexing capability for high‐speed computed tomography. Method and Materials: The new x‐ray source utilizes carbon nanotubes (CNTs) as a "cold cathode" to replace the thermionic cathode used in the conventional x‐ray tube. The x‐ray tube current is generated by applying an external electric field to extract electrons from the carbon nanotubes. By controlling the triggering signal, x‐ray radiation with programmable waveforms can readily be generated. One‐ and two‐dimensionally pixelated x‐ray sources with spatially distributed focal spots are constructed using a matrix‐addressable multi‐pixel CNT cathode. The electronics for controlling and integrating the x‐ray source have been demonstrated. Results: A new x‐ray source based on CNT field emission technology has been developed. The x‐ray source is capable of generating the flux, energy, and spatial resolution required for medical imaging. A micro‐focus x‐ray source with a 30–50 μm isotropic focal spot size has also been developed. A micro‐CT scanner using the field emission x‐ray source was constructed for imaging small animal models with respiratory and cardiac gating capability. Results from preliminary studies will be discussed. Conclusion: The CNT‐based field emission x‐ray source offers unique capabilities that are attractive for high‐speed CT imaging, such as spatially distributed x‐ray pixels. It enables collection of multiple projection images from different viewing angles, either sequentially or simultaneously, without mechanical motion, which can potentially lead to stationary, gantry‐free CT scanners for high‐resolution and high‐speed imaging.
34(2007); http://dx.doi.org/10.1118/1.2761222
Digital tomosynthesis is one of the most exciting recent developments in breast imaging. By modifying existing full‐field digital mammography systems, one can achieve this type of limited‐angle cone‐beam CT imaging, which produces 3D slice images of the breast. Overlapping dense tissue in mammography is one of the most common causes of unnecessary callbacks as well as missed cancers. Since the 3D images remove such overlapping tissue, breast tomosynthesis can improve radiologists' specificity by obviating unnecessary callbacks. It can also improve sensitivity by allowing easier detection and characterization of breast cancers which might otherwise be obscured. Most remarkably, tomosynthesis can achieve all this with a scan comparable in speed, resolution, cost, and dose to conventional mammography. For these reasons, tomosynthesis stands poised as the only imaging technique with the potential to completely replace mammography in its current role as the primary tool in breast cancer screening and diagnosis.
This presentation will cover both the hype and hope surrounding breast tomosynthesis. From a medical physics perspective, the latest results will be reviewed from recent studies to optimize radiographic techniques, acquisition modes, and reconstruction algorithms. In addition, emerging results will be surveyed from advanced applications including display/visualization, computer aided detection, and contrast enhanced tomosynthesis. Finally, the clinical promise and risks of this new technology will be discussed using initial clinical trial results.
This research was supported in part by research grants from Siemens Medical Solutions, US Army Breast Cancer Research Program, and NIH/NCI.
1. Understand the difference between breast tomosynthesis and dedicated breast CT.
2. Appreciate the many medical physics issues involved in the development and optimization of breast tomosynthesis.
3. Understand the clinical promise and concerns of using breast tomosynthesis.
34(2007); http://dx.doi.org/10.1118/1.2761224
While the vast majority of x‐ray imaging procedures use a single x‐ray source, a growing number require measurement of x‐ray transmission from different locations. These exams are generally performed by moving a single x‐ray source to many positions; examples include computed tomography (CT) and tomosynthesis. At the same time, several technologies are now becoming available that enable the construction of source arrays, i.e., a large number of sources that can be turned on in sequence. The purpose of this presentation is to explore the capabilities and benefits of imaging with these source arrays.
In CT, small deflections of the x‐ray focal spot (on the order of a millimeter) have been used to improve data sampling. Multiple sources or much larger deflections (on the order of centimeters) in the axial direction can be quite useful for increasing the thickness of the imaging slab in a single scan while reducing cone beam artifacts. Distributing sources in the circumferential direction can dramatically reduce the mechanical rotation needed to obtain a complete data set and therefore can shorten the scan time. The recently introduced dual‐source CT system is the latest step in this direction. The limiting case is CT scanning with no mechanical rotation, as in electron beam CT. An array of sources separated in the circumferential direction can also be used to reduce the size of the detector array. This can lead to a reduction in detected x‐ray scatter and perhaps to the adoption of more advanced detector technologies. It also provides a reasonable method for optimization of the incident intensity distribution, thereby potentially reducing the dose to the patient while also tailoring the distribution of quantum noise across the image.
In tomosynthesis, as in CT, measurements from different directions are used to produce tomographic information, albeit with imperfect spatial separation due to data insufficiency. Nonetheless, tomosynthesis can be powerful in many settings where obtaining complete CT data is difficult, and it is gaining popularity. Data acquisition with mechanical motion of a single source is limiting both in the needed imaging time and in the number of projections that can be obtained. A small number of projections makes the blurring function (the "crosstalk" from one location to another caused by imperfect spatial separation) very discrete and potentially distracting to the viewer. Source arrays can be used to produce tomosynthesis images with very short imaging times and smooth blurring functions, both of which are preferred.
Thus, there are significant benefits from the use of distributed sources. The author anticipates that these source arrays will become more widely adopted.
Research sponsored by GE Healthcare.
1. Become familiar with the types of source arrays and their capabilities.
2. Understand the benefits of source arrays in computed tomography.
3. Understand the benefits of source arrays in tomosynthesis.
- Current Status and Future Development in Breast Ultrasound Imaging
34(2007); http://dx.doi.org/10.1118/1.2761264
Methods for imaging the elastic properties of soft tissues have undergone rapid development in recent years. This is in part facilitated by the ever‐increasing capabilities of ultrasound imaging systems. This presentation will describe the basic mechanics of soft tissues and the terminology used in elasticity imaging (e.g., stress, strain, elastic modulus). Key developments in motion tracking for elasticity imaging will be reviewed, with emphasis on the current state of the art in motion tracking algorithms for accurate estimation of displacement fields. That discussion sets the stage for an analysis of observations of soft tissue mechanics as seen via real‐time elasticity (mechanical strain) images of breast tissues during relatively large deformations. For example, many benign lesions tend to lose contrast in strain images as deformation increases. That behavior can be understood given the measured mechanical properties of in vitro tissue samples. Unlike benign lesions, most cancerous tumors tend to maintain a large negative contrast with increasing deformation, and that too can be understood from in vitro measurements. The utility of this information is summarized with a review of recent clinical trials of breast elasticity imaging. Specifically, a recent multi‐institutional, multi‐observer study has demonstrated that elasticity imaging increases the diagnostic confidence of breast ultrasound radiologists when attempting to classify a tumor as either "benign" or "malignant". The presentation will conclude with a brief discussion of the prospects for future enhancements and improvements in elasticity image formation and information content.
1. Understand the basic vocabulary used to describe simple solid mechanics.
2. Understand the methods used to form real‐time elasticity images using ultrasound and the potential for significant improvements in the future.
3. Understand the basic criteria for interpreting breast elasticity images.
34(2007); http://dx.doi.org/10.1118/1.2761265
In this paper, we describe the principles of an imaging technique that produces a map of the mechanical response of an object to a force applied at each point. The method uses ultrasound radiation force to remotely exert a localized oscillating stress field at a desired frequency within (or on the surface of) an object. Harmonic radiation force is produced by mixing two ultrasound beams of different frequencies at their focal point. The resulting radiation force occurs at the difference and the sum of the two frequencies. In response to this force, a part of the object vibrates. The size of this part and the motion pattern depend on the object's viscoelastic and reflection characteristics. The acoustic field resulting from object vibration at the difference frequency, which we refer to as "acoustic emission," is detected by a sensitive hydrophone and used to form an image of the object representing the magnitude, phase, or frequency content of the signal at each point over a raster‐scanned region. This method benefits from the high spatial definition of ultrasound radiation force and the high motion‐detection sensitivity offered by the hydrophone. The images have no speckle and are of high contrast. We call this technique ultrasound‐stimulated vibroacoustography (USVA). The method has been applied to imaging lesions in the breast, detection of calcification in vessels, detection of brachytherapy seeds in the prostate, and modal vibration of vessels.
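The frequency-mixing principle described above can be checked numerically. The sketch below (illustrative only, with arbitrarily chosen frequencies, not the authors' experimental parameters) squares the sum of two tones, since the radiation force tracks the beam intensity ~ p², and confirms that a component appears at the difference frequency, alongside DC and components near 2f1, 2f2, and f1 + f2:

```python
import numpy as np

# Two "beams" at nearby ultrasound frequencies (hypothetical values).
fs = 1_000_000                              # sample rate, Hz
t = np.arange(0, 0.01, 1 / fs)              # 10 ms of signal
f1, f2 = 100_000.0, 105_000.0               # beam frequencies, Hz
p = np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)

# Radiation force is proportional to intensity, i.e. to p squared.
force = p ** 2

# Locate the strongest spectral line well below the ultrasound band;
# it should sit at the difference frequency f2 - f1 = 5 kHz.
spec = np.abs(np.fft.rfft(force))
freqs = np.fft.rfftfreq(len(force), 1 / fs)
low = (freqs > 1_000) & (freqs < 50_000)    # exclude DC and the MHz-range terms
beat = freqs[low][np.argmax(spec[low])]
```

The low-frequency line at `beat` is the "acoustic emission" frequency that the hydrophone would detect in the technique described above.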
1. Understand radiation pressure.
2. Understand vibro‐acoustography.
3. Understand possible applications of vibro‐acoustography in Medical Imaging.
34(2007); http://dx.doi.org/10.1118/1.2761267
Several ultrasonic parameters are known to be temperature dependent, including the speed of sound, attenuation, and the nonlinearity parameter. Local changes in tissue temperature can be estimated noninvasively using diagnostic pulse‐echo ultrasound. A number of temperature estimation methods have been proposed since the early 1990s with varying levels of success. One method investigated by numerous groups is based on minute changes in echo locations due to temperature‐dependent changes in the speed of sound and thermal expansion. Both in vitro and in vivo results have been obtained to demonstrate the feasibility of the method. However, several limitations restrict the use of this method in practice. In this paper, we describe a true two‐dimensional displacement tracking algorithm with robust temperature estimation from real‐time 2D pulse‐echo ultrasound. The method employs a physics‐based Kalman filter derived from the transient bioheat transfer equation (BHTE). The filter is shown to be effective in rejecting displacement artifacts from tissue motion as well as those resulting from thermal lens effects.
This lecture will provide a description of the BHTE and the system approach leading to the design of the Kalman filter for tracking the temperature data based on the observed displacements in echo location in the region of interest. The tracking properties of the filter and its ability to reject artifacts from the observed displacements will be illustrated. One‐dimensional and two‐dimensional versions of the filter will be presented and contrasted in terms of their ability to reject displacement artifacts. Comparisons with previously published algorithms will be given using in vivo and in vitro data.
We will also present results on using the Kalman filter methodology to track changes in tissue absorption associated with lesion formation using high‐intensity focused ultrasound (HIFU). This technique may lead to quantitative parameter imaging for monitoring and control of HIFU treatments under ultrasound image guidance.
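To illustrate the general idea of a physics-based Kalman filter for temperature tracking (a minimal one-dimensional sketch, not the authors' implementation; the state model is a lumped first-order approximation of the BHTE, and all parameter values are invented for the demo):

```python
import numpy as np

def kalman_track(z, a=0.99, b=0.02, q_heat=1.0, h=1.0,
                 proc_var=1e-4, meas_var=0.25):
    """Scalar Kalman filter for temperature rise T at one location.

    State model (lumped BHTE, discretized): T[k+1] = a*T[k] + b*q_heat,
    i.e. perfusion/conduction decay plus a constant heating term.
    Measurements z[k] are echo-shift-derived temperature estimates,
    modeled as z = h*T + noise.
    """
    T_est, P = 0.0, 1.0                      # initial estimate and covariance
    out = []
    for zk in z:
        # predict with the bioheat-based state model
        T_pred = a * T_est + b * q_heat
        P_pred = a * a * P + proc_var
        # update with the displacement-derived measurement
        K = P_pred * h / (h * h * P_pred + meas_var)   # Kalman gain
        T_est = T_pred + K * (zk - h * T_pred)
        P = (1.0 - K * h) * P_pred
        out.append(T_est)
    return np.array(out)

# Synthetic demo: truth follows the same model; measurements are noisy.
rng = np.random.default_rng(0)
true_T = np.empty(200)
temp = 0.0
for k in range(200):
    temp = 0.99 * temp + 0.02 * 1.0
    true_T[k] = temp
z = true_T + rng.normal(0.0, 0.5, size=200)
est = kalman_track(z)
```

In this idealized setting the filtered estimate tracks the true temperature far more closely than the raw measurements do; the artifact-rejection behavior discussed in the lecture comes from exactly this mismatch penalty between the BHTE prediction and anomalous displacement observations.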
Educational Objectives:
1. Understand the origin of the temperature dependence of relevant acoustic parameters and their effects on observed pulse‐echo data.
2. Understand the limitations of noninvasive temperature estimation using pulse‐echo data.
3. Understand the connection between the physical model for temperature evolution and robust temperature and parameter tracking in tissue media.
- Breast Imaging: Updates on Technology Beyond Mammography
34(2007); http://dx.doi.org/10.1118/1.2761331
Screen‐film mammography will gradually give way to digital mammography systems, which have been shown in the DMIST trial to have slightly better performance for women with dense breasts. New developments in breast imaging technology have advanced beyond mammography towards potentially interesting and potentially better technologies. In this presentation, some of the basic science foundations for these advanced technology systems will be discussed. Mimicking the changes which are occurring in general radiology practice, where tomographic imaging modalities such as computed tomography (CT) are replacing more traditional projection radiographic techniques, tomographic methods for breast imaging are also becoming practical with advancements in digital detector systems and computer‐based algorithms. The presentations in this symposium will highlight some of the advancements which have taken place in these and other areas.
A number of groups around the country have been developing computed tomography systems for breast imaging based upon flat‐panel detectors, using cone‐beam CT of the pendant breast. While demonstration of the comparative performance of breast CT versus digital mammography is perhaps several years in the future, clinical images acquired during Phase II breast CT trials may shed light on the ultimate performance of breast CT in relationship to digital mammography. One of the central motivations for tomographic imaging of the breast is that the anatomical noise due to the normal breast parenchyma is reduced, owing to the tomographic nature of breast CT. Although the noise power spectrum (NPS) is typically used to characterize the quantum and electronic noise characteristics of an imaging system, this metric can also be used to evaluate the anatomical noise characteristics of imaging systems. Towards this end, a Phase II clinical trial on breast CT was used to acquire projection imaging data (mammographic equivalent) as well as reconstructed CT images of the breast. NPS analysis was used to characterize the frequency‐dependent anatomical noise in these different imaging modalities. Consistent with the work of Burgess, it was found that mammographic projection image data have a noise power spectrum that obeys a power law, NPS(f) = α · f^(−β), where β is approximately 2.8. Theoretical predictions suggest that breast CT images would exhibit a similar power‐law relationship for the anatomical noise in the breast, although the value of β would be reduced by 1 (β′ = β − 1). Using a dataset of images from 40 patients undergoing breast CT, the NPS was calculated on projection breast images ("mammograms") as well as the reconstructed breast CT images, and the results indicate that the reduction in the value of β is on the order of 1.3 (β′ = β − 1.3).
These data will be presented, and, following Burgess' work, it is argued that breast CT images may have better detection performance than digital mammography, based upon these assessments of anatomical noise.
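The power-law fit underlying these β values is typically a straight-line fit in log-log coordinates. The sketch below (illustrative only, using synthetic 1D noise rather than mammographic data) generates noise with a known exponent β = 2.8, the value quoted above for mammographic anatomical noise, and recovers it from the periodogram:

```python
import numpy as np

def estimate_beta(signal, f_lo=0.01, f_hi=0.4):
    """Fit NPS(f) ~ alpha * f**(-beta) by linear regression of log NPS on log f."""
    n = len(signal)
    nps = np.abs(np.fft.rfft(signal)) ** 2 / n      # periodogram estimate of NPS
    f = np.fft.rfftfreq(n)                          # frequency in cycles/sample
    band = (f >= f_lo) & (f <= f_hi)                # avoid DC and the Nyquist edge
    slope, _ = np.polyfit(np.log(f[band]), np.log(nps[band]), 1)
    return -slope                                   # slope of log NPS vs log f is -beta

def powerlaw_noise(n, beta, rng):
    """Synthesize noise whose power spectrum falls as f**(-beta)."""
    f = np.fft.rfftfreq(n)
    amp = np.zeros_like(f)
    amp[1:] = f[1:] ** (-beta / 2.0)                # amplitude ~ f**(-beta/2)
    phase = rng.uniform(0, 2 * np.pi, size=f.size)  # random phases
    return np.fft.irfft(amp * np.exp(1j * phase), n)

rng = np.random.default_rng(1)
x = powerlaw_noise(1 << 16, beta=2.8, rng=rng)
beta_hat = estimate_beta(x)
```

In the clinical analysis described above the same regression idea is applied to radially averaged 2D spectra of breast image regions, from which the β ≈ 2.8 (projection) and β′ ≈ β − 1.3 (reconstructed CT) figures are obtained.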
One of the limitations of mammography as well as breast CT is that these modalities image the anatomical nature of the breast; breast cancer lesions, therefore, must be identified based upon their morphologic characteristics. With the advent of molecular medicine techniques, positron emission tomography (PET) of the breast is being explored at a number of institutions. A PET/CT system designed specifically for imaging the breast was evaluated in a number of patients. A high‐resolution PET detector system was fabricated and integrated into an existing breast CT scanner. In this preliminary investigation, a number of patients were imaged using the PET/CT system for the breast. A description of the technology development, as well as an exposé of the hybrid image datasets, will be presented. While these data are too preliminary to make statements about the ultimate clinical utility of PET/CT of the breast, the incorporation of molecular imaging techniques for breast cancer evaluation and staging represents a significant step towards the development of more sensitive and patient‐specific imaging protocols.
The incorporation of computer‐aided diagnosis (CAD) systems into digital mammography systems has been a significant advance in the use of image processing to augment the detection performance of radiologists. While recent publications have described the limitations of CAD routines in the hands of radiologists who may not specialize in breast imaging, there is a general consensus in the breast imaging community that CAD has a role to play and will most likely have an increasingly important role in breast cancer detection. The development of CAD routines, with a description of past and present performance levels, will be described. While the use of CAD in projection digital mammography is most common, researchers in this field are developing advanced CAD algorithms for the detection of breast cancer in 3‐dimensional tomographic data of the breast, including magnetic resonance imaging (MRI) and breast CT. The development of CAD routines in projection mammography as well as tomographic breast imaging systems will be described.
Breast imaging is at an interesting crossroads, where the advancements in technology in other areas of human imaging are being applied to breast cancer screening, diagnosis, and staging. In this symposium, a description of the potential advancements based upon these new technologies will be presented by leaders in their respective fields. While there is some degree of speculation in these research‐oriented presentations, this symposium may well chronicle the advancements of breast CT imaging and diagnostic systems 10 years into the future.
- Molecular Imaging: Biomarkers
34(2007); http://dx.doi.org/10.1118/1.2761374
The application of MR imaging biomarkers in pre‐clinical studies as well as Phase I/II clinical trials has greatly expanded over the past few years. Such MR biomarkers include measures obtained from dynamic contrast agent enhanced MRI (DCE‐MRI), dynamic susceptibility change MRI (DSC‐MRI), diffusion MRI, blood oxygen level dependent (BOLD) MRI, and in vivo spectroscopy. The symposium session will introduce each of these techniques and provide representative pre‐clinical and clinical trial results using each technique. In addition, more novel applications, such as MR molecular imaging techniques, will be introduced. Finally, a summary of unmet needs of MR imaging biomarkers, specifically the need for standardization, quality control, and change assessment criteria, will be presented. These needs must be addressed in order for MR imaging biomarkers to gain broad acceptance.
1. To understand the physical basis, applications, and current limitations of DCE‐MRI, DSC‐MRI, diffusion MRI, BOLD MRI, and MR spectroscopy techniques currently employed to obtain non‐invasive MR biomarkers.
2. To understand the need for standardization, quality control, and change assessment tools/criteria development.
34(2007); http://dx.doi.org/10.1118/1.2761375
Recently, the role of PET and PET/CT imaging in oncology has grown dramatically. The following characteristics of PET make it an ideal imaging modality for providing functional information that can be incorporated into radiation treatment planning:
1. Ability to image nanomolar tracer concentrations.
2. Excellent tissue penetration.
3. Ability to use tracers that are chemically analogous to their naturally occurring counterparts.
However, while the general paradigm of biologically‐conformal radiation therapy was proposed long ago, we are still far from its implementation in routine clinical radiation treatment planning. Even in the case of such a well‐established tracer as 18FDG, we are still not certain how to incorporate the information content of an FDG PET image into a radiation treatment plan. While multiple approaches to FDG PET image segmentation designed to allow for tumor delineation have been proposed, all of them fail to take into account the following factors:
1. Complexity of the tumor boundary (even assuming that the boundary does exist).
2. Highly heterogeneous morphology of the lesion.
3. Variability of intratumoral FDG uptake even in homogeneous animal tumor models.
We believe that in order to facilitate further incorporation of PET imaging in radiation treatment planning, the following developments have to take place:
1. Implementation of higher standards of tracer validation. It is not sufficient to show that a tracer designed to image a certain function, like cell division, or an environmental parameter, like hypoxia, is characterized by high tumor uptake. Instead, it is necessary to demonstrate in vivo that the tracer is in fact binding to the desired target. This can be done by performing carefully designed validation studies utilizing animal tumor models and patient tumor tissue specimens.
2. Discrete approaches to PET image segmentation (tumor vs. normal tissue, hypoxia vs. normoxia) have to be dropped in favor of probabilistic approaches. For example, a gradual change of FDG uptake from a low level in normal tissue to a high level in the lesion has to be interpreted as a gradual change in the probability of finding a tumor cell, rather than used to arbitrarily assign the location of a step‐like target boundary.
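The probabilistic view in point 2 can be sketched in a few lines. Here a smooth logistic curve maps each voxel's uptake to a tumor probability instead of a hard threshold; the midpoint and width parameters are hypothetical and would, per point 1, have to come from tracer-validation studies:

```python
import numpy as np

def tumor_probability(suv, suv_mid=4.0, width=1.0):
    """Map a standardized uptake value (SUV) to a probability of tumor presence.

    A logistic replacement for a step-like threshold at suv_mid: probability
    rises gradually over a transition zone of scale `width` (both values are
    illustrative, not validated parameters).
    """
    return 1.0 / (1.0 + np.exp(-(suv - suv_mid) / width))

# A 1D uptake profile rising from normal-tissue to lesion levels: the hard
# boundary disappears, replaced by a smooth probability gradient.
profile = np.linspace(1.0, 8.0, 15)
p = tumor_probability(profile)
```

A treatment planner could then prescribe dose as a function of this probability map rather than to a binary contour, which is the essence of the biologically-conformal approach discussed here.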
The overall goal of this presentation is to provide a short overview of the role of PET in radiation therapy treatment planning and to outline some of the research directions that should allow for the development of PET‐based biologically‐conformal radiation therapy.
34(2007); http://dx.doi.org/10.1118/1.2761376
The metabolic information provided by PET and SPECT has great potential for diagnosing and staging disease, customizing treatment doses for a particular patient, and tracking a patient's response to treatment. However, quantifying the information in these images remains challenging. This presentation will review the sources of variability in the data, grouped into three broad categories: patient‐related factors (e.g., body habitus, medications); scanner‐related factors (spatial and energy resolution, sensitivity, data acquisition mode (2D or 3D), attenuation and scatter correction method, image reconstruction algorithm, respiratory and cardiac motion‐correction method); and operator‐related factors such as acquisition and reconstruction protocols, instrument and image quality control, instrument calibrations, and method of image analysis. The impact of these variables on quantification will be discussed, along with methods for minimizing those that are controllable.
The presentation will conclude with a review of current efforts by government agencies, professional organizations, academic institutions, and sponsors of multicenter trials to grapple with the additional complexities that arise from combining data from multiple patients at multiple sites. Particular emphasis will be placed on the author's experience as a member of the PET Quality Assurance Committee at the American College of Radiology Imaging Network (ACRIN) PET Core Lab, which has credentialed more than 100 PET scanners for participation in quantitative PET multicenter trials.
Research partially supported by ACRIN. ACRIN is sponsored by the National Cancer Institute and receives additional support from industry partners, other governmental agencies and the ACRIN Fund for Imaging Innovation.
1. Understand the factors that affect the variability and accuracy of PET and SPECT biomarker quantification, with particular emphasis on scanner‐related parameters.
2. Understand the importance of minimizing variability in order to enhance the ability to distinguish signal from noise.
3. Understand the efforts currently underway to standardize acquisition and processing protocols and to monitor equipment performance through credentialing and periodic quality control checks.
34(2007); http://dx.doi.org/10.1118/1.2761377
Imaging as a biomarker for drug response is becoming an increasingly important area of research. There are many sources of uncertainty in the use of imaging as a biomarker for the assessment of drug response. For example, biological variability is a factor that is drug, organ, tumor, and patient dependent, and is thus best addressed through carefully designed clinical trials such as those proposed under NCI‐ and NIH‐wide biomarker initiatives.
However, there is also measurement variability associated with imaging data collection and analysis across different commercial platforms, and uncertainty in the performance of the different software tools employed to measure therapy response, such as the measurement of change in image‐related, computer‐extracted features over time. These hardware and software sources of uncertainty often force an increase in the size of clinical drug trials, and ideally should not be a variable in the measurement of drug response. The development of standardized methods to physically characterize these sources of uncertainty would stimulate the development of improved imaging methods and software tools, as recommended by a recent Trans‐Agency Workshop organized by NIST.
This presentation will review current and potential funding opportunities for medical physicists to become engaged in the development of imaging systems and methods and related imaging standards that have the potential of being used in imaging trials for drug and therapy response.
- Advances in CT Hardware and Algorithms
34(2007); http://dx.doi.org/10.1118/1.2761498
Advancements in CT technology invariably lead to new clinical applications. Helical scanning opened up anatomy outside of the brain. Multislice technology allowed fast volumetric scanning. Sixteen‐slice scanners introduced vascular applications, and sixty‐four‐slice scanners made cardiac scanning a matter of routine practice. In recent years, each of the evolutionary steps in CT technology has sought to improve the way helical scanning is accomplished.
Toshiba Medical Systems has developed a beta version of a 256‐slice CT system that covers nearly 13 cm of anatomy in a single rotation with 0.5 mm slices, allowing complete organ coverage in a single rotation. This system can acquire images of an entire volume at a single, instantaneous time point, which significantly reduces motion artifacts and eliminates contrast phase differences within the volume. Since this system does not require helical acquisition for volumetric imaging, it will deliver significantly less dose for CT coronary angiography exams, as well as reduced dose in most other applications.
With its wide volume coverage, the 256‐slice system promises to revolutionize the way we approach acute stroke patients, the way we look at myocardial perfusion imaging, and the way we image other moving body parts, such as the lung during respiration and peripheral joints in motion.
34(2007); http://dx.doi.org/10.1118/1.2761499
The first commercial CT system, introduced in 1972 under the leadership of Nobel Laureate Sir Godfrey Hounsfield, was designed as a head‐only scanner. Despite long scan times and low image quality, it revolutionized medical diagnostics with its ability to "see" the internal structures of the human brain. In just the few years that followed, commercially released full‐body scanners provided this gift of "sight" for all organs, and this dramatic increase in overall utility immediately displaced the concept of single‐organ devices. For almost 30 years, CT systems have remained general‐purpose machines used for a great variety of diagnostic applications. Advances such as high‐speed gantries, helical scanning, and multi‐slice detectors have brought CT to the point where it is now able to produce diagnostic cardiac images. But full‐body scanners remain large, expensive, complex, and costly to maintain. Against this backdrop, several dedicated scanners have recently emerged for applications such as head/neck, maxillofacial, ENT, dental, spine, hands/feet, and breast imaging. This lecture will discuss the clinical advantages, limitations, and technology that these devices bring to the market. It will also address the challenges that these devices present to medical physics. Join in and learn of the "Renaissance to Hounsfield"!
34(2007); http://dx.doi.org/10.1118/1.2761500
Computed tomography (CT) has been employed as a versatile visualization tool in a wide variety of applications, including human and small animal imaging, industrial non‐destructive testing, materials research, and security scanning. The fast imaging capability and superior spatial/contrast resolution offered by modern CT create tremendous opportunities for developing additional applications and imaging protocols. In the last few years, there have been unprecedented breakthroughs in the development of innovative algorithms for obtaining volumetric images in cone‐beam CT. In this presentation, I will discuss these algorithm advances in cone‐beam CT. Emphasis will be placed on targeted imaging of a region of interest (ROI) and on image reconstruction from incomplete data in CT. Examples will be used to illustrate the current and potential applications promised by the new development of CT algorithms. Finally, I will briefly use examples to illustrate that, although the algorithms to be discussed were developed for CT, they can readily be generalized to address image reconstruction problems in other imaging modalities, including MRI, nuclear medicine imaging, and phase‐contrast CT.
34(2007); http://dx.doi.org/10.1118/1.2761501
Clinical x‐ray computed tomography has grown in importance for all of its applications, most notably evaluation of the head, chest, abdomen, pelvis, and cardiovascular system. CT delivers an increasing fraction of the overall population radiation dose. Limitations remain due to sensitivity to motion, metal artifacts, patient size, and limited functional information at a relatively high radiation dose. Technical developments promise to reduce these constraints, but at significant cost; the most important are large‐area detectors with 64 to 256 detector rows, multiple energy channels, algorithmic improvements, and multimodality systems (especially PET/CT). CT is now the essential (and often the only) radiologic imaging procedure needed to manage many patients with acute or chronic diseases. Its speed and versatility, as well as its reliability and simplicity of operation, ensure that its role will continue for the foreseeable future. CT is used extensively for emergency, cardiovascular, pulmonary, gastrointestinal, endocrine, neurological, orthopedic, and other applications. Further technology development is aimed at common applications where reimbursement for CT scanning services is available or will likely become available. Multicenter clinical trials are underway that compare cardiac CT with other modalities, especially SPECT and cardiac catheterization. The most demanding CT applications are cardiovascular, where complex motion and small morphologic features coexist. Clinical cardiac CT consists of bolus intravenous contrast injection with EKG gating and simultaneous x‐ray scanning. Larger‐area detectors and higher frame acquisition rates partly address, but do not solve, the problems posed by respiratory, random body, and cardiac motion in a spectrum of patients from infant to massively obese adult sizes (< 1 kg to 250 kg or more).
The challenges and pitfalls in CT will be delineated and evaluated relative to current and future technology.
- Breakthrough in MRI: Technology and Applications
34(2007); http://dx.doi.org/10.1118/1.2761546
During the past several years a growing body of research developments might be considered the beginning of a “post‐Nyquist era” in medical imaging. Investigators at Stanford and Caltech have shown that the iterative non‐Fourier “compressed sensing” reconstruction method can recover single images from far fewer than the Nyquist‐required number of samples [1,2], provided the image data satisfy certain sparsity requirements. Recent developments in parallel imaging also permit the reconstruction of relatively artifact‐free images by synthesizing missing k‐space information using sensitivity profiles from multiple coils [3,4]. Our group has been investigating VIPR, a vastly undersampled radial imaging technique [5,6] that provides full spatial resolution almost immediately, after a small number of excitations, at the expense of streak artifacts. In 3D these artifacts are quite incoherent and cause little degradation of image quality. Data undersampling factors of several hundred relative to the Nyquist criterion have been achieved. Recently, VIPR has been combined with a new reconstruction method called HYPR (HighlY constrained PRojection reconstruction) that exploits the spatio‐temporal redundancy in medical image sequences involving any serial change in an imaging variable, such as time, echo time, or diffusion tensor encoding direction. Using HYPR in combination with VIPR, angular undersampling factors on the order of 1000 have been achieved in phase‐contrast angiography with good image quality. A fundamental characteristic of HYPR is its ability to provide higher SNR than competing acceleration techniques. The HYPR technique has been applied to x‐ray CT angiography, where clinically acceptable time‐resolved angiographic image series have been reconstructed using 1/46th of the conventional x‐ray dose.
1. To understand the advantages of 3D radial undersampling for increasing spatial and temporal resolution.
2. To understand how HYPR provides further acceleration with substantial preservation of SNR.
3. To understand how HYPR may be applied in a wide variety of medical imaging applications.
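The core compressed‐sensing idea described above (recovering a signal from far fewer than the Nyquist number of samples by exploiting sparsity) can be sketched with a toy iterative soft‐thresholding reconstruction from undersampled Fourier data. This is a generic illustration, not the VIPR/HYPR implementation; the signal, sampling mask, and regularization parameter below are invented for the example.

```python
import numpy as np

def ista_cs_recon(y, mask, n, lam=0.05, n_iter=200):
    """Recover a sparse real signal from undersampled Fourier samples via
    iterative soft-thresholding (a basic compressed-sensing scheme).
    y    : measured Fourier coefficients at the sampled locations
    mask : boolean array of length n marking which coefficients were kept
    """
    x = np.zeros(n)
    for _ in range(n_iter):
        # Gradient step: enforce consistency with the measured k-space data
        r = np.zeros(n, dtype=complex)
        r[mask] = np.fft.fft(x)[mask] - y
        x = x - np.real(np.fft.ifft(r))
        # Proximal step: soft-threshold to promote sparsity
        x = np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)
    return x

rng = np.random.default_rng(0)
n = 128
x_true = np.zeros(n)
x_true[rng.choice(n, 5, replace=False)] = rng.uniform(1, 2, 5)  # 5 spikes

mask = np.zeros(n, dtype=bool)
mask[rng.choice(n, 48, replace=False)] = True   # keep ~37% of k-space
y = np.fft.fft(x_true)[mask]

x_hat = ista_cs_recon(y, mask, n)
```

Despite the sub‐Nyquist sampling, the sparse spikes are recovered (up to the small shrinkage bias of the soft threshold), because the random Fourier undersampling produces incoherent, noise‐like aliasing that the sparsity prior suppresses.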
WE‐D‐L100F‐02: Parallel Magnetic Resonance Imaging (or, Scanners, Cell Phones, and the Surprising Guises of Modern Tomography)
34(2007); http://dx.doi.org/10.1118/1.2761547
Today, parallel data acquisition approaches are used widely in MRI, both for clinical diagnostic imaging and for research applications. Whereas a traditional sequential MRI scan collects data one point and one line at a time in the presence of varying magnetic field gradients, parallel MRI uses spatial information from arrays of radiofrequency detector coils to acquire multiple data points simultaneously, thereby circumventing basic limits on imaging speed and efficiency associated with traditional sequential approaches. The use of RF coil information in combination with the traditional Fourier information available from field gradients increases the complexity of image reconstruction. In fact, parallel MR image reconstruction may be represented as a generalized linear inverse problem. This formulation highlights connections with other modalities and sheds light on both the potential and the limitations of parallel imaging. In this talk, the fundamentals of parallel MR image acquisition and reconstruction will be reviewed. Analogies with x‐ray computed tomography, MIMO wireless communication, and magnetoencephalography will be explored, and some future directions in parallel MR reconstruction algorithms, hardware design, and clinical applications will be surveyed.
1. Understand the basic physical principles of parallel MR data acquisition and the basic mathematical principles of parallel MR image reconstruction.
2. Recognize analogies with other imaging modalities and communication technologies.
3. Identify fundamental physical (electrodynamic) limits of performance of parallel MRI systems.
4. Appreciate some of the most common current and the most promising future clinical applications of parallel MRI.
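The "generalized linear inverse problem" view of parallel MRI mentioned above can be illustrated with a toy 1D SENSE‐style reconstruction: each coil sees the image weighted by its sensitivity profile, k‐space is undersampled, and the stacked coil equations are solved jointly by least squares. The coil sensitivities and undersampling pattern below are invented for illustration, not measured profiles.

```python
import numpy as np

n = 64
rng = np.random.default_rng(1)
x_true = rng.standard_normal(n)            # unknown image (1D for simplicity)

# Two smooth, linearly independent coil sensitivity profiles (illustrative)
t = np.linspace(0, 1, n)
S = np.stack([np.cos(0.5 * np.pi * t), np.sin(0.5 * np.pi * t)])

F = np.fft.fft(np.eye(n)) / np.sqrt(n)     # unitary DFT matrix
keep = np.arange(0, n, 2)                  # factor-2 k-space undersampling

# Encoding matrix E: undersampled Fourier transform of each coil-weighted
# image, stacked over coils; y is the corresponding measurement vector.
E = np.vstack([F[keep] @ np.diag(S[c]) for c in range(2)])
y = E @ x_true

# Each coil alone is undersampled (aliased), but the joint system is
# invertible: solve the generalized linear inverse problem by least squares.
x_hat, *_ = np.linalg.lstsq(E, y, rcond=None)
```

The same least‐squares formulation extends directly to other linear modalities, which is the source of the analogies with CT and MIMO communication drawn in the talk.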
WE‐D‐L100F‐03: Direct and Indirect Magnetic Resonance Visualization of Tissue Architecture and Function: From Micro to Nanostructure
34(2007); http://dx.doi.org/10.1118/1.2761548
“Form follows function” is one of the most fundamental principles underlying the evolution of all organisms. Thus the desire to visualize tissue architecture has been a key driver behind all forms of microscopy, starting with the magnifying lens and leading to optical and, eventually, electron microscopy. During the past two decades, methods have emerged that allow nondestructive imaging of the internal 3D structure of tissues by micro magnetic resonance imaging (μ‐MRI) and micro computed tomography (μ‐CT) at resolutions of 5–50 μm. MRI's unique sensitivity to biologic processes, such as the interaction of water with biomolecules, makes it particularly attractive as an investigational tool in biomedicine. However, the practically achievable resolution is determined by signal‐to‐noise ratio and, ultimately, diffusion, and in vivo by our ability to correct for physiologic motion. Although sub‐micrometer structure lies below the resolution limit of k‐space μ‐MRI, indirect detection techniques such as q‐space imaging, which exploit restricted diffusion, can in some instances provide quantitative information at sub‐μm resolution. This lecture provides an overview of the methodology and discusses applications ranging from quantifying the architectural and mechanical changes of trabecular bone in response to intervention in humans, to measurement of axon diameters in the mouse spinal cord by q‐space MRI using 50 T/m home‐built gradients.
- Imaging for Therapy Assessment
34(2007); http://dx.doi.org/10.1118/1.2761550
The most commonly used method to assess treatment response relies on measuring tumor sizes before and after treatment and classifying anatomical tumor shrinkage according to RECIST or WHO criteria. However, there is considerable variability between individual studies, and the same response rate can be associated with completely different survival rates. Furthermore, it is known that changes in tumor biological function significantly precede gross anatomical tumor changes. Positron emission tomography (PET) is the most sensitive, specific, and versatile imaging modality that can be used for this purpose.
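The anatomical classification referred to above can be sketched as a simple rule on sums of target‐lesion longest diameters. The thresholds follow the commonly cited RECIST rules (complete disappearance for CR, at least a 30% decrease from baseline for PR, at least a 20% increase over the nadir for PD); the function name and interface are hypothetical.

```python
def recist_response(baseline_sum_mm, followup_sum_mm, nadir_sum_mm=None):
    """Classify anatomical response from sums of longest target-lesion
    diameters, using RECIST-style thresholds (illustrative sketch)."""
    if followup_sum_mm == 0:
        return "CR"   # complete response: all target lesions disappeared
    nadir = nadir_sum_mm if nadir_sum_mm is not None else baseline_sum_mm
    if followup_sum_mm >= 1.2 * nadir:
        return "PD"   # progressive disease: >=20% increase over nadir
    if followup_sum_mm <= 0.7 * baseline_sum_mm:
        return "PR"   # partial response: >=30% decrease from baseline
    return "SD"       # stable disease: neither PR nor PD

print(recist_response(100, 65))  # 35% shrinkage -> "PR"
```

Note that progression is checked against the nadir, not the baseline, so a tumor that shrank and then regrew can be classified PD even while still smaller than at baseline; this asymmetry is one source of the variability between studies noted above.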
18F‐Fluorodeoxyglucose (FDG) is the most commonly used PET imaging agent for treatment assessment. FDG shows regions of active glucose metabolism, which typically decrease after tumor cells die in response to antineoplastic therapies. Besides FDG, several other PET agents exist that are more specific in cell targeting and can image different aspects of the biological response to therapy. Cell proliferation, apoptosis, and angiogenesis are processes typically affected by antineoplastic therapies. Currently, the most promising non‐FDG agent is 18F‐fluorodeoxythymidine (FLT), a marker of cell proliferation. PET imaging agents of apoptosis and angiogenesis are still mostly limited to preclinical studies. Use of PET imaging for treatment assessment imposes special requirements on image acquisition, reconstruction, and analysis.
FDG‐PET has shown great promise for assessing treatment efficacy, both during and after the course of therapy. Depending on the metabolic response, it is possible to classify patients into metabolic responders, who typically have much longer survival than metabolic non‐responders. Unfortunately, FDG is not without problems. Two of the most severe are (1) radiation‐induced inflammation during radiation therapy and (2) the metabolic flare that occurs early after the start of some chemotherapies. FLT‐PET, which assesses proliferative response, seems to overcome these problems; however, more clinical studies are needed to prove its wider applicability. Reproducible and accurate PET image acquisition, reconstruction, and analysis are the key components required for quantitative PET imaging, which provides the foundation for treatment assessment.
This symposium reviews the status of treatment assessment studies that involve repeat PET imaging during and after therapy. It discusses the advantages and disadvantages of FDG and non‐FDG PET imaging for treatment assessment. It also emphasizes the importance of appropriate PET image acquisition, reconstruction, and analysis, which forms the basis for PET image quantification.
1. Overview of FDG‐PET imaging for treatment assessment.
2. Overview of non‐FDG‐PET imaging for treatment assessment.
3. Review PET image acquisition, reconstruction, and analysis for treatment assessment.
4. Discuss future of PETimaging for treatment assessment.