Index of content:
Volume 34, Issue 6, June 2007
- Imaging Symposium: Room L100F
- Advanced Topics in CT: Who's Minding the Dose?
34 (2007); http://dx.doi.org/10.1118/1.2761700
This three-part symposium will provide an update on the important topic of CT radiation dose, which continues to receive increased scrutiny from a growing proportion of the medical field.
The first talk is entitled “Radiation dose considerations in pediatric CT imaging,” and will be delivered by Dr. Donald Frush, an eminent expert in pediatric radiology. There are unique considerations when performing CT in infants and children, including children's greater sensitivity to radiation, radiation exposure, and dose assessment for this technology. These special technical considerations make pediatric CT in many ways more demanding than adult CT. Accurate dose assessment is important for determining the potential risks that are essential to the risk-benefit analysis in this younger population.
The second talk is entitled “CT dose measurements: The good, the bad, and the ugly,” and will be delivered by Dr. John Boone, an esteemed medical physicist. The theory behind the current method for assessing radiation dose in CT (the CT dose index) will be reviewed. This method has many positive features (“the good”) and a growing number of limitations (“the bad”). Several groups have been formed to develop a different approach to assessing CT radiation dose, and many issues must be carefully considered (“the ugly”). Dr. Boone will provide his perspective on these and other interesting developments in the area of CT dose assessment.
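The CT dose index framework reviewed in this talk rests on a few standard relations; a minimal sketch is below, using made-up pencil-chamber readings (not values from the talk) to show how the weighted CTDI, volume CTDI, and dose-length product are conventionally combined.

```python
def ctdi_w(ctdi_center, ctdi_periphery):
    """Weighted CTDI (mGy) from 100 mm pencil-chamber readings at the
    centre and periphery of the dosimetry phantom."""
    return ctdi_center / 3.0 + 2.0 * ctdi_periphery / 3.0

def ctdi_vol(ctdi_w_value, pitch):
    """Volume CTDI (mGy): weighted CTDI corrected for helical pitch."""
    return ctdi_w_value / pitch

def dlp(ctdi_vol_value, scan_length_cm):
    """Dose-length product (mGy*cm) over the scanned length."""
    return ctdi_vol_value * scan_length_cm

# Illustrative chamber readings only (assumptions, not data from the talk):
w = ctdi_w(10.0, 12.0)           # 11.33 mGy
v = ctdi_vol(w, 1.0)             # pitch 1.0 leaves CTDIw unchanged
print(round(dlp(v, 30.0), 1))    # 340.0 mGy*cm for a 30 cm scan
```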
The third talk is entitled “Custom patient-based CT radiation dose estimations,” and will be delivered by Dr. Michael McNitt-Gray, also an esteemed medical physicist. An approach that appears to be gaining in popularity for CT patient dose estimation is the use of computer models and scan simulations. This type of methodology requires a substantial investment in program development, but can result in a very flexible approach for exploring alternative technique design strategies. Computer modeling may find a more widespread role in CT dose assessment in the future, so it is important to understand its limitations in addition to its advantages.
- Advances in CT Hardware and Algorithms
34 (2007); http://dx.doi.org/10.1118/1.2761498
Advancements in CT technology invariably lead to new clinical applications. Helical scanning opened up anatomy outside of the brain. Multislice technology allowed fast volumetric scanning. Sixteen-slice scanners introduced vascular applications, and sixty-four-slice scanners made cardiac scanning a matter of routine practice. In recent years, each of the evolutionary steps in CT technology has sought to improve the way helical scanning is accomplished.
Toshiba Medical Systems has developed a beta version of a 256-slice CT system that covers nearly 13 cm of anatomy in a single rotation with 0.5 mm slices, allowing complete organ coverage in a single rotation. This system can acquire images of an entire volume at a single, instantaneous time point, which significantly reduces motion artifacts and eliminates contrast phase differences within the volume. Since this system does not require helical acquisition for volumetric imaging, it will deliver significantly less dose for CT coronary angiography exams, as well as reduced dose in most other applications.
With its wide volume coverage, the 256 slice system promises to revolutionize the way we approach acute stroke patients, the way we look at myocardial perfusion imaging, and the way we image other moving body parts such as the lung during respiration and peripheral joint motion.
34 (2007); http://dx.doi.org/10.1118/1.2761499
The first commercial CT system, introduced in 1972 under the leadership of Nobel laureate Sir Godfrey Hounsfield, was designed as a head-only scanner. Despite long scan times and low image quality, it revolutionized medical diagnostics with its ability to “see” the internal structures of the human brain. Within just a few years, full-body scanners reached commercial release, extending this gift of “sight” to all organs. This dramatic increase in overall utility immediately displaced the concept of single-organ devices. For almost 30 years, CT systems have remained general-purpose machines used for a great variety of diagnostic applications. Advances such as high-speed gantries, helical scanning, and multi-slice detectors have brought CT to the point where it can now produce diagnostic cardiac images. But full-body scanners remain large, expensive, complex, and costly to maintain. Against this backdrop, several dedicated scanners have recently emerged for specific anatomy: head/neck, maxillofacial, ENT, dental, spine, hands/feet, and breast. This lecture will discuss the clinical advantages, limitations, and technology that these devices bring to the market. It will also address the challenges that these devices present to medical physics. Join in and learn of the “Renaissance to Hounsfield”!
34 (2007); http://dx.doi.org/10.1118/1.2761500
Computed tomography (CT) has been employed as a versatile visualization tool in a wide variety of applications, including human and small animal imaging, industrial non-destructive testing, materials research, and security scanning. The fast imaging capability and superior spatial/contrast resolution offered by modern CT create tremendous opportunities for developing additional applications and imaging protocols. In the last few years there have been unprecedented breakthroughs in the development of innovative cone-beam imaging algorithms for obtaining volumetric images in cone-beam CT. In this presentation, I will discuss the algorithmic advances in cone-beam CT over the last few years. Emphasis will be placed on targeted imaging of a region of interest (ROI) and on image reconstruction from incomplete data in CT. Examples will illustrate the current and potential applications promised by these new CT algorithms. Finally, I will briefly use examples to illustrate that, although the algorithms to be discussed were developed for CT, they can readily be generalized to address image reconstruction problems in other imaging modalities, including MRI, nuclear medicine imaging, and phase-contrast CT.
34 (2007); http://dx.doi.org/10.1118/1.2761501
Clinical x-ray computed tomography has grown in importance for all of its applications, but most importantly for evaluation of the head, chest, abdomen, pelvis, and cardiovascular system. CT delivers an increasing fraction of the overall population radiation dose. Many limitations remain, owing to sensitivity to motion, metal artifacts, patient size, and limited functional information at a relatively high radiation dose. Technical developments promise to reduce these constraints, but at a significant cost. Most important are large-area detectors with 64 to 256 detector rows, multiple energy channels, algorithmic improvements, and multimodality systems (especially PET/CT). CT is now the essential (and often the only) radiologic imaging procedure needed to manage many patients with acute or chronic diseases. Its speed and versatility, as well as its reliability and simplicity of operation, ensure that its role will continue for the foreseeable future. CT is used extensively for emergency, cardiovascular, pulmonary, gastrointestinal, endocrine, neurological, orthopedic, and other applications. Further technology development is aimed at common applications where reimbursement for CT scanning services is available or will likely become available. Multicenter clinical trials are underway that compare cardiac CT with other modalities, especially SPECT and cardiac catheterization. The most demanding CT applications are cardiovascular, where complex motion and small morphologic features coexist. Clinical cardiac CT consists of bolus intravenous contrast injection with EKG gating and simultaneous x-ray scanning. Larger-area detectors and higher frame acquisition rates partly address, but do not solve, the problems arising from respiratory, random body, and cardiac motion in a spectrum of patients from infants to massively obese adults (< 1 kg to 250 kg or more).
The challenges and pitfalls in CT will be delineated and evaluated relative to current and future technology.
- Advances in X‐ray Imaging
34 (2007); http://dx.doi.org/10.1118/1.2761220
Since Roentgen discovered x-rays and performed the first x-ray imaging over 100 years ago, x-ray medical imaging has been based on differences in x-ray attenuation between biological tissues. However, x-ray–tissue interaction causes x-ray phase shifts as well, and the phase-shift differences between tissues are about one thousand times larger than their attenuation differences. Phase-sensitive x-ray imaging hence allows not only visualization of tissues with very low attenuation contrast, but also quantification of tissues' projected electron densities by means of phase retrieval.
In this presentation we first introduce the concept of spatial coherence of the x-ray wavefield and elucidate the mechanism of x-ray phase-contrast formation. We then review recent progress in in-line phase-sensitive x-ray imaging.
1. Elucidate the mechanism of x-ray phase-contrast formation.
2. Review recent progress in in-line phase-sensitive x-ray imaging.
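As an order-of-magnitude illustration of the phase-versus-attenuation claim above, the sketch below plugs assumed, soft-tissue-like values for the refractive-index decrement (delta) and absorption index (beta) into the standard relations for phase shift and attenuation; the specific numbers are placeholders, not values from the talk.

```python
import math

# Order-of-magnitude soft-tissue values near 20 keV (assumed, not measured):
# complex refractive index n = 1 - delta + i*beta.
wavelength = 6.2e-11     # metres; ~20 keV photon
delta = 1e-7             # refractive-index decrement (phase term)
beta = 1e-10             # absorption index (attenuation term)
thickness = 1e-3         # 1 mm of tissue

phase_shift = 2 * math.pi * delta * thickness / wavelength   # radians
mu = 4 * math.pi * beta / wavelength                         # 1/m
attenuation_exponent = mu * thickness                        # dimensionless

print(round(delta / beta))             # ~1000: the ratio quoted in the text
print(round(phase_shift, 1))           # ~10 rad of phase across 1 mm
print(round(attenuation_exponent, 3))  # only ~0.02 worth of attenuation
```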
34 (2007); http://dx.doi.org/10.1118/1.2761221
Purpose: To develop a new field emission x-ray source with multi-pixel and multiplexing capability for high-speed computed tomography. Method and Materials: The new x-ray source uses carbon nanotubes (CNTs) as a “cold cathode” to replace the thermionic cathode used in the conventional x-ray tube. The x-ray tube current is generated by applying an external electrical field to extract electrons from the carbon nanotubes. By controlling the triggering signal, x-ray radiation with programmable waveforms can be readily generated. One- and two-dimensionally pixelated x-ray sources with spatially distributed focal spots are constructed using a matrix-addressable multi-pixel CNT cathode. The electronics for controlling and integrating the x-ray source have been demonstrated. Results: A new x-ray source based on CNT field emission technology has been developed. The x-ray source is capable of generating the flux, energy, and spatial resolution required for medical imaging. A micro-focus x-ray source with a 30–50 μm isotropic focal spot size has also been developed. A micro-CT scanner using the field emission x-ray source was constructed for imaging small animal models with respiratory and cardiac gating capability. The results from preliminary studies will be discussed. Conclusion: The CNT-based field emission x-ray source offers unique capabilities, such as spatially distributed x-ray pixels, that are attractive for high-speed CT imaging. It enables collection of multiple projection images from different viewing angles, either sequentially or simultaneously, without mechanical motion, which can potentially lead to stationary (gantry-free) CT scanners for high-resolution, high-speed imaging.
34 (2007); http://dx.doi.org/10.1118/1.2761222
Digital tomosynthesis is one of the most exciting recent developments in breast imaging. By modifying existing full-field digital mammography systems, one can achieve this type of limited-angle cone-beam CT imaging, which produces 3D slice images of the breast. Overlapping dense tissue in mammography is one of the most common causes of unnecessary callbacks as well as missed cancers. Since 3D images remove such overlapping tissue, breast tomosynthesis can improve radiologists' specificity by obviating unnecessary callbacks. It can also improve sensitivity by allowing easier detection and characterization of breast cancers that might otherwise be obscured. Most remarkably, tomosynthesis can achieve all this with a scan comparable in speed, resolution, cost, and dose to conventional mammography. For these reasons, tomosynthesis stands poised as the only imaging technique with the potential to completely replace mammography in its current role as the primary tool in breast cancer screening and diagnosis.
This presentation will cover both the hype and hope surrounding breast tomosynthesis. From a medical physics perspective, the latest results will be reviewed from recent studies to optimize radiographic techniques, acquisition modes, and reconstruction algorithms. In addition, emerging results will be surveyed from advanced applications including display/visualization, computer aided detection, and contrast enhanced tomosynthesis. Finally, the clinical promise and risks of this new technology will be discussed using initial clinical trial results.
This research was supported in part by research grants from Siemens Medical Solutions, US Army Breast Cancer Research Program, and NIH/NCI.
1. Understand the difference between breast tomosynthesis and dedicated breast CT.
2. Appreciate the many medical physics issues involved in the development and optimization of breast tomosynthesis.
3. Understand the clinical promise and concerns of using breast tomosynthesis.
34 (2007); http://dx.doi.org/10.1118/1.2761224
While the vast majority of x-ray imaging procedures use a single x-ray source, a growing number require measurement of x-ray transmission from different locations. These exams are generally performed by moving a single x-ray source to many positions; examples include computed tomography (CT) and tomosynthesis. At the same time, several technologies are becoming more available that enable the construction of source arrays, i.e., a large number of sources that can be turned on in sequence. The purpose of this presentation is to explore the capabilities and benefits of imaging with these source arrays.
In CT, small deflections of the x-ray focal spot (on the order of a millimeter) have been used to improve data sampling. Multiple sources, or much larger deflections (on the order of centimeters) in the axial direction, can be quite useful for increasing the thickness of the imaging slab in a single scan while reducing cone beam artifacts. Distributing sources in the circumferential direction can dramatically reduce the mechanical rotation needed to obtain a complete data set and can therefore shorten the scan time. The recently introduced dual-source CT system is the most recent step in this direction. The limiting case is CT scanning with no mechanical rotation, as in electron beam CT. An array of sources separated in the circumferential direction can also be used to reduce the size of the detector array. This can lead to a reduction in detected x-ray scatter and perhaps to the adoption of more advanced detector technologies. It also provides a reasonable method for optimizing the incident intensity distribution, thereby potentially reducing the dose to the patient while also tailoring the distribution of quantum noise across the image.
In tomosynthesis, as in CT, measurements from different directions are used to produce tomographic information, albeit with imperfect spatial separation due to data insufficiency. Nonetheless, tomosynthesis can be powerful in many settings where obtaining complete CT data is difficult, and it is gaining popularity. Data acquisition with mechanical motion of a single source limits both the imaging time and the number of projections that can be obtained. A small number of projections makes the blurring function (the “crosstalk” from one location to another caused by imperfect spatial separation) very discrete and potentially distracting to the viewer. Source arrays can produce tomosynthesis images with very short imaging times and smooth blurring functions, both of which are preferred.
Thus, there are significant benefits from the use of distributed sources. The author anticipates that these source arrays will become more widely adopted.
Research sponsored by GE Healthcare.
1. Become familiar with the types of source arrays and their capabilities.
2. Understand the benefits of source arrays in computed tomography.
3. Understand the benefits of source arrays in tomosynthesis.
- Breakthrough in MRI: Technology and Applications
34 (2007); http://dx.doi.org/10.1118/1.2761546
During the past several years there has been a growing body of research that might be considered the beginning of a “post-Nyquist era” in medical imaging. Investigators at Stanford and Caltech have shown that the iterative non-Fourier “compressed sensing” reconstruction method can recover single images from far fewer samples than the Nyquist criterion requires [1,2], for image data sets that satisfy certain sparsity requirements. Recent developments in parallel imaging also permit the reconstruction of relatively artifact-free images by synthesizing missing k-space information using sensitivity profiles from multiple coils [3,4]. Our group has been investigating the use of VIPR, a vastly undersampled radial imaging technique [5,6] that provides full spatial resolution almost immediately, after a small number of excitations, at the expense of streak artifacts. In 3D these artifacts are quite incoherent and degrade image quality little. Data undersampling factors of several hundred relative to the Nyquist criterion have been achieved. Recently VIPR has been combined with a new reconstruction method called HYPR (HighlY constrained PRojection reconstruction) that exploits the spatio-temporal redundancy in medical image sequences involving any serial change in an imaging variable, such as time, echo time, or diffusion tensor encoding direction. Using HYPR in combination with VIPR, angular undersampling factors on the order of 1000 have been achieved in phase contrast angiography with good image quality. A fundamental characteristic of HYPR is its ability to provide higher SNR than competing acceleration techniques. The HYPR technique has also been applied to x-ray CT angiography, where clinically acceptable time-resolved angiographic image series have been reconstructed using 1/46th of the conventional x-ray dose.
1. To understand the advantages of 3D radial undersampling for increasing spatial and temporal resolution.
2. To understand how HYPR provides further acceleration with substantial preservation of SNR.
3. To understand how HYPR may be applied in a wide variety of medical imaging applications.
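A rough feel for the quoted undersampling factors comes from the standard radial Nyquist estimate, which calls for roughly (π/2)·N projections for an N×N matrix; the sketch below uses made-up acquisition numbers, not the VIPR protocol's actual parameters.

```python
import math

def nyquist_projections(matrix_size):
    """Approximate number of radial projections needed to satisfy the
    Nyquist criterion at the edge of k-space for a square matrix."""
    return math.ceil(math.pi / 2 * matrix_size)

def undersampling_factor(matrix_size, acquired_projections):
    """How far below the radial Nyquist requirement an acquisition falls."""
    return nyquist_projections(matrix_size) / acquired_projections

# Assumed example numbers (not from the talk): a 256 matrix nominally
# needs ~400 projections, so acquiring only 16 undersamples by ~25x.
print(nyquist_projections(256))                 # 403
print(round(undersampling_factor(256, 16), 1))  # 25.2
```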
WE-D-L100F-02: Parallel Magnetic Resonance Imaging (or, Scanners, Cell Phones, and the Surprising Guises of Modern Tomography)
34 (2007); http://dx.doi.org/10.1118/1.2761547
Today, parallel data acquisition approaches are used widely in MRI, both for clinical diagnostic imaging and for research applications. Whereas in a traditional sequential MRI scan data are collected one point and one line at a time in the presence of varying magnetic field gradients, parallel MRI uses spatial information from arrays of radiofrequency detector coils to acquire multiple data points simultaneously, thereby circumventing basic limits on imaging speed and efficiency associated with traditional sequential approaches. The use of RF coil information in combination with the traditional Fourier information available from field gradients increases the complexity of image reconstruction. In fact, parallel MR image reconstruction may be represented as a generalized linear inverse problem. This formulation highlights connections with other modalities and sheds light on both the potential and the limitations of parallel imaging. In this talk, the fundamentals of parallel MR image acquisition and reconstruction will be reviewed. Analogies with x-ray computed tomography, MIMO wireless communication, and magnetoencephalography will be explored, and some future directions in parallel MR reconstruction algorithms, hardware design, and clinical applications will be surveyed.
1. Understand the basic physical principles of parallel MR data acquisition and the basic mathematical principles of parallel MR image reconstruction.
2. Recognize analogies with other imaging modalities and communication technologies.
3. Identify fundamental physical (electrodynamic) limits of performance of parallel MRI systems.
4. Appreciate some of the most common current and the most promising future clinical applications of parallel MRI.
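The “generalized linear inverse problem” view can be illustrated with a minimal SENSE-style unfolding for twofold acceleration: each aliased pixel is a coil-sensitivity-weighted sum of two true pixels, recovered by least squares. The sensitivities and pixel values below are synthetic placeholders, not data from any scanner.

```python
import numpy as np

# Minimal SENSE-style unfolding for acceleration R = 2 with two coils.
# S[c, p] is the (made-up) sensitivity of coil c at pixel p; real systems
# estimate these from calibration scans.
S = np.array([[0.9, 0.2],
              [0.1, 0.8]])
true_pixels = np.array([1.0, 0.3])   # the two pixels that alias together

aliased = S @ true_pixels            # one aliased measurement per coil
recovered, *_ = np.linalg.lstsq(S, aliased, rcond=None)
print(np.allclose(recovered, true_pixels))   # True: unfolding succeeds
```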
WE-D-L100F-03: Direct and Indirect Magnetic Resonance Visualization of Tissue Architecture and Function: From Micro to Nanostructure
34 (2007); http://dx.doi.org/10.1118/1.2761548
“Form follows function” is one of the most fundamental principles underlying the evolution of all organisms. Thus the desire to visualize tissue architecture has been a key driver behind all forms of microscopy, starting with the magnifying lens and leading to optical and, eventually, electron microscopy. During the past two decades, methods have emerged that allow nondestructive imaging of the internal 3D structure of tissues by micro magnetic resonance imaging (μ-MRI) and computed tomography (μ-CT) at resolutions of 5–50 μm. MRI's unique sensitivity to biologic processes, such as the interaction of water with biomolecules, makes it particularly attractive as an investigational tool in biomedicine. However, the practically achievable resolution is determined by the signal-to-noise ratio and, ultimately, diffusion, and in vivo by our ability to correct for physiologic motion. Though below the resolution limit of k-space μ-MRI, indirect detection techniques such as q-space imaging, which exploit restricted diffusion, can provide quantitative information at sub-μm resolution in some instances. This lecture is intended to provide an overview of the methodology and to discuss applications ranging from quantifying the architectural and mechanical changes of trabecular bone in response to intervention in humans, to measuring axon diameters in the mouse spinal cord by q-space MRI using 50 T/m home-built gradients.
- Breast Imaging: Updates on Technology Beyond Mammography
34 (2007); http://dx.doi.org/10.1118/1.2761331
Screen-film mammography will gradually give way to digital mammography systems, which were shown in the DMIST trial to perform slightly better for women with dense breasts. New developments in breast imaging technology have advanced beyond mammography toward potentially better technologies. In this presentation, some of the basic science foundations for these advanced technology systems will be discussed. Mirroring the changes occurring in general radiology practice, where tomographic imaging modalities such as computed tomography (CT) are replacing more traditional projection radiographic techniques, tomographic methods for breast imaging are also becoming practical with advances in digital detector systems and computer-based algorithms. The presentations in this symposium will highlight some of the advancements which have taken place in these and other areas.
A number of groups around the country have been developing computed tomography systems for breast imaging based upon flat-panel detectors, using cone-beam CT of the pendant breast. While demonstration of the comparative performance of breast CT versus digital mammography is perhaps several years in the future, clinical images acquired during Phase II breast CT trials may shed light on the ultimate performance of breast CT relative to digital mammography. One of the central motivations for tomographic imaging of the breast is that the anatomical noise due to normal breast parenchyma is reduced, owing to the tomographic nature of breast CT. Although the noise power spectrum (NPS) is typically used to characterize the quantum and electronic noise of an imaging system, this metric can also be used to evaluate the anatomical noise characteristics of imaging systems. Toward this end, a Phase II clinical trial on breast CT was used to acquire projection imaging data (mammographic equivalent) as well as reconstructed CT images of the breast. NPS analysis was used to characterize the frequency-dependent anatomical noise in these different imaging modalities. Consistent with the work of Burgess, it was found that mammographic projection image data obey a power law, NPS(f) = alpha × f^(−beta), where beta is approximately 2.8. Theoretical predictions suggest that breast CT images would exhibit a similar power-law relationship for the anatomical noise in the breast, although the value of beta would be reduced by 1 (beta′ = beta − 1). Using a dataset of images from 40 patients undergoing breast CT, the NPS was calculated on projection breast images (“mammograms”) as well as the reconstructed breast CT images, and the results indicate that the reduction in beta is on the order of 1.3 (beta′ = beta − 1.3).
These data will be presented and, following Burgess' work, it will be argued from these assessments of anatomical noise that breast CT images may offer better detection performance than digital mammography.
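The power-law characterization described above amounts to a straight-line fit in log-log coordinates, whose slope gives −beta. A minimal sketch with synthetic data (beta = 2.8, matching the quoted mammographic value; not patient data):

```python
import numpy as np

# Synthetic radially averaged NPS obeying NPS(f) = alpha * f**(-beta);
# a straight-line fit of log(NPS) against log(f) recovers -beta as slope.
alpha, beta = 1.0, 2.8
f = np.linspace(0.1, 2.0, 50)        # spatial frequency samples
nps = alpha * f ** (-beta)

slope, intercept = np.polyfit(np.log(f), np.log(nps), 1)
beta_fit = -slope
print(round(beta_fit, 2))            # 2.8
```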
One limitation of mammography, as well as of breast CT, is that these modalities image the anatomy of the breast; breast cancer lesions must therefore be identified by their morphologic characteristics. With the advent of molecular medicine techniques, positron emission tomography (PET) of the breast is being explored at a number of institutions. A PET/CT system designed specifically for imaging the breast was evaluated in a number of patients. A high-resolution PET detector system was fabricated and integrated into an existing breast CT scanner. In this preliminary investigation, a number of patients were imaged using the breast PET/CT system. A description of the technology development, as well as a survey of the hybrid image datasets, will be presented. While these data are too preliminary to support statements about the ultimate clinical utility of PET/CT of the breast, the incorporation of molecular imaging techniques for breast cancer evaluation and staging represents a significant step toward more sensitive and patient-specific imaging protocols.
The incorporation of computer-aided diagnosis (CAD) systems into digital mammography systems has been a significant advance in the use of image processing to augment radiologists' detection performance. While recent publications have described the limitations of CAD routines in the hands of radiologists who may not specialize in breast imaging, there is a general consensus in the breast imaging community that CAD has a role to play, and will most likely have an increasingly important one, in breast cancer detection. The evolution of CAD routines and their past and present performance levels will be reviewed. While the use of CAD in projection digital mammography is most common, researchers in this field are developing advanced CAD algorithms for the detection of breast cancer in three-dimensional tomographic data of the breast, including magnetic resonance imaging (MRI) and breast CT. The development of CAD routines for projection mammography as well as for tomographic breast imaging systems will be described.
Breast imaging is at an interesting crossroads, where advancements in technology from other areas of human imaging are being applied to breast cancer screening, diagnosis, and staging. In this symposium, a description of the potential advancements based upon these new technologies will be presented by leaders in their respective fields. While there is some degree of speculation in these research-oriented presentations, this symposium may well chronicle the advancements of breast CT imaging and diagnostic systems 10 years into the future.
- Current Status and Future Development in Breast Ultrasound Imaging
34 (2007); http://dx.doi.org/10.1118/1.2761264
Methods for imaging the elastic properties of soft tissues have undergone rapid development in recent years, facilitated in part by the ever-increasing capabilities of ultrasound imaging systems. This presentation will describe the basic mechanics of soft tissues and the terminology used in elasticity imaging (e.g., stress, strain, elastic modulus). Key developments in motion tracking for elasticity imaging will be reviewed, with emphasis on the current state of the art in motion tracking algorithms for accurate estimation of displacement fields. That discussion sets the stage for an analysis of soft tissue mechanics as seen via real-time elasticity (mechanical strain) images of breast tissues during relatively large deformations. For example, many benign lesions tend to lose contrast in strain images as deformation increases; that behavior can be understood from the measured mechanical properties of in vitro tissue samples. Unlike benign lesions, most cancerous tumors tend to maintain a large negative contrast with increasing deformation, and that too can be understood from in vitro measurements. The utility of this information is summarized with a review of recent clinical trials of breast elasticity imaging. Specifically, a recent multi-institutional, multi-observer study demonstrated that elasticity imaging increases the diagnostic confidence of breast ultrasound radiologists when attempting to classify a tumor as either “benign” or “malignant.” The presentation will conclude with a brief discussion of prospects for future enhancements and improvements in elasticity image formation and information content.
1. Understand the basic vocabulary used to describe simple solid mechanics.
2. Understand the methods used to form real-time elasticity images using ultrasound and the potential for significant improvements in the future.
3. Understand the basic criteria for interpreting breast elasticity images.
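At their simplest, the strain images discussed above are spatial derivatives of a tracked displacement field. A toy sketch under a uniform-compression assumption (synthetic values, not clinical data; real systems estimate the displacement by tracking echoes between frames):

```python
import numpy as np

# Toy model: uniform 1% compression gives displacement u(z) = -strain * z,
# and the strain image is the negative axial gradient of the displacement.
true_strain = 0.01
z = np.linspace(0.0, 40.0, 200)       # depth, mm
displacement = -true_strain * z       # motion toward the transducer

est_strain = -np.gradient(displacement, z)
print(np.allclose(est_strain, true_strain))   # True for this ideal case
```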
34 (2007); http://dx.doi.org/10.1118/1.2761265
In this paper, we describe the principles of an imaging technique that produces a map of the mechanical response of an object to a force applied at each point. The method uses ultrasound radiation force to remotely exert a localized oscillating stress field at a desired frequency within (or on the surface of) an object. Harmonic radiation force is produced by mixing two ultrasound beams of different frequencies at their focal point; the resulting radiation force occurs at the difference and the sum of the two frequencies. In response to this force, a part of the object vibrates. The size of this part and the motion pattern depend on the object's viscoelastic and reflection characteristics. The acoustic field resulting from object vibration at the difference frequency, which we refer to as “acoustic emission,” is detected by a sensitive hydrophone and used to form an image of the object representing the magnitude, phase, or frequency content of the signal at each point over a raster-scanned region. This method benefits from the high spatial definition of ultrasound radiation force and the high motion-detection sensitivity offered by the hydrophone. The images have no speckle and are of high contrast. We call this technique ultrasound-stimulated vibro-acoustography (USVA). The method has been applied to imaging lesions in the breast, detecting calcification in vessels, detecting brachytherapy seeds in the prostate, and measuring modal vibration of vessels.
1. Understand radiation pressure.
2. Understand vibro‐acoustography.
3. Understand possible applications of vibro‐acoustography in medical imaging.
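The frequency mixing described above can be checked numerically: radiation force follows beam intensity, so squaring the sum of two tones produces spectral components at the difference and sum of the two frequencies. The frequencies and sample rate below are arbitrary illustrative values, not parameters from the presentation.

```python
import numpy as np

fs = 20_000_000                  # sample rate, Hz (illustrative)
t = np.arange(0, 0.001, 1 / fs)  # 1 ms of signal
f1, f2 = 3.00e6, 3.05e6          # two confocal beams; difference = 50 kHz

pressure = np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)
force = pressure ** 2            # radiation force ~ intensity ~ p^2

spec = np.abs(np.fft.rfft(force))
freqs = np.fft.rfftfreq(len(t), 1 / fs)

# The strongest low-frequency component (below 1 MHz, excluding DC)
# lands at the difference frequency f2 - f1 = 50 kHz.
low = (freqs > 0) & (freqs < 1e6)
beat = freqs[low][np.argmax(spec[low])]
```

The object vibrates at this low difference frequency, which is what the hydrophone picks up as acoustic emission; the MHz-range components are far above the mechanical response of tissue.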
34 (2007); http://dx.doi.org/10.1118/1.2761267
Several ultrasonic parameters are known to be temperature dependent, including the speed of sound, attenuation, and the nonlinearity parameter. Local changes in tissue temperature can be estimated noninvasively using diagnostic pulse‐echo ultrasound. A number of temperature estimation methods have been proposed since the early 1990s, with varying levels of success. One method investigated by numerous groups is based on minute changes in echo locations due to temperature‐dependent changes in the speed of sound and thermal expansion. Both in vitro and in vivo results have been obtained to demonstrate the feasibility of the method. However, several factors have been shown to limit the use of this method in practice. In this paper, we describe a true two‐dimensional displacement tracking algorithm with robust temperature estimation from real‐time 2D pulse‐echo ultrasound. The method employs a physics‐based Kalman filter derived from the transient bioheat transfer equation (BHTE). The filter is shown to be effective in rejecting displacement artifacts from tissue motion as well as those resulting from thermal lens effects.
This lecture will describe the BHTE and the systems approach leading to the design of the Kalman filter for tracking temperature based on the observed echo‐location displacements in the region of interest. The tracking properties of the filter and its ability to reject artifacts from the observed displacements will be illustrated. One‐dimensional and two‐dimensional versions of the filter will be presented and contrasted in terms of their ability to reject displacement artifacts. Comparisons with previously published algorithms will be given using in vivo and in vitro data.
We will also present results on using the Kalman filter methodology to track changes in tissue absorption associated with lesion formation using high‐intensity focused ultrasound (HIFU). This technique may lead to quantitative parameter imaging for monitoring and control of HIFU treatments under ultrasound image guidance.
Educational Objectives:
1. Understand the origin of the temperature dependence of relevant acoustic parameters and their effects on observed pulse‐echo data.
2. Understand the limitations of noninvasive temperature estimation using pulse‐echo data.
3. Understand the connection between the physical model for temperature evolution and robust temperature and parameter tracking in tissue media.
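The model-based tracking idea can be sketched with a scalar Kalman filter: a first-order cooling model stands in for the bioheat transfer equation, and noisy echo-shift observations are fused against it. This is a rough illustration, not the presenters' 2D filter; all constants are invented for the sketch.

```python
import numpy as np

def kalman_temp(z, a=0.98, c=1.0, q=1e-4, r=0.05):
    """Scalar Kalman filter for temperature rise T.
    State model T[k+1] = a*T[k] is a first-order cooling surrogate for the
    BHTE; observation z = c*T + noise models the apparent echo shift.
    q and r are process/measurement variances (all values illustrative)."""
    T, P = 0.0, 1.0
    estimates = []
    for zk in z:
        # predict step: propagate state and covariance through the model
        T, P = a * T, a * a * P + q
        # update step: blend in the noisy echo-shift observation
        K = P * c / (c * c * P + r)
        T = T + K * (zk - c * T)
        P = (1 - K * c) * P
        estimates.append(T)
    return np.array(estimates)
```

Because the filter trusts the physical evolution model between measurements, sudden displacement artifacts that are inconsistent with the model are down-weighted rather than passed straight into the temperature estimate.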
- Image Performance: Can It Be Predicted?
34 (2007); http://dx.doi.org/10.1118/1.2761744
In CT, PET/CT, and SPECT/CT imaging research, we are often involved in the design, development, and evaluation of instrumentation, data acquisition methods, and image reconstruction and processing techniques. Ideally, clinical trials using patient images from actual imaging systems should be used in the evaluation studies. In practice, clinical trials are difficult to perform due to the difficulties and high costs of acquiring 'good quality' clinical images and of physicians' time to read them. Most importantly, there is a lack of known 'truth' in most clinical images. In the past, although computer simulation methods allowed generation of a large number of images with known 'truth', they suffered from the limited availability of computer‐generated phantoms that realistically model human anatomy and physiological functions, and from the difficulty of generating simulated data that accurately represent data acquired from actual medical imaging systems. We have developed a 4D computer‐generated phantom based on the Visible Human data and on cardiac‐ and respiratory‐gated MRI and CT data from normal human subjects. Non‐uniform rational B‐spline (NURBS) computer graphics tools were used to allow accurate modeling of the shapes of 3D anatomical structures and generation of collections of phantoms that represent variations in anatomical structures and physiological functions found in different patient populations. Also, analytical and Monte Carlo simulation methods have been developed to provide data that accurately model the imaging system geometry and photon interactions in the phantom and within the imaging system. Applications of the simulation tools to CT, PET/CT, and SPECT/CT imaging research will be demonstrated with sample research projects.
They demonstrate the potential utility of the simulation tools in a wide variety of applications in the research and development of instrumentation, image reconstruction and processing methods of different medical imaging modalities in the years to come.
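The Monte Carlo principle behind such simulation tools can be shown in miniature: free path lengths to a photon's first interaction are exponentially distributed, so the fraction of photons crossing a uniform slab without interacting converges to the Beer-Lambert value. The attenuation coefficient and slab thickness below are arbitrary; real simulators track energy-dependent cross sections, scatter, and detector response.

```python
import numpy as np

def transmitted_fraction(mu, depth_cm, n=200_000, seed=0):
    """Narrow-beam Monte Carlo transmission estimate.
    Sample each photon's distance to first interaction (exponential with
    mean free path 1/mu) and count photons that traverse the slab without
    interacting. Scattered photons are not followed (narrow-beam model)."""
    rng = np.random.default_rng(seed)
    first_interaction = rng.exponential(1.0 / mu, size=n)
    return np.mean(first_interaction > depth_cm)
```

For mu = 0.2 cm^-1 and a 5 cm slab, mu*d = 1, so the estimate approaches exp(-1) ~ 0.368, matching the analytical Beer-Lambert attenuation law.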
34 (2007); http://dx.doi.org/10.1118/1.2761745
Annual screening with mammography is the best known method for early detection of breast carcinoma and is known to reduce breast cancer mortality by approximately 25%. Nonetheless, the diagnostic accuracy of mammography is not perfect. Studies have shown that 30% of cancers are not detected, and 70–90% of biopsies recommended based on mammographic studies turn out to be negative. One of the limiting problems with mammography is that the recorded 2D image represents the superposition of the 3D breast; thus normal anatomical breast structure can combine with useful diagnostic information in such a way as to impede visualization of breast tumors. One technique for improving visualization of breast tissue is tomographic imaging of the breast. There has been much recent interest in tomographic breast imaging methods such as tomosynthesis and computed tomography (CT). Tomographic breast imaging systems are complex imaging devices, and there are a number of system and acquisition parameters that should be evaluated for optimal performance. One powerful approach for optimizing and evaluating such systems is to use computer simulation models.
This presentation will discuss a computer simulation methodology developed to model tomographic breast imaging modalities using a cesium iodide (CsI)‐based amorphous silicon flat‐panel detector. The simulation is divided into three stages: 1) modeling the x‐ray spectra typically used for each modality and scaling the x‐ray fluence to provide the appropriate radiation dose, 2) determining the x‐ray transmission through the breast model, and 3) modeling the signal and noise propagation through the CsI‐based detector. Another important aspect that will be discussed is the modeling of 3D breast structure and breast tumors. Examples showing how this computer simulation can be used to evaluate tomographic breast imaging systems will be presented.
1. To understand the issues involved in developing computer models for tomographic flat‐panel breast imaging systems.
2. To understand the issues involved in developing 3D breast tissue and tumor models.
3. To learn how computer simulation models can be used in exploring tomographic breast imaging systems.
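The three simulation stages described above can be sketched end to end: a toy spectrum, Beer-Lambert transmission through a two-material breast model, and Poisson quantum noise at a detector pixel. Every number here is an invented placeholder, not a measured spectrum or attenuation coefficient, and the real methodology additionally models detector blur and gain.

```python
import numpy as np

# Stage 1: toy polyenergetic spectrum (energy bins and fluences are placeholders)
energies = np.array([20.0, 25.0, 30.0])   # keV bins
fluence = np.array([1e5, 2e5, 1e5])       # photons per bin per pixel
mu_gland = np.array([0.80, 0.55, 0.40])   # 1/cm, placeholder glandular tissue
mu_tumor = np.array([0.90, 0.62, 0.45])   # 1/cm, placeholder tumor tissue

def expected_counts(breast_cm, tumor_cm):
    """Stage 2: Beer-Lambert transmission through gland plus tumor, per bin."""
    gland_cm = breast_cm - tumor_cm
    return fluence * np.exp(-mu_gland * gland_cm - mu_tumor * tumor_cm)

# Stage 3: quantum (Poisson) noise in the detected counts at one pixel
rng = np.random.default_rng(2)
noisy_pixel = rng.poisson(expected_counts(4.0, 0.0))
```

Replacing 0.5 cm of gland with tumor lowers the expected counts, so comparing pixels with and without the lesion against the Poisson noise level is exactly the kind of detectability question such simulations are used to answer.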
34 (2007); http://dx.doi.org/10.1118/1.2761746
Purpose: Cardiovascular disease is considered the leading cause of death in the US, accounting for 38% of all deaths, with an estimated direct and indirect cost of almost $400 billion. Coronary artery disease is principally identified and diagnosed using contrast angiography and cardiac CT imaging acquisitions. The success of the detection task depends on the physicians' capability to diagnose the presence of this disease and also relies very much on the image quality. The quality of the imaging modalities is also a very important component in coronary artery disease treatment planning, follow‐up, and image‐guided cardiac and vascular interventions, as it directs and influences the physicians' options and actions. Our goal is to examine those imaging modalities and develop recommendations for patient‐ and case‐specific imaging protocols. Methods and Materials: We developed a virtual catheterization laboratory which includes an imaging system simulator, patient models, and a virtual radiologist. The imaging system simulators use Monte Carlo techniques for x‐ray and particle transport and detection. The patient models are male and female anthropomorphic phantoms, which include detailed anatomical descriptions of each organ, including high‐resolution hearts with realistic statistical models of coronary artery pathology. The virtual radiologist uses mathematical and computer models to simulate human detection performance. By simulating multiple images we can test system parameters such as geometry, resolution, scatter, and beam quality to reduce the x‐ray dose and iodine contrast quantities administered to the patient. Results: We developed a realistic x‐ray imaging simulation suite to evaluate and optimize coronary artery disease diagnosis and treatment.
Conclusions: The ultimate goal of developing improved, personalized imaging acquisition protocols using these modeling tools is to reduce patient mortality and improve treatment outcomes related to cardiovascular disease.
- Imaging for Therapy Assessment
34 (2007); http://dx.doi.org/10.1118/1.2761550
The most commonly used method to assess treatment response relies on measuring tumor sizes before and after treatment and classifying tumor anatomical shrinkage according to RECIST or WHO criteria. However, there is considerable variability between individual studies, and the same response rate can be associated with completely different survival rates. Furthermore, it is known that changes in tumor biological function significantly precede gross anatomical tumor changes. Positron emission tomography (PET) is the most sensitive, specific, and versatile imaging modality that can be used for this purpose.
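The size-based RECIST classification mentioned above reduces to simple thresholds on the change in the sum of target-lesion longest diameters. The sketch below uses the standard 30%-decrease and 20%-increase cut-points; it omits RECIST's additional rules (new lesions, nadir reference, minimum absolute growth), so it is a simplified illustration rather than a complete implementation of the criteria.

```python
def recist_response(baseline_mm, followup_mm):
    """Classify response from sums of target-lesion longest diameters (mm).
    Simplified RECIST thresholds: PR at >=30% decrease, PD at >=20% increase."""
    if followup_mm == 0:
        return "CR"   # complete response: all target lesions resolved
    change = (followup_mm - baseline_mm) / baseline_mm
    if change <= -0.30:
        return "PR"   # partial response
    if change >= 0.20:
        return "PD"   # progressive disease
    return "SD"       # stable disease
```

The coarseness of these anatomical categories, and the lag between cell kill and measurable shrinkage, is precisely the motivation for the functional PET endpoints discussed in this symposium.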
¹⁸F‐Fluorodeoxyglucose (FDG) is the most commonly used PET imaging agent for treatment assessment. FDG shows regions of active glucose metabolism, where uptake typically decreases after tumor cells die in response to antineoplastic therapies. Besides FDG, several other PET agents exist that are more specific in cell targeting and can image different aspects of the biological response to therapy. Cell proliferation, apoptosis, and angiogenesis are processes that are typically affected by antineoplastic therapies. Currently, the most promising non‐FDG agent is ¹⁸F‐fluorodeoxythymidine (FLT), a marker of cell proliferation. PET imaging agents for apoptosis and angiogenesis are still mostly limited to preclinical studies. Use of PET imaging for treatment assessment imposes special requirements on image acquisition, reconstruction, and analysis.
FDG‐PET has shown great promise for assessing treatment efficacy, both after and during the course of therapy. Depending on the metabolic response, it is possible to classify patients into metabolic responders, who typically have much longer survival than metabolic non‐responders. Unfortunately, FDG is not without problems. Two of the most severe are (1) radiation‐induced inflammation during radiation therapy and (2) the metabolic flare that occurs early after the start of some chemotherapies. FLT‐PET, which assesses proliferative response, seems to overcome these problems; however, more clinical studies are needed to prove its wider applicability. Reproducible and accurate PET image acquisition, reconstruction, and analysis are the key components required for quantitative PET imaging, which provides the foundation for treatment assessment.
This symposium reviews the status of treatment assessment studies that involve repeat PET imaging during and after therapy. It discusses the advantages and disadvantages of FDG and non‐FDG PET imaging for treatment assessment. It also emphasizes the importance of appropriate PET image acquisition, reconstruction, and analysis, which forms the basis for PET image quantification.
1. Overview of FDG‐PET imaging for treatment assessment.
2. Overview of non‐FDG‐PET imaging for treatment assessment.
3. Review PET image acquisition, reconstruction, and analysis for treatment assessment.
4. Discuss the future of PET imaging for treatment assessment.