Volume 34, Issue 6, June 2007
Index of content:
- Imaging Continuing Education Course: Room L100F
CE‐Imaging: The Physics and Technology of Radiography — I
34(2007); http://dx.doi.org/10.1118/1.2761188
Radiation Doses in DR: Part 1 — Status report on AAPM TG116: A Recommended Standard Detector Exposure Index in Digital Radiography.
This presentation will briefly cover image detector exposure indices in digital radiography. The best way to understand what these indices tell us about image quality is to understand how they are related to image noise in digital radiography and the appropriateness of the technique factors used to make an exposure and create an image. The phenomenon of “Exposure creep” will be reviewed and an update will be provided on the AAPM's and the International Electrotechnical Commission's efforts to improve the tools available to cope with it.
1. Understand how over‐ and under‐exposures are manifested in digital radiography.
2. Recognize the advantages and disadvantages of wide exposure latitude in digital radiography and how it may impact patient dose.
3. Be able to explain the root cause of “exposure creep” and identify clinical methods to reduce it.
4. Understand current detector exposure indices in use today, how they are related to image quality, and how they are defined by various manufacturers.
5. Recognize the problem caused by having multiple index definitions in clinical use.
6. Be familiar with the exposure indices being proposed by the AAPM (TG116) and the IEC (WG43).
7. Understand why it is important to allow the exposure index to change with corrections applied to an incorrectly identified VOI by the technologist after the image is acquired.
8. Be familiar with the concept of a “Relative” exposure index or “Deviation Index”.
9. Understand the importance of maintaining exposure index logs and tracking them over time.
10. Understand the importance of establishing clear rules for repeats for the technologists and what the AAPM (TG116) will recommend for them.
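The "relative" exposure index of objective 8, the Deviation Index, has a simple closed form in the TG116 / IEC proposal: DI = 10 log10(EI / EI_T), where EI_T is the target exposure index for the examination. A minimal Python sketch (the function name and the example target value of 400 are illustrative assumptions, not part of the standard):

```python
import math

def deviation_index(ei, ei_target):
    """Deviation Index per the AAPM TG116 / IEC proposal:
    DI = 10 * log10(EI / EI_T).  DI = 0 means the detector exposure
    matched the target; each +/-1 step corresponds to roughly a 26%
    change in exposure."""
    return 10.0 * math.log10(ei / ei_target)

# An exposure exactly at target gives DI = 0;
# twice the target exposure gives DI of about +3.
print(deviation_index(400, 400))
print(round(deviation_index(800, 400), 1))
```

A running log of DI values, rather than raw vendor-specific indices, is what makes trends such as exposure creep visible across systems.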
Radiation Doses in DR: Part 2 — Towards Optimizing CR/DR Techniques.
Unlike their film/screen forerunners, CR/DR systems provide radiology with the potential of optimizing receptor exposures—and therefore patient exposures—on an exam‐specific basis. A necessary step towards this goal is assessing and monitoring the receptor exposure levels currently being used. Issues associated with this task, including the current lack of CR/DR exposure index standards and imprecise and/or incorrect CR/DR calibration by service personnel, will be reviewed.
Practical problems encountered in measuring and comparing receptor exposures among different systems will be discussed, along with suggestions for medical physicists working in this area.
Upon completion of this presentation, the attendee will:
1. Understand that optimum doses for CR and DR may be customized to specific examinations and clinical operations.
2. Understand the importance of monitoring current practice to the development of optimum radiographic techniques.
3. Understand the importance of exposure indices in stabilizing clinical techniques.
4. Recognize some problems encountered in current clinical practice.
5. Understand some of the issues related to measuring and comparing receptor exposures on different systems.
6. Understand some ideas being developed to optimize CR/DR techniques.
CE‐Imaging: The Physics and Technology of Fluoroscopy — I
MO‐B‐L100F‐01: Technical Advances of Fluoroscopy with Special Interests in Automatic Dose Rate Control Logic of Cardiovascular Angiography Systems
34(2007); http://dx.doi.org/10.1118/1.2761200
In the past decade, fluoroscopic systems equipped with image intensifiers have benefited from various technical advances in x‐ray tubes, x‐ray generator design, and the application of spectral shaping filters. Because photoconductor (or phosphor plate) x‐ray detectors, signal‐capture thin‐film‐transistor (TFT) arrays, and charge‐coupled device (CCD) cameras are analog in nature, it was not until the advent of the flat panel image receptor (FPIR) that fluoroscopy procedures became a totally digital process throughout the entire imaging chain.
However, irrespective of the fluoroscopic image receptor, i.e., whether it is an image intensifier (I.I.) or a flat panel image receptor (FPIR), and whether the image signals are "analog" or "digital" in nature, the entire imaging system is under the command of the automatic dose rate control (ADRC) system. In fact, the ADRC is the command center of both the fluoroscopic operation (Fluoroscopy Mode) and the image acquisition operation (Acquisition Mode). For the present‐day ADRC to function properly, three relatively important technical innovations needed to be in place: (1) the high heat capacity x‐ray tube, (2) the medium‐frequency inverter‐type generator with high‐performance switching capability, and (3) the patient dose reduction spectral shaping filter. These three underlying technologies were tied together through the ADRC logic, installed first on cardiovascular angiography systems, so that patients undergoing cardiovascular angiography procedures can benefit from lower patient dose and high image quality with optimal contrast.
At the center of the ADRC logic are the "fluoroscopy curves", which determine the exact behavior of the fluoroscopy imaging chain. Examples of how the fluoroscopy curves can be reconstructed from an experimental arrangement and the data sets obtained from it will be demonstrated. It will be shown that a given cardiovascular imaging system functions and responds as a totally different imaging system depending on which fluoroscopy curve is loaded as the default system curve.
In this presentation, the focus is on the "fluoroscopy mode" of the ADRC, and advances in ADRC design are reviewed, starting with the early models of fluoroscopy equipped with automatic brightness control (ABC) and automatic gain control (AGC) and leading to the present‐day ADRC design. Thus, the main thrust of this presentation is to reveal and evaluate the details of the ADRC logic from the points of view of equipment operation and the fluoroscopist, and to show how the ADRC can be evaluated to understand how the challenges of various clinical applications are met with different fluoroscopy curves.
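As a rough illustration of the closed‐loop idea described above, the sketch below adjusts kVp and mA toward a detector dose‐rate setpoint along a hypothetical "fluoroscopy curve". The function, its curve_slope parameter, and the operating limits are invented for illustration only; actual ADRC logic is proprietary and vendor‐specific:

```python
def adrc_step(detector_signal, setpoint, kvp, ma, curve_slope=0.5,
              kvp_limits=(60.0, 120.0), ma_limits=(0.5, 10.0)):
    """One illustrative control step: raise kVp and mA together along a
    hypothetical fluoroscopy curve when the detector signal falls below
    the setpoint, and lower them when it is above.  The curve_slope sets
    the fixed ratio in which kVp moves relative to mA, which is what a
    loaded 'fluoroscopy curve' effectively encodes.  Not any vendor's
    actual logic."""
    error = (setpoint - detector_signal) / setpoint
    # Clamp both technique factors to the tube/generator operating range.
    new_kvp = min(max(kvp * (1 + curve_slope * error), kvp_limits[0]), kvp_limits[1])
    new_ma = min(max(ma * (1 + error), ma_limits[0]), ma_limits[1])
    return new_kvp, new_ma

# Thick patient: detector signal at 60% of setpoint, so both factors rise.
kvp, ma = adrc_step(detector_signal=0.6, setpoint=1.0, kvp=80.0, ma=2.0)
```

Swapping in a different curve_slope changes how the same error is split between kVp (contrast, dose) and mA (noise), which is why the same hardware behaves like a different system under a different default curve.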
1. To understand the basic ABC and ADRC system design.
2. To understand why the present day ADRC is able to reduce patient air kerma and yet still provide "high" contrast images.
3. To understand the importance of “fluoroscopy curves” in the application of clinical procedures.
CE‐Imaging: The Physics and Technology of Radiography — II
34(2007); http://dx.doi.org/10.1118/1.2761301
Diagnostic medical physicists have traditionally played a key role in the radiology department's Quality Assurance (QA) program. Procedures for QA of digital radiography (DR) systems are less well‐established than those designed for conventional screen‐film radiography. Some methods that were appropriate for conventional systems do not yield accurate results for digital systems. Methods that rely on images on film cannot be performed in a facility devoid of chemical processing. Methods that do not consider effects of all stages of the digital imaging chain from acquisition to display are likely to provide erroneous results. Digital systems often do not provide data in a convenient form for assessing performance. Nevertheless, medical physicists are responsible for designing and directing efforts to assure the safety and efficacy of DR.
Development of standards for performance verification of digital radiographic systems has been outpaced by commercialization of the technology. Practitioners, operators, vendors, and medical physicists are uncertain as to how to configure, calibrate, and verify systems for the best diagnostic quality images at the lowest practical radiation exposure to the patient. Several manufacturers have fielded commercial systems for performance verification, but long‐term data indicating trends and action limits are rare. Sources of definitive information on practical clinical methods are also scarce in the scientific literature. This has led to the current situation where clinical imaging operations are routinely conducted using systems that are not optimized.
The good news is that methods in common practice for conventional radiography can be applied with modification to digital radiography. Some of these methods are described in recent literature. The AAPM has recognized the dilemma and has initiated an effort to provide practical guidance to clinical physicists, i.e. Task Group 150.
This lecture will discuss some components of a QA program for DR, including adaptation of conventional QA processes. The lecture will discuss methods of evaluating system performance and provide examples of potential interferences. The lecture will review commercial devices for performance evaluation and their limitations. Finally, some ongoing standardization efforts will be described.
1. Review components of a QA program and show how they apply to DR.
2. Understand how some conventional tests should be modified for a digital radiographic system integrated into an electronic image management system.
3. Identify key references and standards that can be useful in QA of DR.
CE‐Imaging: The Physics and Technology of Fluoroscopy — II
34(2007); http://dx.doi.org/10.1118/1.2761314
Fluoroscopically guided interventional procedures continue to grow both in numbers and in complexity. When individual procedures are clinically justified, their performance should be optimized to minimize the overall (radiogenic and non‐radiogenic) risks to both patients and staff. Medical physicists' contributions to this objective are in proportion to their knowledge and understanding of the interventional environment.
This lecture will start with an overview of staff radiation safety. It will then emphasize patient radiation management, including the outline of an appropriate clinical program. It will conclude with a brief discussion on QA topics pertinent to interventional fluoroscopy.
1. Understand the range of risks and benefits associated with interventional fluoroscopy.
2. Establish appropriate staff and patient radiation management programs.
3. Improve the routine physics QA process for interventional fluoroscopes.
CE‐Imaging: The Physics and Technology of Radiography — III
34(2007); http://dx.doi.org/10.1118/1.2761463
In digital radiography, enhancement processing is used to transform 'For Processing' images to 'For Presentation' images that are intended for viewing. The processes used have become an essential element of image quality. They include exposure recognition, grayscale rendition, edge restoration, noise reduction, and broad area equalization. The numeric methods used will be reviewed and related to current commercial image processing solutions.
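One of the enhancement steps named above, edge restoration, is commonly implemented as unsharp masking: subtract a blurred copy from the signal and add the amplified detail back. A minimal 1‑D sketch in pure Python (illustrative only; the kernel and strength values are assumptions, not any vendor's algorithm):

```python
def unsharp_mask(signal, strength=1.0, kernel=(0.25, 0.5, 0.25)):
    """Illustrative 1-D unsharp masking: blur the signal with a small
    smoothing kernel, then add back an amplified version of the detail
    (signal minus blur).  Edges are replicated at the boundaries."""
    n = len(signal)
    blurred = []
    for i in range(n):
        left = signal[max(i - 1, 0)]
        right = signal[min(i + 1, n - 1)]
        blurred.append(kernel[0] * left + kernel[1] * signal[i] + kernel[2] * right)
    # Detail amplification: values near an edge overshoot/undershoot,
    # which is what makes the edge look sharper after rendering.
    return [s + strength * (s - b) for s, b in zip(signal, blurred)]

# A step edge gains overshoot on the high side and undershoot on the low side.
edge = [0, 0, 0, 1, 1, 1]
sharpened = unsharp_mask(edge, strength=2.0)
```

In commercial systems the same idea is typically applied per frequency band in 2‑D, with the strength varied by band to trade edge enhancement against noise amplification.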
Often, the appearance of processed digital radiographs is adjusted to best depict the structures of particular body parts. The appearance desired at a particular medical center typically results from interaction between the equipment supplier, medical physicist, and radiologists. Differences associated with thoracic, abdominal, skeletal, and breast imaging will be illustrated.
1. Understand the sequence of steps used to process digital radiographs for presentation on a workstation.
2. Learn how certain commercial systems implement processing.
3. Understand how enhancement processing can be adjusted to achieve the appearance desired for different body parts.
CE‐Imaging: Multimodality Medical Imaging — I
34(2007); http://dx.doi.org/10.1118/1.2761479
The most common 3D visualization applications in clinical diagnostic radiology now are cardiac and colon, but the technology is often used in vascular, orthopedic, craniofacial deformity, thoracic, and many other areas. In the past, dedicated workstations and highly skilled operators were required to produce CT or MR angiograms, body region or organ surface views, and analysis products. Many centers have dedicated technologists who work in a "3D lab" to extract vessel trees, disarticulate limbs, and synthesize transparencies and cine sequences. These services are now reimbursed by third‐party payors, and almost all CT and MRI scanners today have integrated post‐processing tools. Since generation of advanced visualizations is not a good use of scanner console time if the process cannot be accomplished unattended in a short time, new enterprise systems have been introduced that allow thin clients located anywhere to employ software tools and generate custom views interactively. In the surgeon's office, at the oncology clinic, or even in the operating room, it is now feasible to manipulate 3D image data sets. Very specialized tools for white matter tractography, MR spectroscopic imaging, functional MRI, and myocardial perfusion are examples of new software agents tailored to subspecialty requirements. Several vendors provide post‐processing services for prosthesis or implant custom sizing and design, based on 3D image data sets, where the advanced visualization images are quantified and synthesized in their facilities.
4D imaging and higher dimensions are increasingly common, since almost all cardiac CT and MRI examinations require them. Whole‐organ and body‐region perfusion, in the brain and elsewhere, employs 4D methods. Multimodality and multitemporal data sets, as well as multispectral (e.g., dual‐energy 3D) data, are acquired and analyzed with software tools that focus on specific clinical issues in neuroradiology and cardiothoracic practice.
The most important current trend, then, is the routine collection of 3D and 4D data sets that are intended for more than simple subjective image review: they are destined to be post‐processed and analyzed with enterprise software tools on thin clients, so that immediate results tailored to subspecialty needs can be obtained.
1. To understand the sources of 3D and 4D images in clinical diagnostic radiology, including the data acquisition systems and protocols.
2. To learn how advanced visualization and image analysis are evolving from dedicated workstations to enterprise software applications.
3. To see how subspecialized software tools for highly specific imaging applications are used in clinical radiology.
4. To explain how 4D data sets are used for cardiac and neuroimaging, where the image analysis results are often quantitative rather than morphologic.
34(2007); http://dx.doi.org/10.1118/1.2761480
Medical imaging technologies that combine the capabilities of two or more imaging modalities are becoming increasingly important in the diagnosis, staging, treatment, and monitoring of disease. In particular, the combination of morphological imaging modalities (e.g., x‐ray, CT, ultrasound, and MR) with functional or molecular imaging modalities (e.g., optical, PET, SPECT, and functional MR) offers synergistic advances. This session provides an overview of technical and physical aspects of such developments, reviews the most prevalent technologies under consideration, addresses the challenges and limitations of such technologies, and discusses the opportunities for future research in multi‐modality imaging. Example multi‐modality approaches include x‐ray / ultrasound, CT / PET, MR / PET, and optical / CT. Applications ranging from pre‐clinical imaging to tumor staging and treatment response monitoring are addressed.
CE‐Imaging: The Physics and Technology of Radiography — IV
34(2007); http://dx.doi.org/10.1118/1.2761608
Until recently, advanced image quality metrics such as the frequency‐dependent detective quantum efficiency, DQE(f), have been employed by a relatively small cadre of expert imaging scientists to compare the performance of digital radiography systems, often using differing methods. To address the potential for differences in measured DQE(f) arising from differences in test methodology, in 2003 the International Electrotechnical Commission (IEC) published the first in a series of performance standards to define techniques for measuring the DQE(f) and its constituent metrics, the modulation transfer function, MTF(f), and noise power spectrum, NPS(f). More recently, the increasing prevalence of digital radiographic systems in the clinical environment and the availability of both commercial and public‐domain resources have made DQE(f) evaluations both more accessible and increasingly relevant to the clinical medical physicist. This course will review the measurement and analysis methods for DQE evaluations of both general digital radiography and digital mammography systems, using the relevant IEC standard method as a guide.
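The three metrics named above are linked by a standard relation. For a linear, shift‐invariant detector with mean incident photon fluence $\bar{q}$ per unit area, a textbook form (paraphrased, not quoted from the IEC standard) is:

```latex
\mathrm{DQE}(f) \;=\; \frac{\mathrm{SNR}^{2}_{\mathrm{out}}(f)}{\mathrm{SNR}^{2}_{\mathrm{in}}(f)}
\;=\; \frac{\mathrm{MTF}^{2}(f)}{\bar{q}\,\mathrm{NNPS}(f)},
\qquad
\mathrm{NNPS}(f) \;=\; \frac{\mathrm{NPS}(f)}{(\text{large-area mean signal})^{2}}
```

This is why the measurement procedure reduces to determining MTF(f) from an edge or slit image, NPS(f) from uniform exposures, and $\bar{q}$ from the measured air kerma and a tabulated fluence per unit kerma for the beam quality.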
1. Review the definition of the DQE(f) and its constituent metrics, the MTF(f) and NPS(f).
2. Gain an understanding of the IEC standard method for measuring detective quantum efficiency, for both general digital radiography and digital mammography imaging systems.
3. Learn which commercial and public‐domain resources are available for measuring the DQE(f) of digital radiographic imaging systems.
CE‐Imaging: Multimodality Medical Imaging — II
34(2007); http://dx.doi.org/10.1118/1.2761624
In oncology, imaging studies play an increasingly important role in assessing patients' response to treatment. Serial CT scans of a patient are evaluated for changes in the number and size of tumors, while serial PET scans are assessed for changes in the metabolic activity of the lesions. The advent of combined CT and PET systems streamlines the fusion of these anatomic and functional images. However, there are many confounding effects that complicate quantification in 3D PET/CT imaging.
This lecture will review these sources of variability in the data, grouped into three broad categories: patient‐related factors (e.g. dose, uptake time before imaging, blood glucose level, body habitus); instrument‐related factors (spatial and energy resolution, sensitivity, data acquisition mode (2D or 3D), attenuation and scatter correction method, image reconstruction algorithm, respiratory and cardiac motion‐correction method); and operator‐related factors such as acquisition and reconstruction protocols, instrument and image quality control, instrument calibrations, and method of image analysis. The impact of these variables on quantification will be discussed, with particular emphasis on how to minimize those that are controllable.
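A common endpoint affected by all three categories of variability above is the body‐weight standardized uptake value (SUV): the measured activity concentration divided by the injected activity per gram of tissue. A minimal sketch (the unit choices and the usual assumption of about 1 g/mL tissue density are simplifications; decay correction of both quantities to a common time point is assumed):

```python
def suv_bw(activity_conc_kbq_ml, injected_kbq, body_weight_g):
    """Body-weight SUV: measured activity concentration (kBq/mL)
    divided by injected activity (kBq) per gram of body weight.
    Assumes ~1 g/mL tissue density and decay-corrected inputs."""
    return activity_conc_kbq_ml / (injected_kbq / body_weight_g)

# Example: 5 kBq/mL in a lesion, 370,000 kBq (10 mCi) injected, 70 kg patient.
suv = suv_bw(5.0, 370_000.0, 70_000.0)
```

Because the numerator depends on scanner calibration, reconstruction, and uptake time while the denominator depends on dose assay and patient weight, every factor listed in the lecture propagates directly into this single number, which is why protocol standardization matters for serial comparisons.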
The presentation will conclude with a review of current efforts by government agencies, professional organizations, academic institutions, and sponsors of multicenter trials to grapple with the additional complexities that arise from combining data from multiple patients at multiple sites. Research partially supported by the American College of Radiology Imaging Network (ACRIN).
1. Understand the scope of the problems inherent in using PET/CT imaging to detect changes in patient response to treatment.
2. Understand the factors that affect the variability and accuracy of PET/CT quantification.
3. Understand the importance of minimizing variability in order to enhance the ability to identify non‐responders to treatment over ever‐shorter time intervals.
4. Understand the efforts currently underway to standardize acquisition and processing protocols and to monitor equipment performance through credentialing and periodic quality control checks.