Index of content:
Volume 33, Issue 6, June 2006
- Imaging Symposium: Room 330 D
- Advances in Breast Imaging
33(2006); http://dx.doi.org/10.1118/1.2241503
Despite the development and advances of new modalities, x‐ray mammography remains the main screening tool for early detection of breast cancers. It also continues to play an important role in the diagnosis and management of breast cancers. X‐ray mammography has relied on the use of high resolution screen/film combinations. Such techniques, although improved over the years, have the drawbacks of all the inconveniences and inflexibility associated with the use of film for image acquisition, storage and display. Motivated by the search for better image quality and a general move towards filmless radiology, various digital mammography techniques have been developed, investigated and commercialized. These techniques can be largely divided into four types: amorphous silicon and selenium flat panel detector, amorphous silicon and cesium iodide flat panel detector, CCD and cesium iodide based slot scanning system, and storage phosphor imaging technique with dual‐side image read out. Each of these systems has its unique advantages and disadvantages and has achieved varying degrees of clinical implementation. Typical of all digital mammography techniques is the potential compatibility with digital image archival, retrieval and distribution systems. Furthermore, the acquisition of breast images in digital format has begun to facilitate the development and investigation of many advanced imaging techniques. Among them, dual‐energy mammography techniques have been developed to quantify breast tissue composition or to separate calcifications from the overlying tissue structures. Stereo‐mammography has been developed to use two projection views to provide a 3‐D perspective of the breast tissue structures, thus reducing the problem of overlapping structures. Tomosynthesis imaging pushes the idea of 3‐D imaging further by acquiring 10–25 projection views and using them to synthesize images that depict the breast structures as a number of thick layers.
More recently, cone beam breast CT has been developed and investigated to scan the breast in a dedicated manner and provide true 3‐D images of the breast. Along a different direction, the development and investigation of various contrast mammography techniques have allowed x‐ray imaging to be used to image and study breast vasculature as a possible indicator of breast cancer.
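The tomosynthesis idea described above can be illustrated with a minimal shift‐and‐add sketch on toy data (commercial systems use filtered backprojection or iterative reconstruction; the array sizes and shifts here are illustrative assumptions only). Shifting each projection by a plane‐dependent amount brings structures in that plane into focus while blurring the rest:

```python
import numpy as np

def shift_and_add(projections, shifts_px):
    """Reconstruct one tomosynthesis plane by shifting each
    projection (integer pixel shift along the scan direction)
    and averaging. `shifts_px` holds one shift per projection,
    proportional to the source displacement and plane height."""
    plane = np.zeros_like(projections[0], dtype=float)
    for proj, s in zip(projections, shifts_px):
        plane += np.roll(proj, s, axis=1)
    return plane / len(projections)

# Toy example: a point object whose projected position moves
# linearly with view angle comes into focus only when the
# matching plane-specific shifts are applied.
n_views = 15
projections = [np.zeros((8, 64)) for _ in range(n_views)]
for v, proj in enumerate(projections):
    proj[4, 32 + (v - n_views // 2)] = 1.0

in_focus = shift_and_add(projections, [-(v - n_views // 2) for v in range(n_views)])
out_focus = shift_and_add(projections, [0] * n_views)
```

With the correct shifts the object's signal adds coherently at one pixel; with zero shift it is smeared across 15 columns at 1/15 the amplitude, which is exactly the out‐of‐plane blur that limits this simple method.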
In this presentation, we will try to achieve the following educational objectives:
1. Review the development and investigation of major digital mammography techniques.
2. Review the development and investigation of various advanced breast x‐ray imaging techniques.
Research sponsored by the National Cancer Institute and the National Institute of Biomedical Imaging and Bioengineering.
33(2006); http://dx.doi.org/10.1118/1.2241504
Both film‐screen and digital mammography are subject to a number of fundamental limitations related to the projection process, whereby 2D images are produced of the 3D breast anatomy. Mammography can produce artifactual densities from the superposition of normal tissues that are separated in space; although only visible in a single view, these often appear sufficiently suspicious to necessitate a biopsy, leading to a loss in specificity. Furthermore, true lesions in mammograms can be masked by superimposed normal tissue and thereby rendered undetectable; this reduces the sensitivity of mammography.
Numerous tomographic methods have been proposed to overcome these limitations, including digital breast tomosynthesis (DBT). DBT is a tomographic imaging technique in which a set of tomographic images can be reconstructed from a limited number of x‐ray projection images. DBT has the potential to mitigate both the superposition of non‐adjacent tissue (false positive densities) and the masking of real lesions (false negatives) observed in projection mammography, while also providing a simple means of localizing lesions in 3D. A preliminary retrospective study of breast tomosynthesis by Rafferty et al. has demonstrated a 16% increase in sensitivity and 85% decrease in false positives as compared to digital mammography. Our own experience at the University of Pennsylvania with 51 patients has provided supporting anecdotal evidence.
DBT also offers the potential for functional imaging. Breast tumor growth and metastasis are accompanied by the development of new blood vessels. We have used a modified GE 2000D, under IRB approval, to gain initial experience in contrast‐enhanced DBT (CE‐DBT). To date we have acquired 13 CE‐DBT clinical cases. Suspicious enhancing lesions were demonstrated with CE‐DBT in 10 of 11 cases of pathology‐proven breast cancer. The cases illustrated that CE‐DBT could provide information in concordance with multimodality imaging evaluation. The pre‐contrast tomosynthesis images demonstrated lesion morphology and border characteristics in greater detail than the digital mammography images, and the CE‐DBT data sets demonstrated vascular characteristics of the breast lesions of interest that were consistent with the vascular information provided by MR. In addition, quantitative evaluation of contrast uptake is anticipated to be more easily standardized with CE‐DBT than with MRI because of the linear relationship between attenuation and contrast concentration.
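The linearity noted above is what makes x‐ray contrast quantification straightforward in principle: the added attenuation from iodine is proportional to its concentration. A minimal sketch (the attenuation coefficient and path length below are purely hypothetical values, not from the study):

```python
import numpy as np

# Hypothetical calibration constants (illustrative only):
mu_per_mg = 0.002   # assumed attenuation per unit iodine concentration, mm^-1 per mg/ml
path_mm = 50.0      # assumed x-ray path length through the lesion region, mm

def iodine_concentration(I_pre, I_post):
    """Because x-ray attenuation is linear in contrast agent
    concentration, the log-subtracted pre/post-contrast signal
    divided by (mu_per_mg * path) estimates iodine in mg/ml."""
    delta_mu_x = np.log(I_pre) - np.log(I_post)   # added attenuation, dimensionless
    return delta_mu_x / (mu_per_mg * path_mm)

# A region that transmits 10% less intensity after contrast:
c = iodine_concentration(I_pre=1000.0, I_post=900.0)
```

The analogous MRI calculation requires a nonlinear signal model, which is why the abstract anticipates easier standardization with CE‐DBT.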
In this presentation, the following education objectives will be addressed:
1. Review the development and design of digital breast tomosynthesis systems.
2. Evaluate the results of the existing DBT clinical trials.
3. Examine advanced applications of DBT including contrast‐enhanced DBT and CAD.
4. Compare DBT to other x‐ray tomographic imaging modalities.
TU‐C‐330D‐03: Computed Tomography of the Breast: Design, Fabrication, Characterization, and Initial Clinical Testing
33(2006); http://dx.doi.org/10.1118/1.2241505
Purpose: Although there is overwhelming evidence that mammographic screening has led to a reduction in breast cancer mortality, most breast imaging experts agree that screening technology could be improved. We have developed a dedicated breast CT scanner which may be appropriate for breast cancer screening in some groups of women. Methods and Materials: A breast CT scanner was designed and fabricated in our laboratory using off‐the‐shelf components (an x‐ray system, flat panel detector, and motor) and custom‐manufactured parts. Image quality was assessed subjectively in terms of artifacts, and by conventional metrics (MTF for spatial resolution, RMS noise, etc.). Radiation dose levels were adjusted to be comparable to two‐view mammography, using a series of physical measurements and Monte Carlo computations. Evaluation in patients has begun with both Phase I and Phase II clinical trials. Results: The spatial resolution of the bCT system exceeds that of commercial scanners, with a 10 percent MTF corresponding to XX inverse mm (center of field, 80 kVp, 500 views). Noise metrics demonstrate that the scanner performs in a quantum‐limited manner. Ten healthy volunteers have been scanned, and as of this writing 35 women with BIRADS 4 or 5 diagnoses have been scanned. Subjective evaluation of image quality clearly indicates detail not seen mammographically. The volume dataset (300 512 × 512 pixel images) can be displayed in coronal, axial, sagittal or any arbitrary view angle. Conclusions: The clinical evaluation of the bCT system is underway, and early subjective results have generated interesting images with excellent anatomical depiction. The use of contrast agents has added a functional component to breast CT imaging. Further patient accrual with subsequent quantitative (ROC) analysis is needed.
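The "10 percent MTF" figure quoted above is the spatial frequency at which the measured MTF curve falls to 0.10. A simple sketch of how that crossing is extracted from sampled MTF data (the Gaussian curve below is illustrative, not the bCT system's measured MTF):

```python
import numpy as np

def freq_at_mtf(freqs, mtf, level=0.10):
    """Linearly interpolate the spatial frequency at which a
    monotonically decreasing sampled MTF curve crosses `level`
    (e.g. the 10% MTF used as a resolution figure of merit)."""
    idx = np.argmax(mtf <= level)              # first sample at/below level
    f1, f2 = freqs[idx - 1], freqs[idx]
    m1, m2 = mtf[idx - 1], mtf[idx]
    return f1 + (m1 - level) * (f2 - f1) / (m1 - m2)

# Illustrative Gaussian-shaped MTF (assumed, not measured data):
freqs = np.linspace(0, 5, 501)                 # cycles/mm
mtf = np.exp(-(freqs / 2.0) ** 2)
f10 = freq_at_mtf(freqs, mtf)                  # ~3.03 cycles/mm for this curve
```

The same interpolation applies at any threshold (50%, 10%), which is why resolution comparisons between scanners should always state the MTF level used.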
1. Status of breast CT implementation at UC Davis.
- Biomedical Imaging Research Opportunities Workshops (BIROW): A Guiding Consortium of Imaging Societies with NIH Support
TH‐D‐330D‐01: BIROW — Biomedical Imaging Research Opportunities Workshop: Intersociety Project to Accelerate Biomedical Imaging Discovery and Application
33(2006); http://dx.doi.org/10.1118/1.2241901
The Biomedical Imaging Research Opportunities Workshops (BIROW) held over the past 4 years examined the leading edge of research opportunities in biomedical imaging, to recommend strategies to advance imaging research and thereby improve imaging applications in preventive medicine, medical diagnosis and treatment. Funded through grants from the National Institute of Biomedical Imaging and Bioengineering (NIBIB) and the Whitaker Foundation, and held each spring in Bethesda MD, the BIROW workshops were sponsored by 5 organizations: the American Association of Physicists in Medicine, Radiological Society of North America, Biomedical Engineering Society, Academy for Radiology Research, and the American Institute of Medical and Biological Engineers.
This session introduces the purpose and accomplishments of BIROW and plans for its future, followed by discussions by several leading researchers on topics of substantial current interest in biomedical imaging research: Image‐Guided Therapy (Drs. Bourland, Pelizzari, Ling and Jaffray); Small Animal Imaging Systems (Drs. Fullerton, Eckelman and Patt); Medical Imaging Technology: From Concept to Clinic (Drs. Hendee, Mulvaney, Mackie and Fenster); and Informatics and Computerized Image Analysis (Drs. Nagy, McNitt‐Gray and Chan). Participants in this session will gain insight into the types of research challenges and opportunities present in BIROW workshops through the presentation of these examples.
1:30 – 1:50
Introduction Session: Purpose of BIROW
1:50 – 3:20
Small Animal Imaging Systems
4:00 – 5:30
Medical Imaging Technology: From Concept to Clinic; Informatics and Computerized Image Analysis
- DR in Practice
33(2006); http://dx.doi.org/10.1118/1.2241751
Unlike conventional radiographic receptors (screen‐film), digital systems do not generally have a fixed speed, but respond over a wide range of receptor dose. Therefore, specifying and monitoring receptor dose is an important component of quality assurance for digital radiography. Digital radiographs are currently characterized in terms of a variety of incompatible, vendor specific speed or dose metrics, which make it difficult for users to monitor receptor dose or to inter‐compare receptor dose from different systems. The lack of a universal vendor‐independent measure of receptor dose has prompted the formation of AAPM Task Group 116, which is currently working to “Standardize an Image Receptor Dose Index for Digital Radiography”.
This talk will review the currently available vendor‐specific measures of receptor dose, describe and illustrate a set of desirable attributes that should characterize a universal receptor dose metric, and explore some of the pitfalls and opportunities that having a universal metric affords. Digital radiography offers a substantial opportunity to optimize and standardize radiographic practice well beyond what could be done with conventional imaging. These opportunities, along with some of the pitfalls that must be avoided, will be illustrated by comparing several specific proposals for measuring receptor dose applied to a large set of clinical images for which body part, projection, thickness, and technique factors (kV, mAs and SID) were documented. These alternative approaches will be compared in terms of their conformance to the desirable attributes developed at the outset.
A well‐designed universal measure of receptor dose enables a vision for digital radiography that encompasses improved patient care through optimized and consistent image quality and dose. The concepts reviewed in this presentation are expected to influence the development of international standards for receptor dose anticipated within the next few years.
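The TG‐116 effort described above did in fact feed into an international standard: IEC 62494‐1 later defined a vendor‐independent exposure index (EI) together with a deviation index (DI) relating it to a target value. A minimal sketch of the DI arithmetic (the EI values below are illustrative, not calibration data):

```python
import math

def deviation_index(ei, ei_target):
    """Deviation index in the form later standardized by
    IEC 62494-1: DI = 10 * log10(EI / EI_T). DI = 0 means the
    receptor dose is on target; each +1.0 step corresponds to
    roughly 26% more receptor dose."""
    return 10.0 * math.log10(ei / ei_target)

di_over = deviation_index(ei=500, ei_target=250)   # double the target dose -> DI ~ +3
di_on = deviation_index(ei=250, ei_target=250)     # on target -> DI = 0
```

The logarithmic form is deliberate: it turns the wide dynamic range of digital receptors into a small, symmetric number that technologists can act on at the console.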
Research sponsored by the Eastman Kodak Company.
1. Understand the translation of screen‐film speed and dose concepts into a digital environment.
2. Understand the relationship among the many measures of dose, including the vendor specific measures of receptor dose.
3. Appreciate the need for and the desirable attributes of a universal vendor‐independent measure of receptor dose.
4. Understand the use and interpretation of receptor dose in clinical practice.
33(2006); http://dx.doi.org/10.1118/1.2241752
Ideally, we would like clinical digital radiography (DR) systems to produce faithful reproductions of radiographic projections, acquired at optimum gain and rendered appropriately for diagnostic interpretation. We also expect uniform performance from identical DR systems. We would like DR systems to maintain their optimum performance indefinitely, or at least to alert us when their performance falls below reasonable limits. We expect DR systems to be user‐friendly, to accommodate operator Quality Control (QC) activities, and to enable the operator to correct errors without the need for a repeated exposure. In reality, all DR systems fall short of these ideal expectations. Awareness of causes and corrective actions for non‐ideal behavior is important for Medical Physicists in order to assist clinical practice with DR.
The talk will consider the effects of discrete detector elements, gain compensation maps, and “ghost” images on acquisition of the radiographic projection. Variation in gain and SNR with exposure will be discussed. Sources of interference with image segmentation will be described, including hardware, shielding, and the challenge of large and small patients. The influence of operator technical errors will be explored. The value of automated quality control self‐tests in assessing and maintaining optimum performance will be examined. The ability of the operator to intervene in the event of a sub‐standard image will be considered. Positive and negative aspects of operator modification will be mentioned. Discrepancies between display functions at the acquisition station and at the physician's diagnostic display will be discussed.
1. Appreciate the technological limitations on ideal behavior by DR systems.
2. Understand the clinical manifestations of non‐ideal DR behavior.
3. Identify actions that can reveal or correct non‐ideal DR behavior.
33(2006); http://dx.doi.org/10.1118/1.2241753
Purpose: To explore the impact of DR mammography on the practice of clinical medical physics. Historical and new methods for providing scheduled and unscheduled mammography physics services for facilities using DR mammography will be presented. Method and Materials: Clinical medical physics services provided to DR mammography facilities from various manufacturers will be presented. Quality control activities approved by FDA for each individual manufacturer will be compared. The status of a proposed “Alternative Standard” to allow for a more uniform approach to medical physics mammography services will be reviewed. Case studies will be presented demonstrating methods and cost considerations for providing medical physics services under the “minimum standard” (prescribed by MQSA regulations) and “best practices” models. Results: Medical physics services provided under the “minimum standard” and “best practices” models have implications for the quality and cost of mammography physics services. Conclusion: Medical physics services may be provided in a professional and valuable manner, using combinations of the minimum standard of practice and the “best practices” model. The professional medical physicist should consider multiple parameters when determining the appropriate model under which to deliver services.
- Image Science 2020: Perspectives on the Future of Imaging Physics
33(2006); http://dx.doi.org/10.1118/1.2241688
The characterization and classification of image quality in radiographic imaging is a complex and elusive goal. It involves understanding the physics of x‐ray interactions and how these influence statistical properties of image signals and noise. It requires an understanding of how observers can extract non‐random structures from random (and sometimes not random) image details. Finally, it requires an understanding of how observers are able to extract clinically meaningful information from the complicated clutter of background structural information. As processes responsible for producing image signals are often non‐linear, non‐stationary, multi‐dimensional and task dependent, simpler metrics of image quality and detector performance are at best idealized approximations.
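One of the "idealized approximations" referred to above is the detective quantum efficiency, DQE(f) = MTF(f)² / (q · NNPS(f)), valid only for a linear, shift‐invariant detector with stationary noise. A sketch of the arithmetic (all numbers below are assumed, not from any real detector):

```python
import numpy as np

def dqe(mtf, nnps, q_per_area):
    """Idealized DQE for a linear, shift-invariant detector:
    DQE(f) = MTF(f)^2 / (q * NNPS(f)), with q the incident
    photon fluence (photons/mm^2) and NNPS the normalized
    noise power spectrum (mm^2)."""
    return mtf ** 2 / (q_per_area * nnps)

# Illustrative spectra (assumed values):
freqs = np.linspace(0.1, 3.0, 30)           # cycles/mm
mtf = np.exp(-freqs / 2.0)
q = 2.5e5                                   # photons/mm^2
nnps = np.full_like(freqs, 1.0 / q)         # quantum-limited white noise
d = dqe(mtf, nnps, q)                       # reduces to MTF^2 in this ideal case
```

For perfectly quantum‐limited white noise the DQE collapses to MTF², which is exactly why such metrics are called idealized: real detectors add correlated and additive noise that breaks this simple relationship.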
Scientists have attacked this multi‐faceted problem from a number of directions. As a result, a wide spectrum of terminology and concepts have become commonplace. This talk will address some aspects of how far “image science” has come and where it may be going. It will highlight accomplishments that have become established in both academic and commercial fields, and problems that have not been solved. Hopefully, improved understanding of these issues will enable the development of better detectors and systems and improved patient outcomes.
1. Understand issues that influence image quality and observer performance for simple tasks.
2. Understand limitations of simple metrics.
3. Understand some directions being followed to better understand what determines image quality for the development of better detectors and systems.
WE‐C‐330D‐02: Image Science and CAD: In Pursuit of a Fundamental Theoretical Basis for CAD Development
33(2006); http://dx.doi.org/10.1118/1.2241689
Computer‐aided diagnosis is still a very immature field, with very little theoretical framework upon which it is based. This is a major limitation both in developing systems and in evaluating them in a meaningful way. It is clear that in the future CAD will play a greater role in radiology, both as a secondary reader and as a primary reader. The current clinical implementation of CAD is as a second reader to radiologists. This will shift first to CAD being used by a physician assistant, and then to CAD serving as the primary reader with the radiologist as the secondary reader. Ultimately, CAD will be the only reader, at least for a subset of cases.
However, to increase the development and adoption of new CAD systems, the field needs a better fundamental foundation. This foundation will come from several areas. First, as we gain a better understanding of human observers, we can use this information not only to develop more accurate CAD algorithms, but also, importantly, to design CAD systems that can be integrated into the radiologists' workflow more fully. Second, models of CAD techniques need to be developed. In analogy to modeling ideal observers and human observers, much can be gained from modeling CAD schemes. Third, a thorough understanding of the interaction between the image and the CAD technique is needed. For example, if the shape of the NEQ curve of the image receptor changes, can we predict how the performance of the CAD technique will change?
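The NEQ question posed above has a well‐known analogue for model observers: for a signal‐known‐exactly task the ideal‐observer detectability is d'² = ∫ NEQ(f) |S(f)|² df, with S(f) the task (signal) spectrum. A small numerical sketch of how a change in NEQ shape propagates to detectability (all spectra below are assumed, illustrative shapes):

```python
import numpy as np

def detectability(freqs, neq, task):
    """Ideal-observer detectability index for a signal-known-
    exactly task: d'^2 = integral of NEQ(f) * |S(f)|^2 df,
    computed here as a Riemann sum on a uniform grid."""
    df = freqs[1] - freqs[0]
    return np.sqrt(np.sum(neq * np.abs(task) ** 2) * df)

# Illustrative (assumed) spectra, not measured data:
freqs = np.linspace(0.0, 5.0, 200)          # cycles/mm
task = np.exp(-freqs ** 2)                  # small-object signal spectrum
neq_slow = 1e4 * np.exp(-freqs / 2.0)       # NEQ with slow roll-off
neq_fast = 1e4 * np.exp(-freqs / 1.0)       # faster high-frequency NEQ loss
d_slow = detectability(freqs, neq_slow, task)
d_fast = detectability(freqs, neq_fast, task)
```

Whether a trained CAD scheme tracks this ideal‐observer prediction is precisely the kind of image–algorithm interaction the talk argues needs a theory.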
Medical imaging technology is rapidly changing. The current paradigm for developing CAD systems — try different techniques on hundreds of images — cannot keep pace with these changes, especially as new imaging systems are developed, where clinical images are scarce. Our goal should be to develop the field to the stage where it will be possible to model the imaging system's characteristics and then, guided by models of human observer performance, select from an array of image processing, artificial intelligence and pattern recognition techniques the combination that will produce the optimum CAD system.
This talk will present my vision for the future of CAD and what is necessary for the field to make rapid progress.
1. Discuss future roles of CAD as both a secondary and the primary reader.
2. Explain the fundamental limitations of CAD development.
3. Discuss one possible approach to overcoming the limitations in the future.
WE‐C‐330D‐03: Bioinformatics, the Multiple‐Biomarker Classifier Problem, Complexity, and Uncertainty
33(2006); http://dx.doi.org/10.1118/1.2241690
The most celebrated landmark of modern bioinformatics has been the sequencing of the human genome. Early in the project it was commonly believed that humans have about 100,000 genes; as the project neared completion the estimates came down into the neighborhood of 25,000–30,000. Hidden Markov Models (HMM) are used to carry out statistical parsing of the “linguistics” of such bioinformation. Such massively complex analysis has been facilitated by modern developments in massively complex hardware and software — but such analysis is naturally accompanied by great uncertainties. At a lower level of complexity are the algorithms used for computer‐aided diagnosis in medical imaging, and at an intermediate level are the tools under current development for fusing multiple biomarkers — for example, from a large number of spectral lines in mass spectrometry of protein fragments in blood samples and other multiplex data from protein and gene microarrays. This talk will review the uncertainties in measured performance of such diagnostic tests as a function of the sample sizes available for training and testing, as well as the dependence on the number of fused biomarkers and the complexity of the associated statistical learning algorithm. A strategy for designing large trials based on pilot studies will be outlined.
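The finite‐testing part of the uncertainty discussed above can be sketched with the classic Hanley–McNeil standard error of an ROC AUC estimate, which shows how the measured performance of a classifier tightens as the test set grows (the AUC and sample sizes below are illustrative assumptions):

```python
import math

def auc_se(auc, n_pos, n_neg):
    """Hanley-McNeil standard error of an empirical ROC AUC,
    given the numbers of positive and negative test cases.
    Quantifies the finite-test-sample uncertainty of a
    diagnostic classifier's measured performance."""
    q1 = auc / (2.0 - auc)
    q2 = 2.0 * auc ** 2 / (1.0 + auc)
    var = (auc * (1.0 - auc)
           + (n_pos - 1) * (q1 - auc ** 2)
           + (n_neg - 1) * (q2 - auc ** 2)) / (n_pos * n_neg)
    return math.sqrt(var)

se_small = auc_se(0.85, 30, 30)     # pilot-sized test set
se_large = auc_se(0.85, 300, 300)   # ~10x larger trial
```

Scaling the test set by 10 shrinks the standard error by roughly √10, which is the quantitative basis for sizing a large trial from a pilot study as the talk proposes; training‐sample effects add further, typically larger, variability.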
1. Understand the multiple‐biomarker classifier problem.
2. Understand the uncertainties in its performance assessment due to finite training and testing.
3. Understand the dependence on number of biomarkers and complexity of statistical learning algorithm.
- Molecular Imaging I: The Physics of Molecular Imaging
33(2006); http://dx.doi.org/10.1118/1.2241412
The first day of the Molecular Imaging Symposium (MI‐1) will focus on the technology aspects of molecular, functional, and small animal imaging. Modalities which will be discussed include micro‐CT, micro‐PET, and high resolution MRI for small animal imaging. The presentation on micro‐CT technology will include an introduction to the basic requirements of the scanner hardware and examples of the images and biological applications in which micro‐CT is useful; the radiation dose to the small animal undergoing micro‐CT will also be discussed. Micro‐CT techniques require longer acquisition times than human scanners, and thus maintaining the animal in a viable but motionless state is of clear importance. Therefore, issues surrounding animal support, including anesthesia and respiratory gating, will also be presented.
Micro‐PET systems are widely used in small animal imaging for genome research, and represent probably the mainstay of truly molecular imaging modalities at this point in time. The presentation on micro‐PET will include a description of micro‐PET scanner hardware, a discussion of PET‐radiotracers, and an overview of current small animal PET systems. The limitation of current micro‐PET system design will be discussed, and ideas for overcoming some of these limitations will be presented.
High resolution MRI systems have the benefit of delivering exquisite contrast with excellent spatial resolution, with no ionizing radiation. The presentation on micro‐MRI techniques for phenotype imaging will describe the integration of physics, biology, chemistry, engineering, and computer science which is necessary to achieve state‐of‐the‐art small animal MRI. The use of hyperpolarized gases for lung imaging and MR histology will be discussed as well.
The availability of small animal imaging systems across a number of modalities has proved essential for a large number of research applications. The primary goal of MI‐1 is to help familiarize medical physicists with the technical design and capabilities of these high resolution small animal imaging systems, and to highlight research applications of their use. Different modalities are used to address different research questions, and this symposium will emphasize the strengths and weaknesses of each modality in regards to various research applications. Differences between animal imaging and human scanners will also be discussed.
- Molecular Imaging II: Clinical and Pre‐Clinical Applications
33(2006); http://dx.doi.org/10.1118/1.2241554
Day two of the Molecular Imaging Symposium (MI‐2) will focus on the applications of molecular imaging in small animals and humans. The session will begin with a discussion of a recent trans‐agency announcement that addresses molecular imaging as a biomarker for drug response (DHHS New Federal Health Initiative to Improve Cancer Therapy). Opportunities for imaging physicists to engage in the development of physical performance standards for dual modality imaging platforms (anatomical and molecular imaging) during the course of therapy treatment will be discussed. Similarly, the development of standardized methods to evaluate change analysis tools will be addressed. A case for creation of a new AAPM task group to address this topic will be presented.
The second lecture will review the clinical research use of existing contrast agents in dynamic contrast MRI for early assessment of therapy‐induced microvascular changes; pre‐clinical use of novel high molecular weight and/or targeted or enzymatically activated MR contrast agents; endogenous contrast techniques, such as blood oxygen level dependent (BOLD) imaging, for assessing changes in tissue oxygenation; and other techniques for assessing treatment response or improving lesion characterization, including quantitative diffusion and spectroscopy techniques.
The session will conclude with an introduction to the new combined modality instrumentation now available in PET/CT and SPECT/CT, clinical examples of radiotracers that are being used in oncologic imaging (FDG, amino acids, peptides, hormones, antibodies, cell proliferation and hypoxia tracers), techniques to evaluate whether radiotracers actually localize at the intended site (i.e., autoradiographic correlation with tumor immunohistochemistry in rodent models and on clinical biopsy tissue), and the use of functional images to determine features of tumor biology, to monitor treatment response, and for radiotherapy treatment planning.
- Slice Wars: 64‐Slice CT and Beyond
33(2006); http://dx.doi.org/10.1118/1.2241456
Cone Beam CT (CBCT) is one of the most recent technical advancements in x‐ray computed tomography. The state‐of‐the‐art 64‐slice scanners provide isotropic sub‐millimeter spatial resolution, significantly improved dose efficiency, large volumetric coverage, and markedly improved temporal resolution. These capabilities open doors to new clinical applications.
This presentation provides an overview of the recent technical advancements in CBCT. The discussion first covers technical challenges that face CBCT, such as detector complexity, electrical and mechanical design, image reconstruction algorithms, dose, and information management.
The second part discusses some of the recent developments in CBCT. These include advanced dose‐reduction techniques, the dual‐source CT scanner, more advanced reconstruction algorithms that break the traditional noise vs. dose tradeoffs, larger volume coverage, and dual‐energy CT for material decomposition and presentation. If the past 10 years of CT development can be characterized by the “slice war”, new technology developments are now pointing in different directions.
1. Overview of the recent technology developments in CBCT.
2. Understand major technical challenges in CBCT development.
3. Explore future directions in CBCT technology.
33(2006); http://dx.doi.org/10.1118/1.2241457
As CT has evolved from axial to helical to multislice helical, the advanced applications have been focused on volumetric imaging. Improvements in coverage and temporal resolution have opened the doors to challenging imaging tasks such as coronary angiography. However, even with today's most advanced systems, there is still a several second time difference between the imaging of the superior and inferior portions of the heart. During this time period, small differences in heart rate and contrast medium can lead to discontinuities within the volume data as seen in some MPR images. Such artifacts can lead to inaccuracy or, worse, misdiagnosis. Toshiba has developed its 256 slice system to cover over 12 cm of anatomy in a single rotation with 0.5 mm slices to allow the complete coverage of the heart at a single, instantaneous time point. Such a system will enable seamless coverage free from banding artifacts and discontinuities. Furthermore, wide volume coverage will allow dynamic evaluation of the flow of contrast material and perfusion over an entire organ rather than in just 3–4 cm, opening the doors to new clinical applications. No new system is without technological challenges. The wide coverage of the 256 slice system requires a cone angle that is four times larger than that of a 64 slice system, and advancements in cone beam reconstruction have been necessary to meet this need. This lecture will detail the design of Toshiba's 256 slice system and provide an overview of the system's image quality, dosimetry, and clinical applications. It will show early clinical examples of the advantages of whole organ coverage. It will explore some of the technological challenges inherent in such a system and the solutions to these challenges.
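The "four times larger cone angle" claim above follows directly from the geometry: coverage at isocenter scales with the tangent of the half cone angle, which is nearly linear at these angles. A quick check, assuming a hypothetical 600 mm source‐to‐isocenter distance (the real system geometry is not given in the abstract):

```python
import math

def cone_angle_deg(coverage_mm, source_iso_mm):
    """Full cone angle needed to span `coverage_mm` of z-coverage
    at isocenter from a source `source_iso_mm` away."""
    return 2.0 * math.degrees(math.atan(coverage_mm / 2.0 / source_iso_mm))

# Assumed geometry (illustrative): 0.5 mm slices at isocenter.
angle_256 = cone_angle_deg(256 * 0.5, 600.0)   # 128 mm coverage -> ~12 degrees
angle_64 = cone_angle_deg(64 * 0.5, 600.0)     # 32 mm coverage  -> ~3 degrees
```

At roughly 12 degrees the circular‐orbit cone beam data are significantly incomplete off the central plane, which is why the abstract notes that advances in cone beam reconstruction were needed.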
1. Understand the development and design of Toshiba's 256 slice system.
2. Understand the clinical utility of seamless, wide volume coverage.
3. Understand the technical challenges of the 256 slice system.
4. Understand the system's image quality.
MO‐E‐330D‐03: Clinical Perspective; Impact of Future Technology Developments; Cardiac CT; Challenges
33(2006); http://dx.doi.org/10.1118/1.2241458
Computed tomography is not the most frequent radiologic imaging procedure, but is arguably the most important in terms of clinical impact. CT is used extensively for emergencies, cardiovascular, pulmonary, gastrointestinal, endocrine, neurological, orthopedic and other applications ‐ often as the first and only imaging procedure needed for diagnosis. The chances are very high that a patient will have a CT scan in the emergency department, as an outpatient or as an inpatient for a multitude of indications ‐ pain, trauma, suspected infection or malignancy, and frequently to clarify or resolve a question raised by another abnormal test, such as an EKG abnormality or ultrasound finding. Despite the universality of CT in hospitals and clinics as well as free‐standing imaging centers, the technology continues to evolve with greater coverage, faster acquisition and multienergy sources or detectors. The most demanding imaging applications are cardiovascular, where complex motion and small morphologic features coexist, so imaging methods that are very satisfactory elsewhere in the body may not be successful. Clinical cardiac CT consists of administering toxic materials, e.g., contrast media, while monitoring the EKG and illuminating the body with high brightness x‐rays. Larger area detectors and higher frame acquisition rates are welcome improvements, but don't solve all of the problems encountered with variability due to respiratory, random body, and cardiac motion, especially in a spectrum of patients from infant to massively obese adult sizes (< 1 kg to 250 kg or more). The challenges and pitfalls in CT will be delineated and evaluated relative to current and future technology.