Volume 35, Issue 6, June 2008
Index of content:
- Therapy Symposium: Auditorium A
Outcome‐Guided Biological Treatment Planning
MO‐D‐AUD A‐01: Summarizing Our Knowledge of Normal Tissue Tolerances: The Progress and Future Directions of QUANTEC. 35(2008); http://dx.doi.org/10.1118/1.2962339
AAPM and ASTRO are jointly funding an effort, called QUANTEC (QUAntitative Normal TissuE models in the Clinic), which aims to summarize our knowledge of the dose-volume dependencies of the normal tissue complications of external beam radiotherapy and, where possible, give quantitative guidance for clinical treatment planning and optimization. Following the initial QUANTEC meeting in Madison in October 2007, attended by approximately 75 physicists, MDs, and biostatisticians, a special issue of the Red Journal is in preparation that will include articles on complications in some 18 normal organs. For each of these organs, extensive literature reviews were undertaken and clinically significant endpoints identified. Where possible, results were synthesized and compared, guided by the quality and level of evidence of the studies. Criteria included the prospective or retrospective nature of each study, statistical power, and the presence and reliability of quantitative data on dose-volume effects. Other clinical factors influencing complications were assessed, such as the influence of chemotherapy, fraction size, and pre-existing medical conditions. Special emphasis was placed on attempting to provide quantitative guidelines for clinical treatment planning. This talk will convey the mission, and give selected examples of the achievements, of QUANTEC. It will outline the current state of our knowledge and highlight areas where opportunities exist to improve our understanding.
1. To understand the current state of knowledge of the dose-volume dependencies of complication probabilities.
2. To understand the criteria and methods used by QUANTEC in selecting and summarizing this knowledge.
3. To understand the shortcomings of the clinical data as it currently exists.
4. To understand the possible strategies for overcoming these shortcomings.
35(2008); http://dx.doi.org/10.1118/1.2962340
Historically, RT fields/doses were selected based on clinical experience and intuition. Clinicians generally recognized the imprecision of these empiric guidelines, as they did not reflect the underlying three-dimensional anatomy, physiology, and dosimetry. A great promise afforded by 3D imaging was an improved quantitative relationship between 3D doses/volumes and outcomes. When 3D dosimetric information became more widely available (late 80s–early 90s), 3D guidelines were needed. In 1991, multiple investigators pooled the available, albeit sometimes limited, information, leading to the often-quoted "Emami paper" (IJROBP 1991). During the last 17 years, additional 3D dose/volume/outcome data have become available. A central goal of QUANTEC is to summarize this information in a clinically useful manner. For each organ, the literature providing meaningful dose/volume/outcome data is reviewed. Clinical/treatment variables that may impact the application of the data are discussed. Where available, NTCP-model parameters are provided. We hope this information will improve patient care by providing clinicians and treatment planners with the tools necessary to determine the "optimal" 3D dose distribution for each individual case.
Nevertheless, the information provided herein is not ideal, and care must be taken to apply it correctly. Unfortunately, the data are incomplete for essentially every organ. The user must recognize the limitations inherent in extracting/pooling literature data. For some complications, studies summarize their findings in terms of models that can be used to estimate risk. Extreme care should be taken when such models are applied clinically, especially when clinical dose/volume parameters are beyond the range used to generate the model. Models that rely on DVH data discard all spatial information (and hence inherently assume that all regions of an organ are functionally equally important), and often do not consider variations in fraction size (a particular concern with the increasing use of hypo-fractionated schedules). Similarly, the increasing use of RT combined with concurrent chemotherapy, with rapidly evolving drugs/doses, calls into question the applicability of historic data to modern practice.
For essentially all patients with curable cancers, a marginal miss is a more serious complication than is a normal tissue injury. Care must be taken not to compromise target coverage to reduce normal tissue risks; a clinical balance is needed. Further, palliation in patients with recurrent/metastatic/incurable disease, with limited expected survival, often requires one to exceed "tolerances", as concern for late effects may not be applicable (e.g., RT fields for locally advanced lung cancer may include large volumes of lung and heart, and withholding RT due to the risk of pericarditis or pneumonitis may not be "therapeutically rational"). These concerns are most applicable to our youngest generation of recently-trained radiation oncologists. Such individuals have become accustomed to having 3D dosimetric information available for every case, and rely on such data for many of their clinical decisions. These physicians may be uncomfortable in clinical settings wherein large radiation fields need to be applied in a relatively rapid fashion (i.e., without 3D dosimetry) in order to provide effective palliation.
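The NTCP-model parameters mentioned above are most often reported for the Lyman-Kutcher-Burman (LKB) model, in which complication probability is a probit function of an equivalent uniform dose. A minimal sketch in Python; the parameter values in the comments are purely illustrative assumptions, not QUANTEC or Emami recommendations:

```python
import math

def lkb_ntcp(eud, td50, m):
    """Lyman-Kutcher-Burman NTCP: the complication probability is the
    standard normal CDF of t = (EUD - TD50) / (m * TD50), where TD50 is
    the uniform whole-organ dose giving a 50% complication rate and m
    controls the steepness of the dose-response curve."""
    t = (eud - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# By construction, EUD = TD50 gives NTCP = 0.5, e.g. with a
# hypothetical TD50 of 30 Gy and m of 0.3:
#   lkb_ntcp(30.0, 30.0, 0.3) -> 0.5
```

Published organ-specific (TD50, m) pairs, together with the volume parameter used to reduce a DVH to an EUD, are exactly the quantities the reviews described above aim to tabulate.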
35(2008); http://dx.doi.org/10.1118/1.2962341
Rapid advances in functional and biological imaging, predictive assays, and our understanding of treatment responses herald the coming of the long-sought goal of implementing biologically based radiation therapy in the clinic. Until now, the practice of RT has been based on the premise that effective treatments require the delivery of uniform radiation doses to all target volumes. However, the uniform-dose approach only yields the best possible tumor control for the implausible situation in which all regions of the tumor have exactly the same biological characteristics and sensitivities to radiation. Theoretical studies and the accumulated clinical data strongly suggest that treatment plans designed to exploit patient- and tumor-specific biological features will substantially improve treatment outcomes.
Biologically based treatment planning (BBTP) uses dose-response models to estimate biological responses for a given dose distribution and/or fractionation schedule. At least two commercial BBTP systems are available for clinical use. A variety of dose-response models and quantities (e.g., the generalized equivalent uniform dose (gEUD), Poisson cell killing, serial and parallel complication models, and the Lyman-Kutcher-Burman model), along with a series of organ-specific model parameters, are included in these systems to calculate TCP, NTCP, and EUD. Experience with these commercial and non-commercial planning systems has shown that the appropriate use of biological-model-based cost functions can generate plans with similar target coverage but better normal tissue sparing, or equivalent plans with a much smaller number of iterations, compared to the use of physical (dose-based) cost functions. Because of these and other anticipated benefits, we believe that the use of biological models in the treatment planning process will become popular in the near future.
However, it is also evident that, due to various limitations (the limitations of the models and available model parameters, the incomplete understanding of dose responses, and inadequate clinical data), the use of biological models for treatment planning can be potentially dangerous. Alternatively, the models can be used simply to evaluate treatment plans. Some aspects of the models may be unintuitive, and their correct interpretation and use as planning indices may not be straightforward. Biologically based planning represents a paradigm shift; there will be a steep learning curve for most clinical physicists/planners to understand how, when, and why biological models do and/or do not work. To address these issues, the AAPM has recently established a task group (TG-166).
This talk will provide an overview for the development of biologically‐based treatment planning. Initial experience of using a commercial BBTP system will be discussed.
1. Understand the basics and limitations of using dose-response models in treatment planning.
2. Understand the dosimetric differences between biologically based and physical-dose-based treatment plan optimization and evaluation.
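Of the quantities listed in the abstract above, the gEUD is the simplest to state: a power-law average of the voxel doses whose single parameter interpolates between mean-dose and maximum-dose behavior. A minimal sketch (the dose values and the choice of the parameter a are illustrative):

```python
def geud(voxel_doses, a):
    """Generalized equivalent uniform dose over equal-volume voxels:
    gEUD = ((1/N) * sum(d_i^a))^(1/a).
    a = 1 gives the mean dose; large positive a approaches the maximum
    dose (serial-organ behavior); large negative a approaches the
    minimum dose (appropriate for targets)."""
    n = len(voxel_doses)
    return (sum(d ** a for d in voxel_doses) / n) ** (1.0 / a)
```

A uniform dose distribution returns that dose for any a, which is the defining property of an equivalent uniform dose; non-uniform distributions are penalized or rewarded according to the organ's presumed architecture.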
From Physical to Biological Optimization
35(2008); http://dx.doi.org/10.1118/1.2962380
Optimization techniques have found their way into radiation treatment planning with the advent of IMRT. Since then, medical physicists and operations researchers (optimization experts) have begun to collaborate more closely and to tackle various optimization problems in our field. This symposium will present recent advances and challenges at this interface between medical physics and operations research.
35(2008); http://dx.doi.org/10.1118/1.2962381
The response to radiation therapy is determined by a number of cellular, proliferative, and tumor physiological factors. Recent developments in molecular and functional imaging provide non-invasive information about these factors and can potentially be of value in treatment plan optimization. This represents a challenge to the central dogma in radiation therapy, as a tailored, inhomogeneous dose is favored instead of striving for a homogeneous dose to the target volume. One challenge is to develop optimization strategies that adequately take the biological information of the tumor tissue into account. A prerequisite for such strategies is detailed knowledge of the dose modifying factor (DMF) associated with the various biological features visualized by different imaging techniques. For most biological factors of relevance for the response to radiation therapy, the DMF is not well established. The dose modifying effect of tumor hypoxia, i.e., the oxygen enhancement ratio (OER), has, however, been extensively studied. Moreover, dynamic contrast enhanced MR imaging and PET imaging with Cu-ATSM or F-Miso as tracers have shown promise with respect to imaging of tumor hypoxia. Spatial redistribution of the dose according to hypoxia maps, derived from MRI or PET images, has been shown to increase the calculated tumor control probability (TCP) significantly compared to a homogeneous dose distribution in canine head and neck tumors. The effect depended on the degree of reoxygenation, with a maximum relative increase in TCP for tumors with poor or no reoxygenation. Also, acute hypoxia reduced TCP moderately, while underdosing chronically hypoxic cells gave large reductions in TCP. Random errors in positioning were found to give a small decrease in TCP, whereas systematic errors were found to reduce TCP substantially.
Molecular and functional imaging provides vast amounts of information, and the strategies for incorporating this information into treatment plan optimization are still in their infancy.
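The kind of TCP gain from dose redistribution described above can be illustrated with a simple Poisson TCP model in which each voxel's dose is divided by a hypoxia dose-modifying factor. All numbers here (clonogen counts, alpha, DMF values) are assumptions for illustration, not the parameters of the canine study:

```python
import math

def poisson_tcp(voxel_doses, dmf, clonogens, alpha):
    """Poisson tumor control probability with a per-voxel dose-modifying
    factor for hypoxia: the effective dose in voxel i is D_i / DMF_i
    (DMF = 1 for well-oxygenated voxels, up to roughly the OER for
    severely hypoxic ones).
    TCP = exp(-sum_i N_i * exp(-alpha * D_i / DMF_i))."""
    survivors = sum(n * math.exp(-alpha * d / f)
                    for d, f, n in zip(voxel_doses, dmf, clonogens))
    return math.exp(-survivors)
```

Underdosing a hypoxic voxel relative to its DMF collapses the TCP, while boosting its dose in proportion to the DMF restores the oxic-tumor TCP, which is the rationale for redistributing dose according to hypoxia maps.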
35(2008); http://dx.doi.org/10.1118/1.2962382
This talk will give a brief overview of the field of operations research (OR), and the models and tools that it can provide for radiation treatment planning. Some recent developments within OR and some future directions for the field will be outlined that have relevance for treatment planning.
The Challenge of Research: Bringing Clinical Studies and New Technology into the Clinic
35(2008); http://dx.doi.org/10.1118/1.2962433
At one point or another, almost everyone has recognized a problem and generated ideas they believe would improve clinical practice. In essence, this is what drives the desire to generate technical and clinical research, and translate findings into clinical practice. Some ideas become "practice changing" when the problem is important enough; the ideas compelling; and the right combination of skills, resources, and motivation are brought to bear. Geometric uncertainties in radiation therapy delivery represent an important problem and a fertile area of investigation that captures imaginations in clinical practice, academic research, and industry. This presentation will relate some experiences of participating in the development of image-guided radiation therapy (IGRT) solutions.
1. To demonstrate the importance of a team approach in translating research ideas into clinical practice.
2. To illustrate some essential lessons learned from IGRT research and development.
3. To review some common institutional practices used to assess patient safety and the protection of ideas.
TU‐C‐AUD A‐02: Middleware for the Clinic — Taking Small Steps Towards Implementing New Technologies Into Routine Practice. 35(2008); http://dx.doi.org/10.1118/1.2962434
New products from vendors may not work immediately in many clinics for a variety of reasons. One of the main reasons is that most clinics operate in a multi-vendor environment: the workflow demonstrated by the vendor may not work completely in a multi-vendor, hospital-specific environment. Similarly, new technologies are not perfect; they often require supplementary workflow solutions to accomplish the main goal. Often, vendor support is limited, and simply waiting for the next product release does not always work. As the technology leaders on the radiation oncology team, medical physicists should be actively involved in the technology implementation process, which can be hospital-specific. However, there are common steps in achieving the goal of technology integration. The first step is often identifying why things are not working. This troubleshooting process may require assistance from vendors or colleagues at other hospitals. Once the main problem is identified, value-adding new ideas should be sought as alternative solutions. After an alternative solution is judged feasible, resources need to be identified to implement it in the clinic. This resource-seeking process may require you to "sell" your idea to your chief physicist, the vendor, or other members of the radiation oncology team. It is important to get buy-in in the clinic before you invest significant time. Successful clinical implementation often relies on correctly identifying the key clinical issues. Technologies used for such implementation do not need to be complicated, and certainly need not be a complete overhaul of the existing workflow. While inventing a technical solution may not be easy, the biggest challenge is to take it live in your clinic. In addition, continuous support, improvement, and adaptation to change are required. It is worth mentioning that many of these middleware solutions can be short-lived. The goal is not to insist on "in-house" solutions; rather, the eventual goal is clinical efficacy for your clinic. Examples of in-house solutions will be discussed to illustrate these processes, and experience will be shared with the other presenters.
1. To suggest common strategies for in‐house implementation of new technologies.
2. To share experience on how to work effectively with your vendors.
3. To identify challenges in in‐house solutions.
TU‐C‐AUD A‐03: Integrating Research Into the Clinic: Experiences From Implementing Monte Carlo, IMRT, and … 35(2008); http://dx.doi.org/10.1118/1.2962435
Clinical implementation of in‐house developed research‐based treatment planning software poses challenges for both the code developer and clinical user. This presentation will examine several important aspects of the implementation and integration processes, using in‐house IMRT and Monte Carlo dose calculation programs as examples. For the developer, challenges include the creation of fail‐safe strategies to ensure patient safety; procedures for upgrades, including updates to address clinically urgent issues; and separation of clinically implemented code from research‐based code which is under continual development. Jointly, the developer and clinical user are challenged to create commissioning and quality assurance procedures which not only meet AAPM TG guidelines, but which also test pre‐identified failure modes that are specific to the software implementation. They are further challenged to develop per‐patient quality assurance methods to ensure patient plan quality. A useful strategy in this regard is to utilize a vendor‐supplied FDA approved product for cross‐comparison of the deliverable treatment plan. Finally, the implementation and integration processes should include procedures for dealing with non‐compliant cases, including identifying error sources and implementing remedies. Examples of how these challenges were addressed in our clinic will be presented.
1. To appreciate the challenges inherent in safe clinical implementation of in‐house developed software.
2. To understand strategies for overcoming the challenges of clinical implementation of in‐house developed software.
3. To understand the role of QA in safe clinical implementation of research.
The research discussed is supported in part by NCI Grant P01 CA116602, by Philips Medical Systems, and by Varian Medical Systems.
35(2008); http://dx.doi.org/10.1118/1.2962436
Many of the most important advances in modern radiotherapy have been developed, implemented, tested, and/or perfected through in-house clinical research work. Study of the possible benefits of new technology and new techniques for treatment planning and delivery is crucial if we are to learn when and how to effectively use these capabilities. This presentation will use our experience designing clinical studies of normal tissue toxicity, tumor dose escalation, the use of functional imaging, and other ongoing clinical efforts to illustrate both the challenges involved in clinical studies and some of the methods that have proven helpful to our research efforts. Performing clinical studies that have appropriate goals, are designed to include adequate clinical support, and eventually result in clinically documented improvements for our patients can be a very valuable and satisfying contribution to improved patient care.
1. To illustrate clinical study design techniques.
2. To discuss some of the methods which help make clinical studies viable in a busy clinic.
3. To highlight some of the challenges associated with clinical research efforts.
Informatics in Radiation Oncology
35(2008); http://dx.doi.org/10.1118/1.2962570
Radiation oncology research faces an explosion of data collection, analysis, and management issues. Clinical trials research requires interfacing multiple vendor systems and storing large-scale, multi-type data in systems that provide convenient and secure access for analyses, quality assurance/improvement, and monitoring. HIPAA concerns are making multi-institutional data sharing and clinical trial cooperation more challenging. Relevant data types now span the spectrum of multiple types of imaging, traditional radiotherapy data objects, image-guidance data, physician-reported or patient-reported outcomes, and new biological datasets and tissue/fluid samples along with their various bioprofiles and '-omics.' Effective management schemes must preserve the context, storyline, and linkage among related data. Effective utilization and learning based on all the available data will require new informatics tools and a more open approach to recognizing the extent of the challenge and how ineffectively it has been addressed to date. In this presentation I will discuss the challenges that vendors, physicians, physicists, and informaticists face in helping to build effective tools to support radiation oncology research in the next ten years. I will particularly focus on the emerging informatics needs of clinical trial groups (such as the RTOG), as well as academic research centers, that want to fully utilize and learn from modern image-guided, adaptive, biologically-stratified treatment paradigms. We will discuss approaches to problems associated with collecting outcomes data, utilizing complex datasets within the clinical workflow, and effectively learning from clinical, imaging, and biological data.
35(2008); http://dx.doi.org/10.1118/1.2962571
Purpose: The hypothesis of this long-term project is that a multicenter information system based on four modules (multiparametric interconnected healthcare databases, data mining tools, updated machine-learning-based predictive algorithms, and user interfaces) will facilitate and accelerate research in oncology. We call this approach "Machine Learning Based Clinical Research (MLBCR)". We performed a pilot project in non-small cell lung cancer (NSCLC) patients, for whom clinical TNM stage is highly inaccurate for the prediction of survival of non-surgical patients and alternatives are currently lacking. The objectives of this study were to develop and validate a prediction model for survival of NSCLC patients, treated with (chemo)radiotherapy, using clinical factors. Patients and Methods: Three interconnected databases were mirrored into a data warehouse using a disease-based, cohort-specific data model. The three data sources were a) electronic medical records, b) imaging and DICOM-RT objects in an RT-PACS, and c) treatment information in a record-and-verify database. Data from 403 consecutive inoperable NSCLC patients, stage I-IIIB, treated radically with (chemo)radiation were selected. In 82 patients, data from blood samples were available. 2-norm support vector machines were used to build the prognostic models. Performance of the models was expressed as the area under the curve (AUC) of the receiver operating characteristic (ROC) and assessed using leave-one-out (LOO) cross-validation. The prognostic model, using clinical factors only, was validated using two external, independent datasets with 36 and 65 patients, respectively. In addition, a risk score was calculated, and a nomogram, which is in fact a graphical representation of the risk score, was made for practical use.
Results: The model, based on 403 patients and using clinical factors, consisted of gender, WHO performance status, forced expiratory volume (FEV1), number of positive lymph node stations on PET, and gross tumor volume (on PET-CT). The AUC, assessed by LOO cross-validation, was 0.75 (95% CI 0.70–0.82), while application of the model to the external datasets yielded AUCs of 0.75 and 0.76, respectively. Splitting the MAASTRO cohort into 3 subgroups, based on the risk score, resulted in the identification of high, medium, and low risk groups. The 2-year survival was 66% (95% CI 54%–78%) for the low risk group, 29% (95% CI 21%–37%) for the medium risk group, and 14% (95% CI 5%–23%) for the high risk group. Where blood biomarkers were available (82 patients), the prognostic model included three additional biomarker factors: OPN, IL8, and CEA. The LOO AUC was 0.83 (95% CI 0.76–0.94), which is significantly better than the prognostic model using only clinical factors based on the same 82 patients (AUC 0.71, 95% CI 0.60–0.87). Conclusion: The model, using clinical factors, successfully estimates 2-year survival of NSCLC patients, and its performance, assessed internally as well as in two independent datasets, is good. Combining blood biomarkers with clinical factors yielded significantly better performance than using clinical factors only (AUC: 0.83 vs 0.71). We concluded that MLBCR is feasible. The bottleneck is the availability of external datasets. Therefore, we need to invest in international standards as well as in multicenter approaches that allow recruitment of more patients, preferably having had different types of treatments, and that give quick access to external validation datasets. Conflict of Interest: This project has been partially funded by Siemens IKM.
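The evaluation scheme used in this study, leave-one-out cross-validation scored by ROC AUC, can be sketched generically. The classifier is abstracted into fit/predict callables, since the details of the 2-norm SVM are not given in the abstract:

```python
def loo_scores(fit, predict, X, y):
    """Leave-one-out cross-validation: refit the model on all-but-one
    case and score the held-out case, so every score is out-of-sample."""
    scores = []
    for i in range(len(X)):
        model = fit([x for j, x in enumerate(X) if j != i],
                    [t for j, t in enumerate(y) if j != i])
        scores.append(predict(model, X[i]))
    return scores

def auc(scores, labels):
    """ROC AUC via the Mann-Whitney statistic: the probability that a
    randomly chosen positive case outscores a randomly chosen negative
    case, with ties counted as one half."""
    pos = [s for s, c in zip(scores, labels) if c == 1]
    neg = [s for s, c in zip(scores, labels) if c == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

Feeding the LOO scores to `auc` gives the internal performance estimate; applying the single model trained on all cases to an external dataset gives the validation AUCs reported above.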
35(2008); http://dx.doi.org/10.1118/1.2962572
Radiation oncology relies on past experience and clinical trials for the understanding and advancement of patient care. In current practice, we study the effects of radiation on our patients through controlled trials. These trials represent less than 5% of our patient population, take years to produce results, and are controlled with more rigor than standard clinical practice. A vast amount of untapped knowledge is contained in our clinical data; the question is how to access it.
The workflow in radiation oncology has multiple stages, from simulation to treatment planning, to the daily record of treatments and follow-up visits. Throughout this process there are multiple opportunities to capture meaningful information relevant to the complications and successes of the treatment. Current practices lack the organized collection of much of this data, and few tools exist to evaluate and analyze the data in order to reapply the new knowledge at the point of care.
Today we have many studies looking at anatomical, functional, and molecular images to better characterize our patients' disease. We use pathology, and in the future, genetic information, to better understand the nature of a specific patient's disease. We look at radiation dose distributions, fractionation patterns, and patient motion to understand how they impact treatment outcome. We complicate the practice further with concurrent chemo- and hormonal therapies in addition to surgery.
eScience refers to the practice of studying immense amounts of data through the use of computer networks and well-organized databases. Such systems enable distributed collaboration among colleagues in the specified discipline. In radiation oncology, we are very good at generating immense amounts of data, but we are lacking in the management of that data needed to practice eScience. Oncospace is an initiative to apply eScience concepts to radiation oncology, both as a physician's tool for personalized medicine and as a collaborative tool for multi-institutional research on clinical data.
Oncospace is composed of several components: data collection and workflow, where the clinical workflow is altered to inherently collect information on our patients relevant for future analysis; data warehouse design, to create the active database model for efficient analysis, and web services to support web-based access with security levels in place to protect patient privacy; human interface design, to make it easy to ask clinically relevant questions of the data and present the answers in the ways physicians think about the problems; statistical analysis tools, to allow us to better understand the relative importance and validity of clinical data that are less controlled than in a typical clinical trial; and decision support tools, to allow us to apply knowledge from the system to our clinical practice.
This presentation will give an overview of the potential of eScience to help uncover clinical knowledge and apply it at the point of care.
1. Basic understanding of eScience concepts and implementation.
2. Appreciate the potential of expanding our knowledge base by utilizing our clinical data.
An Interactive Session with TG100
35(2008); http://dx.doi.org/10.1118/1.2962684
35(2008); http://dx.doi.org/10.1118/1.2962685
Patient and employee safety is a critical concern in radiation oncology. All radiation oncology departments have a quality assurance program, most with error reporting and analysis processes intended to monitor errors and modify treatment policies and procedures when necessary. Industry has been developing processes to improve the quality and safety of its operations and products since the 1940s. These processes have become quite sophisticated, and the AAPM has recently recognized the need to develop such programs for radiation oncology QA. The AAPM subsequently formed TG-100, charged with developing a structured, systematic quality assurance program approach that balances patient safety and quality against commonly available resources, while striking a balance between prescriptiveness and flexibility. Based on industrial standards, AAPM TG-100 will provide examples and encourage the use of the failure mode and effects analysis (FMEA) method for quality assurance management in radiation therapy. Briefly, FMEA is a risk assessment technique for systematically identifying potential failure modes in a process or a system. A failure mode refers to the way in which something may fail; failures are any errors or defects, potential or actual. Effects analysis refers to studying the consequences of those failures. In FMEA, failures are ranked according to the severity of their consequences, their frequency of occurrence, and their ease of detection.
This presentation provides an example of the forthcoming TG-100 recommendations by applying FMEA methodology to the IMRT treatment planning and delivery process. The example provides an analysis of a typical IMRT planning and delivery process, along with scores that have been developed by TG-100.
1. To describe development of an FMEA program.
2. To provide an example of FMEA for the IMRT planning and delivery process.
3. To describe the process for scoring IMRT planning and delivery steps based on FMEA methods.
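The FMEA ranking described in this session multiplies three ordinal scores into a risk priority number (RPN). A minimal sketch; the failure modes and score values below are invented placeholders, not the TG-100 scores:

```python
def rpn(severity, occurrence, detectability):
    """FMEA risk priority number: the product of severity of effect,
    frequency of occurrence, and difficulty of detection, each scored
    on an ordinal scale (e.g., 1-10, higher = worse; a high
    detectability score means the error is unlikely to be caught)."""
    return severity * occurrence * detectability

def rank_failure_modes(modes):
    """Sort (name, S, O, D) tuples by descending RPN so that QA effort
    is focused on the highest-risk process steps first."""
    return sorted(modes, key=lambda m: rpn(m[1], m[2], m[3]), reverse=True)

# Hypothetical IMRT failure modes, for illustration only:
example = [("plan computed on wrong CT dataset", 9, 2, 6),
           ("corrupt MLC leaf-sequence file", 8, 3, 3),
           ("minor rounding in dose grid", 2, 5, 2)]
```

Ranking by RPN is what lets the analysis surface rare but catastrophic errors that routine frequency-based thinking would miss.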
35(2008); http://dx.doi.org/10.1118/1.2962686
Quality management in brachytherapy can become very time-consuming if checks are performed on everything that can be checked. Some commonly performed brachytherapy quality assurance measures provide little utility, while some apparently comprehensive programs may leave patients at risk. The most efficient use of time concentrates resources where the risk of errors is greatest. Risk analysis can help a practicing medical physicist identify the aspects of a procedure with the greatest risk, focus control measures to guard those steps, and possibly eliminate efforts with little payoff. This presentation presents examples of using risk analysis to guide the development of a quality management program in brachytherapy.
1. To understand the nature of risk analysis.
2. To see how risk analysis can assist in developing a brachytherapy quality management program.
35(2008); http://dx.doi.org/10.1118/1.2962687
Failure Modes and Effects Analysis (FMEA) is a useful tool for deepening understanding of the radiation therapy (RT) delivery process flow and for identifying specific process steps that are likely to result in patient harm in the event of errors. Developing an RT process tree can in itself lead to improved efficiency and safety. FMEA encourages the treatment delivery team to view RT delivery holistically, as it requires the team not only to catalogue possible errors at each process tree step, but also to assess their frequency, clinical impact, and potential for propagation without detection into downstream processes. Thus, an FMEA helps focus physics attention on relatively low-probability process errors with the potential for inflicting catastrophic harm on patients. The AAPM FMEA IMRT analysis raises many questions and issues:
1. How much benefit can a clinic derive from a model FMEA developed for a generic process tree, compared with taking its own treatment delivery team through an FMEA of its specific process?
2. FMEA gives limited insight into the causes of errors, the correlation or propagation of errors into other steps, or how to modify a process to mitigate errors.
3. FMEA ranks possible errors in order of risk (RPN) to the patient under the artificial assumption that no QA is being performed. The impact on the remaining risk rankings of QA tests chosen to address a high‐RPN error mechanism is not accounted for.
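The RPN ranking discussed above is simply the product of three scores. As an illustrative sketch (the 1–10 scoring scale is conventional FMEA practice, but the failure modes and scores below are hypothetical, not drawn from the TG‐100 analysis):

```python
def rpn(occurrence, severity, detectability):
    """Risk Priority Number for one failure mode; each factor scored 1-10."""
    for score in (occurrence, severity, detectability):
        if not 1 <= score <= 10:
            raise ValueError("FMEA scores must be between 1 and 10")
    return occurrence * severity * detectability

# Hypothetical failure modes for a generic RT process step
failure_modes = {
    "MLC leaf-gap miscalibration": rpn(4, 7, 6),
    "plan transferred to wrong patient": rpn(2, 10, 3),
    "daily output drift": rpn(6, 4, 2),
}

# Rank highest-risk modes first, as an FMEA worksheet would
for mode, score in sorted(failure_modes.items(), key=lambda kv: -kv[1]):
    print(f"RPN {score:3d}: {mode}")
```

Note how a rare but catastrophic error (severity 10) can still rank below a more frequent, harder‐to‐detect one, which is exactly the kind of prioritization an FMEA makes explicit.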
A number of other risk assessment strategies for improving QA cost‐effectiveness and robustness should be investigated:
(a) Formal engineering tools such as fault‐tree analysis, probabilistic risk assessment, and human error classification schemes all start with observed treatment delivery errors or near misses. Such techniques seek to uncover the detailed mechanism of an event and its causal relationships to other events, and to assign root causes.
(b) Sensitivity analysis of the process outcome (e.g., dose delivery accuracy) with respect to system parameters (e.g., MLC leaf‐gap calibration) monitored by QC tests. Such analyses may provide a rational basis for assigning tolerances and action levels to device QC test outcomes.
(c) Statistical process control techniques for identifying optimal QC test frequencies and action levels.
(d) Multidisciplinary forums for developing QA guidance that bring together representatives of all constituencies impacted by RT QA and delivery errors, including radiation oncologists, vendors, and therapists as well as physicists.
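Point (c) above can be made concrete with a minimal Shewhart‐style control chart sketch (the baseline QC readings and the 3σ limit choice here are hypothetical illustrations, not recommendations):

```python
import statistics

def control_limits(baseline, n_sigma=3.0):
    """Individual-value control limits estimated from baseline QC data."""
    mean = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return mean - n_sigma * sigma, mean + n_sigma * sigma

def out_of_control(measurements, lcl, ucl):
    """Flag QC measurements falling outside the control limits."""
    return [m for m in measurements if not lcl <= m <= ucl]

# Hypothetical daily linac output readings (% of nominal)
baseline = [100.1, 99.8, 100.0, 100.2, 99.9, 100.1, 99.7, 100.0]
lcl, ucl = control_limits(baseline)
print(out_of_control([100.0, 101.5, 99.9], lcl, ucl))  # -> [101.5]
```

The same machinery extends naturally to choosing test frequencies: the tighter the observed process variation, the less often a QC test needs to run to keep the risk of an undetected shift acceptably low.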
1. To understand the potential clinical applications and value of FMEA in designing safe and robust RT processes.
2. To appreciate the limitations of FMEA, especially an analysis developed by a consensus panel using a generic process tree.
3. To assess audience feedback on possible future directions of the TG‐100 work.
4. To review the potential costs and benefits of additional industrial engineering QM tools for more detailed and specific follow‐up analysis of FMEA findings.
Control Theory and Feedback in Radiotherapy
35(2008); http://dx.doi.org/10.1118/1.2962730
Dynamic behavior is observed in a wide range of biological systems, from the changing concentrations of oxygen and carbon dioxide in the blood to the regulation of glucose. A key component guiding these systems is feedback, the mechanism and process by which the status of a system is used to regulate the input to that system. Feedback can be used both to stabilize a dynamic system and to improve its performance. This presentation will give an overview of basic control system structures, encompassing feedback along with alternative control methods such as feedforward and adaptive control. The pros and cons of each approach will be discussed. The potential for interfacing artificial and natural control systems will serve as the framework for the presentation.
1. Recognize the existence of control systems in human physiology and medicine, and discuss the potential for artificial control systems to augment natural system behavior.
2. Define feedback, feedforward, and adaptive control system structures.
3. Describe the benefits and limitations of each control approach.
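The stabilizing role of feedback described in this abstract can be sketched with a proportional controller acting on a simple first‐order system (the plant model, gain, and disturbance values below are hypothetical illustrations):

```python
def simulate(setpoint, gain, steps=200, disturbance=0.5, dt=0.05):
    """Proportional feedback on a first-order plant y' = -y + u + d."""
    y = 0.0
    for _ in range(steps):
        error = setpoint - y               # feedback: compare output to setpoint
        u = gain * error                   # proportional control action
        y += dt * (-y + u + disturbance)   # Euler step of the plant dynamics
    return y

# With no feedback (gain = 0) the disturbance alone sets the output;
# with feedback, the steady state (gain*setpoint + d) / (1 + gain)
# approaches the setpoint as the gain increases.
print(simulate(1.0, gain=0.0))   # open loop: settles near 0.5
print(simulate(1.0, gain=20.0))  # closed loop: settles near 20.5/21
```

This illustrates the trade‐off the talk alludes to: higher gain improves tracking of the setpoint, but pure feedback can only react to errors after they occur, which is where feedforward and adaptive structures come in.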
WE‐D‐AUD A‐02: Control Theory to Manage Inter‐ and Intrafraction Spatial and Temporal Anatomic Changes
35(2008); http://dx.doi.org/10.1118/1.2962731
If anatomical change and tumor movement during treatment can be monitored, then it becomes possible to develop feedback systems that adapt to the changes. In the simplest adaptation, the plan and/or the beam alignment is altered before each treatment fraction begins; in the most ambitious situation, the delivery of the beam and/or its alignment with the patient is continuously altered in real time during treatment. In this presentation I will review techniques for spatial and temporal estimation of target movement in real time, filtering (e.g., Kalman filters) to establish the best estimate of the target trajectory, and open‐ and closed‐loop feedback techniques to compensate for tumor motion in real time.
1. Current and prospective methods for real‐time adaptation of radiation therapy to patient movement.
2. Elements of control theory pertinent to adaptive feedback systems.
3. Sources of error and their accommodation in the feedback process.
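The Kalman filtering step mentioned in the abstract can be sketched in scalar form (a random‐walk position model with hypothetical noise variances; real target‐tracking filters would use a richer state, e.g., position plus velocity):

```python
def kalman_1d(measurements, q=0.01, r=0.25):
    """Scalar Kalman filter: random-walk target position, noisy observations.
    q: process noise variance, r: measurement noise variance."""
    x, p = measurements[0], 1.0        # initial state estimate and its variance
    estimates = [x]
    for z in measurements[1:]:
        p = p + q                      # predict: uncertainty grows between samples
        k = p / (p + r)                # Kalman gain: trust in the new measurement
        x = x + k * (z - x)            # update estimate with the innovation
        p = (1 - k) * p                # update: uncertainty shrinks after measuring
        estimates.append(x)
    return estimates

# Noisy observations of a roughly stationary target position (mm)
print(kalman_1d([5.2, 4.8, 5.1, 4.9, 5.0]))
```

The filtered trajectory varies less than the raw measurements, which is the point: the feedback loop acts on the best estimate of the target position rather than on each noisy observation.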
35(2008); http://dx.doi.org/10.1118/1.2962732