Ongoing quality control in digital radiography: Report of AAPM Imaging Physics Committee Task Group 151
Quality control (QC) in medical imaging is an ongoing process, not just a series of infrequent evaluations of medical imaging equipment. The QC process involves designing and implementing a QC program, collecting and analyzing data, investigating results that fall outside the acceptance levels of the QC program, and taking corrective action to bring those results back to an acceptable level. The QC process involves key personnel in the imaging department, including the radiologist, the radiologic technologist, and the qualified medical physicist (QMP). The QMP performs detailed equipment evaluations and helps with oversight of the QC program, while the radiologic technologist is responsible for the day-to-day operation of the QC program. The continued need for ongoing QC in digital radiography has been highlighted in the scientific literature. The charge of this task group was to recommend consistency tests, designed to be performed by a medical physicist or by a radiologic technologist under the direction of a medical physicist, that identify problems with an imaging system requiring further evaluation by a medical physicist, including a fault tree defining the actions to be taken when certain fault conditions are identified. The focus of this final report is the ongoing QC process, including rejected image analysis, exposure analysis, and artifact identification. These QC tasks are vital for the optimal operation of a department performing digital radiography.
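Two of the QC tasks named above reduce to simple calculations that can be run over routinely collected data: the reject rate (rejected images as a fraction of all acquired images) and the deviation index of IEC 62494-1, DI = 10·log10(EI/EI_T), which expresses how far a measured exposure index EI departs from the target value EI_T. A minimal sketch, assuming only that these two quantities are available from the department's image and exposure logs (the function names and example numbers here are illustrative, not from the report):

```python
import math

def reject_rate(rejected: int, total: int) -> float:
    """Reject rate as a percentage of all acquired images."""
    if total <= 0:
        raise ValueError("no images acquired")
    return 100.0 * rejected / total

def deviation_index(ei: float, ei_target: float) -> float:
    """Deviation index per IEC 62494-1: DI = 10 * log10(EI / EI_T).

    DI = 0 means the exposure matched the target; each +1 step is
    roughly a 26% increase in detector exposure.
    """
    return 10.0 * math.log10(ei / ei_target)

# Illustrative numbers: 42 rejects out of 1000 acquisitions, and an
# exposure index of 500 against a target of 400 (slight overexposure).
print(round(reject_rate(42, 1000), 1))       # 4.2
print(round(deviation_index(500, 400), 2))   # 0.97
```

In an ongoing QC program these values would be trended over time and compared against the program's acceptance levels, with results outside those levels triggering the investigation and corrective-action steps described above.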