Med. Phys. 42(11); doi: 10.1118/1.4932623
Abstract

Quality control (QC) in medical imaging is an ongoing process, not merely a series of infrequent evaluations of medical imaging equipment. The QC process involves designing and implementing a QC program, collecting and analyzing data, investigating results that fall outside the acceptance levels of the QC program, and taking corrective action to bring those results back to an acceptable level. The QC process involves key personnel in the imaging department, including the radiologist, the radiologic technologist, and the qualified medical physicist (QMP). The QMP performs detailed equipment evaluations and helps oversee the QC program, while the radiologic technologist is responsible for its day-to-day operation. The continued need for ongoing QC in digital radiography has been highlighted in the scientific literature. The charge of this task group was to recommend consistency tests designed to be performed by a medical physicist, or by a radiologic technologist under the direction of a medical physicist, to identify problems with an imaging system that need further evaluation by a medical physicist, including a fault tree defining the actions to be taken when certain fault conditions are identified. The focus of this final report is the ongoing QC process, including rejected image analysis, exposure analysis, and artifact identification. These QC tasks are vital to the optimal operation of a department performing digital radiography.
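The two ongoing QC tasks named above, rejected image analysis and exposure analysis, can be sketched in a few lines of code. This is a minimal illustration, not the task group's recommended implementation: the record field names (`rejected`, `reason`, `di`) and the deviation index action threshold are assumptions chosen for the example.

```python
# Hypothetical sketch of two ongoing QC tasks: reject analysis
# (overall reject rate and a tally of reject reasons) and exposure
# analysis (flagging images whose deviation index falls outside an
# assumed investigation threshold). Field names and the threshold
# value are illustrative, not values from the report.

from collections import Counter


def reject_rate(images):
    """Fraction of acquired images that were rejected."""
    if not images:
        return 0.0
    return sum(1 for img in images if img["rejected"]) / len(images)


def reject_reasons(images):
    """Tally of reasons recorded for rejected images."""
    return Counter(img["reason"] for img in images if img["rejected"])


def flag_deviation_index(exposures, limit=1.0):
    """Return exposures whose deviation index (DI) magnitude exceeds
    a limit. DI = 0 means the target exposure was achieved; |DI| > 1
    is an assumed investigation threshold for this sketch."""
    return [e for e in exposures if abs(e["di"]) > limit]


# Toy acquisition log: two of four images rejected for positioning.
images = [
    {"rejected": False, "reason": None},
    {"rejected": True, "reason": "positioning"},
    {"rejected": True, "reason": "positioning"},
    {"rejected": False, "reason": None},
]
print(reject_rate(images))                                  # 0.5
print(reject_reasons(images))                               # Counter({'positioning': 2})
print(flag_deviation_index([{"di": 0.2}, {"di": -1.6}]))    # [{'di': -1.6}]
```

In practice these records would be harvested automatically (e.g., from DICOM headers) rather than entered by hand, so that results outside the acceptance levels can be investigated and corrected as part of the ongoing QC loop the abstract describes.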