Quantitative error analysis for computer assisted navigation: A feasibility study
References

1. R. Mösges and G. Schlöndorff, “A new imaging method for intraoperative therapy control in skull-base surgery,” Neurosurg. Rev. 11, 245–247 (1988).
http://dx.doi.org/10.1007/BF01741417
2. R. D. Bucholz, H. W. Ho, and J. P. Rubin, “Variables affecting the accuracy of stereotactic localization using computerized tomography,” J. Neurosurg. 79, 667–673 (1993).
http://dx.doi.org/10.3171/jns.1993.79.5.0667
3. E. Watanabe, T. Watanabe, S. Manaka, Y. Mayanagi, and K. Takakura, “Three-dimensional digitizer (neuronavigator): New equipment for computed-tomography guided stereotaxic surgery,” Surg. Neurol. 27, 543–547 (1987).
http://dx.doi.org/10.1016/0090-3019(87)90152-2
4. S. J. Zinreich et al., “Frameless stereotaxic integration of CT imaging data: Accuracy and initial applications,” Radiology 188, 735–742 (1993).
5. K. S. Arun, T. S. Huang, and S. D. Blostein, “Least-squares fitting of two 3D point sets,” IEEE Trans. Pattern Anal. Mach. Intell. 9, 698–700 (1987).
http://dx.doi.org/10.1109/TPAMI.1987.4767965
6. C. R. Maurer, Jr., and J. M. Fitzpatrick, “A review of medical image registration,” in Interactive Image-Guided Neurosurgery, edited by R. J. Maciunas (AANS, Park Ridge, IL, USA, 1993), pp. 17–44.
7. J. B. A. Maintz and M. A. Viergever, “A survey of medical image registration,” Med. Image Anal. 2, 1–36 (1998).
http://dx.doi.org/10.1016/S1361-8415(01)80026-8
8. P. J. Besl and N. D. McKay, “A method for registration of 3D shapes,” IEEE Trans. Pattern Anal. Mach. Intell. 14, 239–256 (1992).
http://dx.doi.org/10.1109/34.121791
9. Z. Zhang, “Iterative point matching for registration of free-form curves and surfaces,” Int. J. Comput. Vis. 12, 119–152 (1994).
http://dx.doi.org/10.1007/BF01427149
10. J. M. Fitzpatrick, “The role of registration in accurate surgical guidance,” Proc. Inst. Mech. Eng., Part H: J. Eng. Med. 224, 607–622 (2010).
http://dx.doi.org/10.1243/09544119JEIM589
11. C. R. Mascott, J. C. Sol, P. Bousquet, J. Lagarrigue, Y. Lazorthes, and V. Lauwers-Cances, “Quantification of true in vivo (application) accuracy in cranial image-guided surgery: Influence of mode of patient registration,” Neurosurgery 59, ONS146–ONS156 (2006).
http://dx.doi.org/10.1227/01.NEU.0000220089.39533.4E
12. R. A. Kockro et al., “Planning and simulation of neurosurgery in a virtual reality environment,” Neurosurgery 46, 118–135 (2000).
http://dx.doi.org/10.1097/00006123-200001000-00024
13. B. J. Dixon, M. J. Daly, H. Chan, A. Vescan, I. J. Witterick, and J. C. Irish, “Augmented image guidance improves skull base navigation and reduces task workload in trainees: A preclinical trial,” Laryngoscope 121, 2060–2064 (2011).
http://dx.doi.org/10.1002/lary.22153
14. J. Wilbert et al., “Semi-robotic 6 degree of freedom positioning for intracranial high precision radiotherapy; first phantom and clinical results,” Radiat. Oncol. 5, 42–53 (2010).
http://dx.doi.org/10.1186/1748-717X-5-42
15. R. F. Labadie, M. Fenion, H. Cevikalp, S. Harris, R. Galloway, and J. M. Fitzpatrick, “Image-guided otologic surgery,” Int. Congr. Ser. 1256, 627–632 (2003).
http://dx.doi.org/10.1016/S0531-5131(03)00273-5
16. C. J. Coulson, A. P. Reid, D. W. Proops, and P. N. Brett, “ENT challenges at the small scale,” Int. J. Med. Rob. Comput. Assist. Surg. 3, 91–96 (2007).
http://dx.doi.org/10.1002/rcs.132
17. R. R. Shamir, L. Joskowicz, S. Spektor, and Y. Shoshan, “Localization and registration accuracy in image-guided neurosurgery: A clinical study,” Int. J. Comput. Assist. Radiol. Surg. 4, 45–52 (2009).
http://dx.doi.org/10.1007/s11548-008-0268-8
18. J. Berry, B. W. O’Malley, Jr., S. Humphries, and H. Staecker, “Making image guidance work: Understanding control of accuracy,” Ann. Otol. Rhinol. Laryngol. 112, 689–692 (2003).
19. S. Nicolau, X. Pennec, L. Soler, and N. Ayache, “An accuracy certified augmented reality system for therapy guidance,” Lect. Notes Comput. Sci. 3023, 79–91 (2004).
http://dx.doi.org/10.1007/b97871
20. P. Pillai, S. Sammet, and M. Ammirati, “Application accuracy of computed tomography-based, image-guided navigation of temporal bone,” Neurosurgery 63, 326–332 (2008).
http://dx.doi.org/10.1227/01.NEU.0000316429.19314.67
21. P. A. Woerdeman, P. W. Willems, H. J. Noordmans, C. A. Tulleken, and J. W. van der Sprenkel, “Application accuracy in frameless image-guided neurosurgery: A comparison study of three patient-to-image registration methods,” J. Neurosurg. 106, 1012–1016 (2007).
http://dx.doi.org/10.3171/jns.2007.106.6.1012
22. J. West and J. M. Fitzpatrick, “Point-based rigid registration: Clinical validation of theory,” Proc. SPIE 3979, 353–359 (2000).
http://dx.doi.org/10.1117/12.387697
23. M. P. Fried, J. Kleefield, H. Gopal, E. Reardon, B. T. Ho, and F. A. Kuhn, “Image-guided endoscopic surgery: Results of accuracy and performance in a multicenter clinical study using an electromagnetic tracking system,” Laryngoscope 107, 594–601 (1997).
http://dx.doi.org/10.1097/00005537-199705000-00008
24. E. P. Sipos, S. A. Tebo, S. J. Zinreich, and H. Brem, “In vivo accuracy testing and clinical experience with the ISG viewing wand,” Neurosurgery 39, 194–204 (1996).
http://dx.doi.org/10.1097/00006123-199607000-00048
25. J. M. Fitzpatrick, J. B. West, and C. R. Maurer, Jr., “Predicting error in rigid-body point-based registration,” IEEE Trans. Med. Imaging 17, 694–702 (1998).
http://dx.doi.org/10.1109/42.736021
26. M. H. Moghari, B. Ma, and P. Abolmaesumi, “A theoretical comparison of different target registration error estimators,” Lect. Notes Comput. Sci. 5242, 1032–1040 (2008).
http://dx.doi.org/10.1007/978-3-540-85990-1
27. A. D. Wiles, A. Likholyot, D. D. Frantz, and T. M. Peters, “A statistical model for point-based target registration error with anisotropic fiducial localizer error,” IEEE Trans. Med. Imaging 27, 378–390 (2008).
http://dx.doi.org/10.1109/TMI.2007.908124
28. R. Sibson, “Studies in the robustness of multidimensional scaling: Procrustes statistic,” J. R. Stat. Soc. Ser. B 40, 234–238 (1978).
29. R. Sibson, “Studies in the robustness of multidimensional scaling: Perturbational analysis of classical scaling,” J. R. Stat. Soc. Ser. B 40, 324–328 (1979).
30. S. P. Langron and A. J. Collins, “Perturbation theory for generalized Procrustes analysis,” J. R. Stat. Soc. Ser. B 47, 277–284 (1985).
31. D. Y. Hsu, Spatial Error Analysis (IEEE, New York, USA, 1998).
32. R. Balachandran, J. M. Fitzpatrick, and R. F. Labadie, “Accuracy of image-guided surgical systems at the lateral skull base as clinically assessed using bone-anchored hearing aid posts as surgical targets,” Otol. Neurotol. 29, 1050–1055 (2008).
http://dx.doi.org/10.1097/MAO.0b013e3181859a08
33. R. Balachandran, R. F. Labadie, and J. M. Fitzpatrick, “Clinical determination of target registration error of an image-guided otologic surgical system using patients with bone-anchored hearing aids,” Proc. SPIE 6509, 650930 (2007).
http://dx.doi.org/10.1117/12.709949
34. A. D. Wiles and T. M. Peters, “Improved statistical TRE model when using a reference frame,” Lect. Notes Comput. Sci. 4791, 442–449 (2007).
http://dx.doi.org/10.1007/978-3-540-75757-3
35. N. L. Dorward, O. Alberti, J. D. Palmer, N. D. Kitchen, and D. G. Thomas, “Accuracy of true frameless stereotaxy: In vivo measurement and laboratory phantom studies. Technical note,” J. Neurosurg. 90, 160–168 (1999).
http://dx.doi.org/10.3171/jns.1999.90.1.0160
36. S. Schmerber and F. Chassat, “Accuracy evaluation of a CAS system: Laboratory protocol and results with 6D localizers, and clinical experiences in otorhinolaryngology,” Comput. Aided Surg. 6, 1–13 (2001).
http://dx.doi.org/10.3109/10929080109145988
37. R. F. Labadie et al., “In vitro assessment of image-guided otologic surgery: Submillimeter accuracy within the region of the temporal bone,” Otolaryngol.-Head Neck Surg. 132, 435–442 (2005).
http://dx.doi.org/10.1016/j.otohns.2004.09.141
38. M. Vogele, W. Freysinger, R. Bale, A. R. Gunkel, and W. F. Thumfart, “Use of the ISG viewing wand on the temporal bone. A model study,” HNO 45, 74–80 (1997).
http://dx.doi.org/10.1007/s001060050092
39. F. Kral, H. Riechelmann, and W. Freysinger, “Navigated surgery at the lateral skull base and registration and preoperative imagery,” Arch. Otolaryngol. 137, 144–150 (2011).
http://dx.doi.org/10.1001/archoto.2010.249
40. F. D. Vrionis, K. T. Foley, J. H. Robertson, and J. J. Shea, “Use of cranial surface anatomic fiducials for interactive image-guided navigation in the temporal bone: A cadaveric study,” Neurosurgery 40, 755–763 (1997).
http://dx.doi.org/10.1097/00006123-199704000-00019
41. B. Ma, M. H. Moghari, R. E. Ellis, and P. Abolmaesumi, “Estimation of optimal fiducial target registration error in the presence of heteroscedastic noise,” IEEE Trans. Med. Imaging 29, 708–723 (2010).
http://dx.doi.org/10.1109/TMI.2009.2034296
42. M. H. Moghari and P. Abolmaesumi, “Distribution of fiducial registration error in rigid-body point-based registration,” IEEE Trans. Med. Imaging 28, 1791–1801 (2009).
http://dx.doi.org/10.1109/TMI.2009.2024208
43. M. H. Moghari and P. Abolmaesumi, “Distribution of target registration error for anisotropic and inhomogeneous fiducial localization error,” IEEE Trans. Med. Imaging 28, 799–813 (2009).
http://dx.doi.org/10.1109/TMI.2009.2020751
44. F. Schwarm, Ö. Güler, F. Kral, G. M. Diakov, A. Reka, and W. Freysinger, “Characterization of Open4Dnav, an IGSTK-based 3D-navigation system for FESS,” Int. J. Comput. Assist. Radiol. Surg. 3, S248–S249 (2008).
http://dx.doi.org/10.1007/s11548-008-0203-z
45. M. Bickel, Ö. Güler, F. Kral, and W. Freysinger, “Exploring the validity of predicted TRE in navigation,” Proc. SPIE 7625, 261–265 (2010).
http://dx.doi.org/10.1117/12.843656
46. M. Bickel, Ö. Güler, F. Kral, and W. Freysinger, “Evaluation of the application accuracy of 3D-navigation through measurements and prediction,” IFMBE Proc. 25/VI, 349–351 (2009).
http://dx.doi.org/10.1007/978-3-642-03906-5
47. L. Ibanez et al., IGSTK: Image-Guided Surgery Toolkit, Release 4.2, edited by K. Cleary, P. Cheng, A. Enquobahrie, and Z. Yaniv (Insight Software Consortium, www.isc.org, Gaithersburg, MD, 2009), see www.igstk.org.
48. Ö. Güler, Z. Yaniv, K. Cleary, and W. Freysinger, “New video component for the image guided surgery toolkit IGSTK,” IFMBE Proc. 25/VI, 359–390 (2009).
http://dx.doi.org/10.1007/978-3-642-03895-2
49. B. K. P. Horn, “Closed-form solution of absolute orientation using unit quaternions,” J. Opt. Soc. Am. A 4, 629–642 (1987).
http://dx.doi.org/10.1364/JOSAA.4.000629
50. F. Chassat and S. Lavallee, “Experimental protocol of accuracy evaluation of 6D localizers for computer-integrated surgery: Application to four optical localizers,” Lect. Notes Comput. Sci. 1496, 277–284 (1998).
51. N. C. Atuegwu and R. L. Galloway, “Volumetric characterization of the aurora magnetic tracker system for image-guided transorbital endoscopic procedures,” Phys. Med. Biol. 53, 4355–4368 (2008).
http://dx.doi.org/10.1088/0031-9155/53/16/009
52. VDI-VDE and GMA, “Optical 3D measuring systems. Imaging systems with point-by-point probing,” VDI/VDE Handbook Measuring Technology II 2634, 1–10 (VDI, Düsseldorf, Germany, 2002).
53. A. D. Wiles, D. G. Thompson, and D. D. Frantz, “Accuracy assessment and interpretation for optical tracking systems,” Proc. SPIE 5367, 1–12 (2004).
http://dx.doi.org/10.1117/12.536128
54. R. Elfring, M. de la Fuente, and K. Radermacher, “Assessment of optical localizer accuracy for computer aided surgery systems,” Comput. Aided Surg. 15, 1–12 (2010).
http://dx.doi.org/10.3109/10929081003647239
55. Ö. Güler, Z. R. Bardosi, M. Ertugrul, M. Di Franco, and W. Freysinger, “Extending the tracking device support in the image-guided surgery toolkit (IGSTK): CamBar B2, EasyTrack 500, and Active Polaris,” Insight Journal (Kitware, Inc., 2011), available at http://www.insight-journal.org/?journal=31, http://hdl.handle.net/10380/3288.
56. A. R. Gunkel, M. Vogele, A. Martin, R. J. Bale, W. F. Thumfart, and W. Freysinger, “Computer-aided surgery in the petrous bone,” Laryngoscope 109, 1793–1799 (1999).
http://dx.doi.org/10.1097/00005537-199911000-00013
57. W. Freysinger et al., “Computer assisted interstitial brachytherapy,” Lect. Notes Comput. Sci. 1496, 352–357 (1998).
58. J. B. West and C. R. Maurer, Jr., “Designing optically tracked instruments for image-guided surgery,” IEEE Trans. Med. Imaging 23, 533–545 (2004).
http://dx.doi.org/10.1109/TMI.2004.825614
59. B. Ma, M. H. Moghari, R. E. Ellis, and P. Abolmaesumi, “On fiducial target registration error in the presence of anisotropic noise,” Lect. Notes Comput. Sci. 4792, 628–635 (2007).
http://dx.doi.org/10.1007/978-3-540-75759-7
60. A. L. Simpson, B. Ma, R. E. Ellis, A. J. Stewart, and M. J. Miga, “Uncertainty propagation and analysis of image-guided surgery,” Proc. SPIE 7964, 79640H (2011).
http://dx.doi.org/10.1117/12.878774
61. X. Pennec and J.-P. Thirion, “A framework for uncertainty and validation of 3D registration methods based on points and frames,” Int. J. Comput. Vis. 25, 203–229 (1997).
http://dx.doi.org/10.1023/A:1007976002485
62. R. Khadem et al., “Comparative tracking error analysis of five different optical tracking systems,” Comput. Aided Surg. 5, 98–107 (2000).
http://dx.doi.org/10.3109/10929080009148876
63. D. D. Frantz, S. R. Kirsch, and A. D. Wiles, “Specifying 3D tracking system accuracy,” in Bildverarbeitung für die Medizin (BVM) 2004, edited by T. Tolxdorff, J. Braun, H. Handels, A. Horsch, and H.-P. Meinzer (Springer, Berlin, Germany, 2004), pp. 234–238.
64. A. Danilchenko and J. M. Fitzpatrick, “General approach to first-order error prediction in rigid point registration,” IEEE Trans. Med. Imaging 30, 679–693 (2011).
http://dx.doi.org/10.1109/TMI.2010.2091513
65. C. R. Maurer, Jr., J. M. Fitzpatrick, M. Y. Wang, R. L. Galloway, Jr., and R. J. Maciunas, “Registration of head volume images using implantable fiducial markers,” IEEE Trans. Med. Imaging 16, 447–462 (1997).
http://dx.doi.org/10.1109/42.611354
66. R. R. Shamir, L. Joskowicz, and Y. Shoshan, “Fiducial optimization for minimal target registration error in image-guided neurosurgery,” IEEE Trans. Med. Imaging 31, 725–737 (2012).
http://dx.doi.org/10.1109/TMI.2011.2175939
67. R. J. Bale, A. Martin, M. Vogele, W. Freysinger, P. Springer, and S. M. Giacomuzzi, “The VBH mouthpiece – A registration device for frameless stereotaxic surgery,” Radiology 205(P), 403 (1997).
Figures

FIG. 1.

Setup used for the experiments. The objects rest on a standard operating table (Brumaba, Germany) on a wood-plexiglass combination that holds the hydraulic immobilization arms. The volunteer rested in a comfortable position directly on the operating table, to which he was immobilized with a tape running across the forehead. The active NDI Polaris camera (1) is placed at the optimal working distance from the object. The navigation system's monitor (2) and the tracker control unit (3) are placed opposite the surgeon. The probe used for all experiments (4) is lying on the table. In the example shown, the anatomic specimen (5) is held by two hydraulic arms and the patient tracker (an NDI rigid body, 6) is held separately. A rigid mechanical setup could thus be achieved.

FIG. 2.

Setup with an overlay of the optimal working zone. The active Polaris tracker was placed 1400 mm away from the zone of maximum precision, a silo-type volume composed of a cylinder 1000 mm in height and diameter, capped by a hemisphere of 500 mm radius. All numbers in the figures are given in millimeters. Object placement within the ideal measurement zone was verified with a custom application that centers the patient tracker and the tracked probe, as seen by the tracker, within this manufacturer-specified volume. The dot at the camera's center marks the origin of the camera coordinate system. The coordinate axes are shown at the lower right.

FIG. 3.

Schematic drawing of the positions selected for the DRF-DRF tracker calibration within the measurement volume. Dark dots mark positions selected on the outer border; bright dots mark positions on the border of the optimal measurement volume.

FIG. 4.

Photograph of the DRF-DRF assembly used for tracker calibration. The dark structure protruding downwards is a carbon holder. The origins of the DRF coordinate systems are marked at the centers of the LEDs in the fourth quadrant of each DRF; distances were obtained with a microcaliper; all dimensions are given in millimeters.

FIG. 5.

Active tracked probe used for the experiments, front and top view, with dimensions in millimeters. The origin is located at the most distal LED (the crossed circle).

FIG. 6.

(a) Plastic skull with landmarks used for registration (crosses) and targets (circles) on which the system accuracy was tested. The skull is mounted on a base plate that provides mechanical immobilization based on the VBH headholder's mouthpiece (Ref. 67). (b) Anatomic specimen with landmarks used for registration (crosses) and targets (circles) on which the system accuracy was tested. The specimen was cut to allow access to various anatomical structures. (c) Volunteer (3D model) with landmarks used for registration (crosses) and targets (circles) on which the system accuracy was tested; surface reconstruction of the CT data, thresholded to skin.

FIG. 7.

(a) Definitions of the coordinate systems for the experiments: image, DRF, and tracker. (b) The probe coordinate system associated with the navigated probe.

FIG. 8.

Correlation plots of TLE with TRE from the experimental data for (a) plastic skull, (b) anatomic specimen, and (c) volunteer.

FIG. 9.

Experimental TTEs for the three specimens, right column, and predictions of the TTE with the isotropic model (Ref. 25 ), left column.

FIG. 10.

Experimental TTEs for the three specimens, right column, and predictions of the TTE with the anisotropic model (Ref. 27 ), left column.

FIG. 11.

Experimental TTEs for the three specimens, right column, and predictions of the TRE with the isotropic model (Ref. 25 ), left column.

FIG. 12.

Experimental TTEs for the three specimens, right column, and predictions of the TRE with the anisotropic model (Ref. 27 ), left column.

Tables

TABLE I.

Test for normal distribution of the measurements for all measured objects, fiducials, and targets used in the experiments. Testing was done with the Shapiro-Wilk test for α = 0.05. Deviations from a normal distribution of the data are given in image coordinates.

TABLE II.

Measured values for ⟨FLE²_image⟩, ⟨FLE²_tracker⟩, ⟨FLE²_probe_calib⟩, and ⟨TFLE²⟩ from the experiments with nine registration points for the plastic skull, anatomic specimen, and volunteer. ⟨FLE²_tracker⟩ and ⟨FLE²_probe_calib⟩ were obtained from the tracker measurements and the probe calibration, respectively. All values are given in mm².

TABLE III.

The measured covariance matrices of a DRF, the probe-DRF assembly, and the probe-tip-DRF calibration measurements, as required for the calculation of Σ_TRE.

TABLE IV.

Covariance matrices for defining features in image and tracker space, respectively, with the computer mouse and the tracked probe, for the plastic skull and the anatomic specimen from the experimental data.

TABLE V.

⟨ULE²⟩ from Eqs. (13) and (14) for all sets of registration points (3, 5, 7, and 9) and objects (anatomic specimen, plastic skull, and volunteer) studied; the average over all fiducials is given. All values are given in mm².


Abstract

Purpose:

The benefit of computer-assisted navigation depends on the registration process, in which patient features are correlated with preoperative imagery. The operator-induced uncertainty in localizing patient features, the user localization error (ULE), is unknown and most likely dominates the application accuracy. This initial feasibility study aims at providing first data for the ULE with a research navigation system.

Methods:

Active optical navigation was performed in CT images of a plastic skull, an anatomic specimen (both with implanted fiducials), and a volunteer with anatomical landmarks exclusively. Each object was registered ten times with 3, 5, 7, and 9 registration points. Measurements were taken at 10 targets (anatomic specimen and volunteer) and 11 targets (plastic skull). The active NDI Polaris system was used under ideal working conditions (tracking accuracy 0.23 mm root-mean-square, RMS; probe tip calibration 0.18 mm RMS). The variances of tracking along the principal directions were measured as 0.18 mm², 0.32 mm², and 0.42 mm². The ULE was calculated from the predicted application accuracy with isotropic and anisotropic models and from the experimental variances, respectively.
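The point-based rigid registration underlying these measurements (matched fiducial pairs fitted by least squares, as in Refs. 5 and 49) can be sketched as follows. This is an illustrative NumPy implementation of the SVD method under the paper's setup, not the navigation system's actual code; the function names are hypothetical.

```python
import numpy as np

def rigid_register(fiducials_img, fiducials_trk):
    """Least-squares rigid transform mapping image-space fiducials to
    tracker-space fiducials (SVD method; illustrative sketch)."""
    # Center both point sets on their centroids.
    c_img = fiducials_img.mean(axis=0)
    c_trk = fiducials_trk.mean(axis=0)
    A = fiducials_img - c_img
    B = fiducials_trk - c_trk
    # SVD of the cross-covariance matrix yields the optimal rotation.
    U, _, Vt = np.linalg.svd(A.T @ B)
    # Guard against a reflection (det = -1) solution.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U))])
    R = Vt.T @ D @ U.T
    t = c_trk - R @ c_img
    return R, t

def fre_rms(R, t, fiducials_img, fiducials_trk):
    """Root-mean-square fiducial registration error of the fit."""
    residuals = fiducials_trk - (fiducials_img @ R.T + t)
    return np.sqrt((residuals ** 2).sum(axis=1).mean())
```

Note that this least-squares fit weights all fiducials equally, i.e., it assumes isotropic localization noise; this is exactly the assumption the study's discussion questions for anisotropic optical tracking.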

Results:

The ULE was determined from the variances as 0.45 mm (plastic skull), 0.60 mm (anatomic specimen), and 4.96 mm (volunteer). The predicted application accuracy did not yield consistent values for the ULE.

Conclusions:

Quantitative data on application accuracy could be tested against prediction models with isotropic and anisotropic noise models and revealed some discrepancies. This could potentially be due to the fact that navigation and one prediction model wrongly assume isotropic noise (tracking is anisotropic), while the anisotropic-noise prediction model assumes an anisotropic registration strategy (registration is isotropic in typical navigation systems). The ULE data are presumably the first quantitative values for the precision of localizing anatomical landmarks and implanted fiducials. Submillimetric localization is possible for implanted screws; anatomic landmarks are not suitable for high-precision clinical navigation.
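For reference, the isotropic prediction model cited here (Ref. 25) estimates the expected squared target registration error at a target position r as follows (sketched from the standard formulation: N is the number of fiducials, d_k the distance of the target from principal axis k of the fiducial configuration, and f_k the RMS distance of the fiducials from that axis):

```latex
\left\langle \mathrm{TRE}^2(\mathbf{r}) \right\rangle \approx
\frac{\left\langle \mathrm{FLE}^2 \right\rangle}{N}
\left( 1 + \frac{1}{3} \sum_{k=1}^{3} \frac{d_k^2}{f_k^2} \right)
```

Because ⟨FLE²⟩ enters as a single scalar, this model cannot represent the anisotropic tracking variances reported above, which is one candidate explanation for the observed discrepancies.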

DOI: 10.1118/1.4773871