Many image registration algorithms rely on homologous control points identified on the two input image sets to be registered. In practice, the interactive identification of corresponding control points on both images is tedious, difficult, and often a source of error. We propose a two-step algorithm that automatically identifies homologous regions and uses them as a priori information during the image registration procedure. First, a number of small control volumes containing distinct anatomical features are selected on the model image in a somewhat arbitrary fashion. Instead of finding their correspondences in the reference image through user interaction, the proposed method maps each control region to the corresponding part of the reference image by using an automated image registration algorithm. A normalized cross-correlation (NCC) function or mutual information was used as the auto-mapping metric, and a limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) algorithm was employed to optimize the metric and find the optimal mapping. For rigid registration, the transformation parameters of the system are obtained by averaging those derived from the individual control volumes. In the deformable calculation, the mapped control volumes are treated as nodes, or control points, with known positions on the two images. If the number of control volumes is insufficient to cover the whole image to be registered, additional nodes are placed on the model image and then located on the reference image in a manner similar to the conventional B-spline deformable calculation. For deformable registration, the correspondence established by the auto-mapped control volumes provides valuable guidance for the registration calculation and greatly reduces the dimensionality of the problem.
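The auto-mapping step described above can be sketched in a few lines. The following is a minimal 2-D, translation-only illustration, not the authors' implementation: the function names (`ncc`, `map_control_volume`), the window convention, and the use of SciPy's L-BFGS-B optimizer with numerical gradients are all assumptions made for the sake of a runnable example; the paper's method additionally supports mutual information as the metric and operates on 3-D volumes.

```python
import numpy as np
from scipy.ndimage import shift as nd_shift
from scipy.optimize import minimize


def ncc(a, b):
    """Normalized cross-correlation of two equal-size arrays (1 = identical)."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0


def map_control_volume(control, reference, window, start=(0.0, 0.0)):
    """Map one control volume from the model image into the reference image.

    `control`   : the control region extracted from the model image
    `reference` : the full reference image
    `window`    : [(row0, row1), (col0, col1)] giving where the control
                  region sits before mapping (hypothetical convention)
    Searches for the translation t that maximizes NCC between the control
    region and the correspondingly shifted reference patch, by minimizing
    -NCC with L-BFGS-B (gradients approximated by finite differences).
    Returns (optimal translation, NCC value at the optimum).
    """
    sl = tuple(slice(lo, hi) for lo, hi in window)

    def cost(t):
        # Cubic interpolation keeps the metric smooth for the optimizer.
        moved = nd_shift(reference, t, order=3, mode="nearest")
        return -ncc(control, moved[sl])

    res = minimize(cost, np.asarray(start, dtype=float), method="L-BFGS-B")
    return res.x, -res.fun
```

For rigid registration, this routine would be run once per control volume and the recovered transformation parameters averaged; for the deformable case, the mapped positions would instead seed the B-spline control-point grid.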
The performance of the two-step registration was evaluated on three rigid registration cases (two PET-CT registrations and a brain MRI-CT registration) and one deformable registration of the inhale and exhale phases of a lung 4D CT. Algorithm convergence was confirmed by starting the registration calculations from a large number of initial transformation parameters. An accuracy of was achieved for both deformable and rigid registration. The proposed image registration method greatly reduces the complexity involved in determining homologous control points and minimizes the subjectivity and uncertainty associated with the current manual interactive approach. Patient studies have indicated that the two-step registration technique is fast and reliable, and provides a valuable tool to facilitate both rigid and nonrigid image registrations.
We would like to thank Dr. A. Koong, Dr. A. Quon, Dr. B. Thorndyke, Dr. T. Li, Dr. Y. Yang, Dr. D. Levy, and Dr. D. Paquin for useful discussions. This work is supported in part by research grants from the U.S. Department of Defense (DAMD17-03-1-0023), the Susan G. Komen Breast Cancer Foundation (BCTR0504071), and the National Cancer Institute (5 R01 CA98523-01).
II. METHODS AND MATERIALS
II.A. Software platform
II.B. Selection of control regions on the model image
II.C. Mapping of control regions from the model image to reference image
II.D. Optimization of the NCC function
II.E. Rigid and B-spline deformable registration with incorporation of the mapped control volume information
II.F. Search space characteristics and convergence analysis
II.G. Case studies
III.A. Study 1: Rigid registration of CT and FLT-PET images
III.B. Study 2: Rigid registration of CT and MRI images for a brain case
III.C. Study 3: Deformable registration of exhale and inhale CT images for a thorax patient