Documentation:Nightly:Registration:RegistrationLibrary:RegLib C44


Back to Registration Library

Slicer Registration Library Case 44: Visible Human Pelvis CT


vhm: the main fixed reference image; all images are eventually aligned into this space.
vhf: the moving image, to be registered onto vhm.

Modules used

  Editor, General Registration (BRAINS), Resize Image (BRAINS), Transform Visualizer (Deformation Field Visualizer extension)

This dataset contains CT scans of the Visible Human male and female pelvis. It serves as a test example for exploring non-rigid registration for inter-subject comparison from CT. The overall strategy is to register "vhf" to "vhm" via first affine and then BSpline registration. We will generate a mask to focus the registration on the bone structure only and ignore the soft tissue when computing the deformation. Because the original images are quite large (512x512x150 voxels), we will subsample vhf for use with the Deformation Field Visualizer module, which might otherwise become too memory intensive.
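A back-of-the-envelope estimate shows why the subsampling helps. The sketch below assumes 16-bit ("short") voxels, as used elsewhere in this case, and that the resize step halves each dimension:

```python
# Rough memory estimate for the CT volumes (assumes 16-bit "short"
# voxels; actual usage in Slicer will be higher due to node overhead).
def volume_megabytes(dims, bytes_per_voxel=2):
    """Raw size in MB of a volume with the given (x, y, z) dimensions."""
    n_voxels = 1
    for d in dims:
        n_voxels *= d
    return n_voxels * bytes_per_voxel / (1024 * 1024)

full = volume_megabytes((512, 512, 150))  # original vhf
half = volume_megabytes((256, 256, 75))   # after 2x subsampling per axis

print(f"full: {full:.1f} MB, subsampled: {half:.1f} MB")
# -> full: 75.0 MB, subsampled: 9.4 MB (an 8x reduction)
```

Halving each of the three axes shrinks the raw volume by a factor of eight, which is what makes the deformation-field overlay tractable.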


Why two sets of files? The "input data" mrb includes only the unregistered data, so you can try the method yourself from start to finish. The full dataset also includes intermediate files and results (transforms, resampled images, etc.). If you use the full dataset, we recommend choosing different names for the images/results you create yourself, to distinguish the provided data from the data you generate.

Video Screencasts

  1. Movie/screencast showing generating a registration mask
  2. Movie/screencast showing affine and nonrigid BSpline registration
  3. Movie/screencast showing visualization of the deformation via the Transform Visualizer module


Keywords: CT, pelvis, visible human, inter-subject

Procedure / Pipeline

  1. Mask generation: open the Editor module. Movie/screencast showing this step
    1. "Master Volume": select vhm
    2. A new labelmap "vhm-label" will be created
    3. Select "vhm" to be visible in the slice viewer
    4. Select the Threshold tool from the editor toolbar
    5. Adjust the lower threshold (slider bar) until most of the bone is highlighted, just before speckle noise starts to become included, e.g. somewhere around an intensity value of 80. Leave the upper threshold unchanged at the maximum.
    6. Click Apply
    7. clean the segmentation:
    8. Select the "Identify Islands" editor effect. This will assign a distinct label to each contiguous region that is disconnected from the others. Click "Apply"
    9. you should see the bones of the arms being assigned a different label value & color. We can now delete them with one click:
    10. select the "Change Island" effect. Change the label value to 0 (zero).
    11. in the axial (red) view, left-click within the segmented areas of the arms.
    12. select the "Dilate" effect. Click the "Apply" button 3-4 times until the boundary of the segmentation extends well beyond the bone, including a several-pixel-wide layer of adjacent tissue.
    13. repeat the above for "vhf".
    14. save the two segmentations.
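The Editor steps above (threshold, identify islands, delete an island, dilate) can be sketched in miniature on a toy 2D image. This is a conceptual illustration with plain Python lists, not Slicer code; the Editor performs the same operations on the full 3D CT:

```python
# Toy version of the mask-generation pipeline: threshold -> identify
# islands -> remove one island -> dilate.
from collections import deque

def threshold(img, lower):
    """Mark every pixel >= lower as 1 (bone), else 0."""
    return [[1 if v >= lower else 0 for v in row] for row in img]

def identify_islands(mask):
    """Assign a distinct label (1, 2, ...) to each 4-connected island."""
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    next_label = 0
    for sy in range(h):
        for sx in range(w):
            if mask[sy][sx] and not labels[sy][sx]:
                next_label += 1
                labels[sy][sx] = next_label
                queue = deque([(sy, sx)])
                while queue:
                    y, x = queue.popleft()
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny][nx] and not labels[ny][nx]:
                            labels[ny][nx] = next_label
                            queue.append((ny, nx))
    return labels

def change_island(labels, label, new_value=0):
    """'Change Island': replace one island's label (here: delete it)."""
    return [[new_value if v == label else v for v in row] for row in labels]

def dilate(mask):
    """Grow the nonzero region by one pixel (4-neighborhood)."""
    h, w = len(mask), len(mask[0])
    out = [row[:] for row in mask]
    for y in range(h):
        for x in range(w):
            if not mask[y][x] and any(
                    0 <= ny < h and 0 <= nx < w and mask[ny][nx]
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1))):
                out[y][x] = 1
    return out

# Toy "CT slice": a pelvis-like structure at left, an arm bone at right.
ct = [[100, 90,  0,  0, 120],
      [ 95,  0,  0,  0, 110],
      [  0,  0,  0,  0,   0]]
mask = threshold(ct, lower=80)
islands = identify_islands(mask)        # pelvis -> label 1, arm -> label 2
kept = change_island(islands, label=2)  # delete the arm island
mask = dilate([[1 if v else 0 for v in row] for row in kept])
```

The final `mask` is the thresholded bone with the arm removed and a one-pixel margin of adjacent "tissue" included, mirroring the dilated labelmap the registration will use as an ROI.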
  2. Affine Registration: open the General Registration (BRAINS) module. Movie/screencast showing this step
    1. Fixed Image Volume: vhm
    2. Moving Volume: vhf
    3. check the boxes for Include Rigid registration phase, Include ScaleVersor3D registration phase, and Include Affine registration phase
    4. Slicer Linear Transform: select "create new transform", rename to "Xf1_Affine" or similar
    5. leave the rest at defaults. Click Apply
    6. registration should take ~10 seconds.
    7. use the fade slider to verify alignment; compare with the result snapshots shown below. Alignment will not be perfect, but it should be better than before.
    8. note: you can also change the colormaps for the fixed and moving volumes to better judge the alignment: go to the Volumes module and in the Display tab, select "green" and "magenta" as the respective colormaps for the two volumes (vhf, vhm)
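The same affine registration can also be scripted from Slicer's Python console. The parameter names below follow the BRAINSFit CLI and correspond to the GUI checkboxes above; verify them against your Slicer version's module documentation, and note that `slicer.cli.run` expects volume/transform nodes (node names are shown here only for readability):

```python
# Sketch of the affine stage as a BRAINSFit parameter set. The string
# values stand in for the corresponding Slicer MRML nodes.
affine_params = {
    "fixedVolume": "vhm",
    "movingVolume": "vhf",
    "useRigid": True,          # "Include Rigid registration phase"
    "useScaleVersor3D": True,  # "Include ScaleVersor3D registration phase"
    "useAffine": True,         # "Include Affine registration phase"
    "linearTransform": "Xf1_Affine",
}

# Inside Slicer (not runnable as a standalone script):
# slicer.cli.run(slicer.modules.brainsfit, None, affine_params,
#                wait_for_completion=True)
```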
  3. Nonrigid Registration (masked): open the General Registration (BRAINS). Movie/screencast showing this step
    1. Fixed Image Volume: vhm Moving: vhf
    2. Registration phases: from "Initialize with previously generated transform", select the "Xf1_Affine" node created before.
    3. Registration phases: uncheck boxes for rigid, scale and affine and check box for BSpline
    4. Output: Slicer Linear transform: set to None
    5. Output: Slicer BSpline transform: create new, rename to "Xf2_BSpline_msk" or similar
    6. Output Image Volume: create new, rename to "vhf_Xf2"; Pixel Type: "short"
    7. Registration Parameters: increase Number Of Samples to 200,000; Number of Grid Subdivisions: 7,7,7
    8. Control Of Mask Processing Tab: check ROI box, for Input Fixed Mask and Input Moving Mask select the two dilated labelmaps from above
    9. Leave all other settings at default
    10. click apply
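A rough way to see why this stage needs many more samples than the affine stage is to count its degrees of freedom. The sketch below assumes the ITK convention that a cubic BSpline grid of mesh size n has n + 3 control points per axis (3 being the spline order); treat it as an estimate, not a statement about BRAINSFit internals:

```python
# Degrees-of-freedom estimate for a 3D cubic BSpline transform with a
# given number of grid subdivisions per axis (ITK convention assumed:
# control points per axis = subdivisions + spline order).
def bspline_dof(grid_subdivisions, spline_order=3, spatial_dims=3):
    n_points = 1
    for n in grid_subdivisions:
        n_points *= n + spline_order
    return n_points * spatial_dims  # one 3D displacement per control point

print(bspline_dof((7, 7, 7)))  # -> 3000, vs. 12 for an affine transform
```

With thousands of parameters to estimate instead of twelve, raising Number Of Samples to 200,000 gives the optimizer far more intensity evidence per parameter.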
  4. Deformation Visualization: Movie/screencast showing this step
    1. if you have not yet installed the Deformation Field Visualizer extension, see here for a movie clip on how to install it.
    2. we first generate a smaller version of our image to save memory:
      1. open the Resize Image (BRAINS) module (under Registration)
      2. Image To Warp: select "vhf"
      3. Output Image: create new, rename to "vhf_small" or similar
      4. Pixel Type: select "short"
      5. Scale Factor: leave at 2.0
      6. click Apply
    3. open the Transform Visualizer module (under: All Modules)
      1. Deformation: select "Xf2_BSpline_msk" (the BSpline transform created above)
      2. Reference Image: select "vhf_small" generated above
      3. Visualization mode: click on "Grid Slice"
      4. Grid Slice Options: Slice: red; Spacing: 20mm
      5. Click Apply. you should see the deformation field overlay. Adjust slice, spacing etc. to taste.
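What the "Grid Slice" mode shows can be illustrated in miniature: sample points along a regular grid of lines, then displace each point by the deformation at that location. The displacement below is a made-up smooth bump, not the actual Xf2_BSpline_msk transform:

```python
# Toy grid-slice visualization: warp a regular grid by a synthetic
# smooth displacement field (stand-in for the BSpline deformation).
import math

def grid_points(width, height, spacing):
    """Sample points along horizontal and vertical grid lines."""
    pts = []
    for y in range(0, height, spacing):      # horizontal lines
        pts += [(x, y) for x in range(width)]
    for x in range(0, width, spacing):       # vertical lines
        pts += [(x, y) for y in range(height)]
    return pts

def toy_displacement(x, y, width, height, amplitude=3.0):
    """Smooth bump, largest at the slice center, zero at the edges."""
    dx = amplitude * math.sin(math.pi * x / width)
    dy = amplitude * math.sin(math.pi * y / height)
    return dx, dy

warped = []
for x, y in grid_points(100, 100, spacing=20):
    dx, dy = toy_displacement(x, y, 100, 100)
    warped.append((x + dx, y + dy))
```

Plotting `warped` would show grid lines bending where the field is strongest, which is exactly how the grid overlay in the module reveals where and how strongly the registration deformed vhf.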

Registration Results

original (unregistered)
registered (affine)
registered (nonrigid without masking)
registered (nonrigid + masking)
deformation only of registered vhf
deformation visualized by grid image overlay


Original CT from the Visible Human Project shared by the University of Iowa.