Documentation/4.0/Modules/OpenIGTLinkIF



For the latest Slicer documentation, visit the Slicer Read the Docs site.



Introduction and Acknowledgements

This work is supported by NA-MIC, NCIGT, and the Slicer Community, and is partially supported by NIH 1R01CA111288-01A1 "Enabling Technologies for MRI-Guided Prostate Interventions" (PI: Clare Tempany), P01-CA67165 "Image Guided Therapy" (PI: Ferenc Jolesz), and the AIST Intelligent Surgical Instrument Project (PI: Makoto Hashizume, Site-PI: Nobuhiko Hata).
Authors: Junichi Tokuda, Jean-Christophe Fillion-Robin, Nobuhiko Hata
Contact: Junichi Tokuda, tokuda@bwh.harvard.edu


Module Description

The OpenIGTLink Interface module provides network communication with external software and hardware using the OpenIGTLink protocol. It offers the following features:

  • Data import: The module can import position, linear transform, and image data from OpenIGTLink-compliant software into the MRML scene.
  • Data export: The module can export linear transform and image data from the MRML scene to external software (a scripted example follows the figure caption below).
  • Multi-connection: The module can manage multiple OpenIGTLink connections at the same time.
  • Locator visualization: The user can choose one of the linear transforms in the MRML scene and visualize its position and orientation in 3D space.
  • Slice driving: The module can control the volume re-slicing plane based on a linear transform in the MRML scene.
The figure shows an example schematic diagram in which multiple devices communicate with 3D Slicer through the OpenIGTLink Interface. Each connector is assigned to one external device for a TCP/IP connection; the connectors serve as interfaces between the external devices and the MRML scene, converting OpenIGTLink messages to MRML nodes and vice versa.
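The connectors described above can also be set up from the Slicer Python console. The snippet below is a minimal sketch rather than an excerpt from this page: the class and method names (slicer.vtkMRMLIGTLConnectorNode, SetTypeServer, Start, RegisterOutgoingMRMLNode) are assumptions based on the OpenIGTLinkIF MRML API and should be verified against the Slicer version in use.

    import slicer  # run inside the Slicer Python console

    # Create a connector node and run it as a TCP/IP server on the
    # standard OpenIGTLink port (18944).
    connector = slicer.vtkMRMLIGTLConnectorNode()
    slicer.mrmlScene.AddNode(connector)
    connector.SetName('IGTLServer')
    connector.SetTypeServer(18944)
    connector.Start()

    # Data export: register a linear transform node so that updates to it
    # are pushed to connected clients as OpenIGTLink TRANSFORM messages.
    target = slicer.vtkMRMLLinearTransformNode()
    target.SetName('Target')
    slicer.mrmlScene.AddNode(target)
    connector.RegisterOutgoingMRMLNode(target)

Data import works in the opposite direction: when a connected device sends, for example, TRANSFORM or IMAGE messages, the connector creates or updates the corresponding MRML nodes in the scene.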

Supported Devices

See http://www.na-mic.org/Wiki/index.php/OpenIGTLink/List for a list of supported devices.

Use Cases

  • MRI-compatible Robotic Systems (BRP project between BWH, Johns Hopkins University, and Acoustic MedSystems Inc., "Enabling Technologies for MRI-Guided Prostate Interventions")
    • 3D Slicer was connected to an MRI-compatible robot, using OpenIGTLinkIF to send target positions from Slicer to the robot and to send the actual robot position (based on sensor information) from the robot back to Slicer (a wire-level sketch of such a transform exchange follows this list). Slicer was also connected to the MRI scanner: scan plane position and orientation were prescribed in Slicer and transmitted to the scanner to control real-time image acquisition, and the acquired images were transferred from the scanner back into Slicer for display.
  • Neurosurgical Robot Project (Nagoya Institute of Technology, Japan)
    • 3D Slicer was connected to an optical tracking system (Optotrak, Northern Digital Inc.) to acquire the current position of the robot's end-effector.
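For the device side of a use case like the ones above, the sketch below hand-builds a single OpenIGTLink version 1 TRANSFORM message and sends it to a Slicer connector listening on port 18944. It is illustrative only: the 58-byte header layout (version, type name, device name, timestamp, body size, body CRC), the 12-float body, and the CRC-64 parameters (ECMA-182 polynomial, zero initial value) reflect my reading of the OpenIGTLink specification and reference implementation and should be checked against them, and the device name 'ExternalTracker' is a made-up placeholder.

    import socket
    import struct

    def crc64(data):
        # CRC-64 over the message body, assumed to match the OpenIGTLink
        # reference implementation: ECMA-182 polynomial, MSB-first,
        # zero initial value, no final XOR.
        poly = 0x42F0E1EBA9EA3693
        crc = 0
        for byte in data:
            crc ^= byte << 56
            for _ in range(8):
                if crc & (1 << 63):
                    crc = ((crc << 1) ^ poly) & 0xFFFFFFFFFFFFFFFF
                else:
                    crc = (crc << 1) & 0xFFFFFFFFFFFFFFFF
        return crc

    # TRANSFORM body: 12 big-endian float32 values -- the three columns of
    # the 3x3 rotation matrix followed by the translation in millimeters.
    body = struct.pack('>12f',
                       1.0, 0.0, 0.0,     # first rotation column
                       0.0, 1.0, 0.0,     # second rotation column
                       0.0, 0.0, 1.0,     # third rotation column
                       10.0, 20.0, 30.0)  # translation Tx, Ty, Tz

    # 58-byte header: version, type name (12 bytes), device name (20 bytes),
    # timestamp (0 = unspecified), body size, CRC-64 of the body.
    header = struct.pack('>H12s20sQQQ',
                         1, b'TRANSFORM', b'ExternalTracker',
                         0, len(body), crc64(body))

    # Send to a Slicer connector configured as a server on port 18944.
    with socket.create_connection(('localhost', 18944)) as s:
        s.sendall(header + body)

On the Slicer side, the data import feature should turn this into a linear transform node named after the device name ('ExternalTracker' here), which can then drive the locator visualization or slice re-slicing described in the Module Description section.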

Tutorials

Please follow the OpenIGTLinkIF tutorial presentation file (OpenIGTLinkTutorial_Slicer4.1.0_JunichiTokuda_Apr2012.pdf).

Panels and their use

A list of all the panels in the interface, their features, what they mean, and how to use them. The screenshots on this page show the main OpenIGTLinkIF module panel, the Connector Properties panel, and the I/O Configuration Tree interface; a scripted equivalent of the connector settings is sketched below.
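As a rough illustration of what the Connector Properties panel and the I/O Configuration tree control, the following Python console snippet configures a client connector. It is a sketch under the same API assumptions as the earlier example (vtkMRMLIGTLConnectorNode with SetTypeClient, Start, and GetState); the hostname, port, and state values are placeholders to verify for a given setup.

    import slicer

    # Connector Properties equivalent: name the connector, make it a TCP/IP
    # client of an external device, and activate it.
    connector = slicer.vtkMRMLIGTLConnectorNode()
    slicer.mrmlScene.AddNode(connector)
    connector.SetName('TrackerClient')
    connector.SetTypeClient('192.168.0.10', 18944)  # placeholder host and port
    connector.Start()  # corresponds to activating the connector in the GUI

    # The I/O Configuration tree lists the nodes exchanged over each connector:
    # incoming nodes are added to the scene automatically as messages arrive,
    # outgoing nodes are those registered with RegisterOutgoingMRMLNode().
    print(connector.GetState())  # assumed states: 0 off, 1 waiting, 2 connected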

Similar Modules

N/A

References

  1. Tokuda J, Fischer GS, Papademetris X, Yaniv Z, Ibanez L, Cheng P, Liu H, Blevins J, Arata J, Golby A, Kapur T, Pieper S, Burdette EC, Fichtinger G, Tempany CM, Hata N, OpenIGTLink: An Open Network Protocol for Image-Guided Therapy Environment, Int J Med Robot Comput Assist Surg, 2009 (In print)

Information for Developers