- Fryderyk Kögl (BWH, TUM)
- Harneet Cheema (BWH, UOttawa)
- Tina Kapur (BWH)
- Simon Drouin (ETS)
- Andrey Titov (ETS)
- Steve Pieper (Isomics)
- Tamas Ungi (Queen's University)
- Sandy Wells (BWH)
Corresponding landmarks between MR and ultrasound images acquired during neurosurgery are valuable for (a) validating registration algorithms, (b) training supervised registration algorithms, and (c) initializing a registration. In this project we aim to create a tool that simplifies the process of finding such landmarks.
- Objective A. Create a UI that provides new functionality and gathers existing functionality in one place to facilitate landmarking
- Objective B. Investigate the rendering infrastructure that would facilitate the adjustment of landmark positions in the 3D view of Slicer (see the sketch after this list)
- We use an iterative process for creating the UI: the users give feedback to the developers, who continuously update the UI accordingly
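As context for Objective B, the sketch below shows a few markups display settings, entered in Slicer's Python console, that influence how landmark control points are rendered and picked in the 3D view. The node name "MR_landmarks" is a placeholder, and the exact methods available may vary between Slicer versions.

```python
import slicer

# Placeholder name for an existing markups fiducial list of landmarks
landmarkNode = slicer.util.getNode("MR_landmarks")
displayNode = landmarkNode.GetDisplayNode()

# Render control points even when they are occluded by other geometry,
# which makes them easier to see and drag in the 3D view
displayNode.SetOccludedVisibility(True)
displayNode.SetOccludedOpacity(0.5)

# Larger glyphs are easier to pick with the mouse in the 3D view
displayNode.SetGlyphScale(3.0)
```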
Progress
- The extension is ready. It can be found on the main branch of the repository. A screenshot can be seen below in Illustrations. For more details, refer to the README.
- Numerous bug fixes
- More intuitive control of active views
- More fine-grained control of viewing options
- Corresponding landmarks are automatically joined with curves to visualise brain shift; this also serves as a sanity check, since the curves should be more or less smooth (see the sketch below)
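One way such connecting curves could be created from Slicer's Python console is sketched below; the node names "MR_landmarks" and "US_landmarks" are placeholders for the two fiducial lists holding corresponding landmarks, and the extension's actual implementation may differ.

```python
import slicer
import vtk

# Placeholder names for the two fiducial lists with corresponding landmarks
mrLandmarks = slicer.util.getNode("MR_landmarks")
usLandmarks = slicer.util.getNode("US_landmarks")

numPairs = min(mrLandmarks.GetNumberOfControlPoints(),
               usLandmarks.GetNumberOfControlPoints())

for i in range(numPairs):
    mrPos = [0.0, 0.0, 0.0]
    usPos = [0.0, 0.0, 0.0]
    mrLandmarks.GetNthControlPointPositionWorld(i, mrPos)
    usLandmarks.GetNthControlPointPositionWorld(i, usPos)

    # One curve per landmark pair; its length and direction show the local shift
    curveNode = slicer.mrmlScene.AddNewNodeByClass(
        "vtkMRMLMarkupsCurveNode", "shift_%d" % i)
    curveNode.AddControlPoint(vtk.vtkVector3d(*mrPos))
    curveNode.AddControlPoint(vtk.vtkVector3d(*usPos))
```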
Next Steps
- Fulfill all formal requirements for a pull request
- Search for bugs/corner cases
- Submit the extension to the Slicer ExtensionsIndex
Next Steps (outside the scope of this project week)
- Add volume rendering (see the sketch after this list)
- Automatically detect candidate landmarks (e.g. 3D SIFT features) and let the user manually select the best ones
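For the volume rendering item, a minimal sketch using the volume rendering module logic in Slicer's Python console is shown below; the volume node name and the preset choice are assumptions.

```python
import slicer

# Placeholder name for the MR volume to be rendered
volumeNode = slicer.util.getNode("MR_volume")

volRenLogic = slicer.modules.volumerendering.logic()
displayNode = volRenLogic.CreateDefaultVolumeRenderingNodes(volumeNode)
displayNode.SetVisibility(True)

# Optionally apply a built-in transfer-function preset, e.g. "MR-Default"
preset = volRenLogic.GetPresetByName("MR-Default")
displayNode.GetVolumePropertyNode().Copy(preset)
```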
Illustrations
Current state of the extension
Landmark flow
Example landmarks