
Ultrasound Painting of Vascular Tree

Åsmund Birkeland, Ivan Viola

CONFERENCE PAPER: In Proceedings of Vision, Modeling, and Visualization (VMV 2010), pp. 163–170, 2010.

Abstract

In treatment planning and surgical interventions, physicians and surgeons need information about the spatial extent of specific features and the surrounding structures. Previous techniques for extracting features, based on magnetic resonance imaging and computed tomography scans, can be slow and cumbersome and are rarely used by doctors. In this paper we present a novel approach to extracting features from tracked 2D ultrasound, in particular hypo-echoic regions such as blood vessels. Features are extracted during live examinations, removing the need for slow and cumbersome post-scan processing, and interaction is based on the natural techniques doctors already use during an examination. The ultrasound probe is utilized as a 3D brush, painting features in a 3D environment. The painting occurs during a regular examination, requiring little extra interaction from the doctor. We introduce a novel approach to extract hypo-echoic regions from an ultrasound image and track the regions from frame to frame. 3D models are then generated by storing the outline of each region as a 3D point cloud. By automatically detecting branching, the technique can handle complex structures, such as liver vessel trees, and track multiple regions simultaneously. During the examination, the point cloud is triangulated in real time, enabling the doctor to examine the results live and discard areas that are unsatisfactory. To enable modifications of the extracted 3D models, we show how the ultrasound probe can be used as an interaction tool for fast point cloud editing.
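The abstract describes a frame-by-frame pipeline: segment hypo-echoic (dark) regions in each tracked 2D ultrasound frame, take the region outline, and lift the 2D outline points into 3D using the tracked probe pose, so that sweeping the probe "paints" a point cloud of the vessel. The sketch below is not the authors' implementation; it is a minimal Python illustration under stated assumptions: the names hypoechoic_mask, outline_points and to_world are invented, a plain intensity threshold stands in for the paper's region detection and tracking, and the toy example uses a fixed pixel spacing and an identity probe pose.

import numpy as np


def hypoechoic_mask(frame, threshold=30):
    """Binary mask of hypo-echoic (low-intensity) pixels in an 8-bit frame."""
    return frame < threshold


def outline_points(mask):
    """Pixel coordinates (x, y) on the mask boundary, via a 4-neighbour test."""
    m = mask.astype(bool)
    interior = m.copy()
    interior[1:, :] &= m[:-1, :]   # up neighbour inside
    interior[:-1, :] &= m[1:, :]   # down neighbour inside
    interior[:, 1:] &= m[:, :-1]   # left neighbour inside
    interior[:, :-1] &= m[:, 1:]   # right neighbour inside
    ys, xs = np.nonzero(m & ~interior)
    return np.stack([xs, ys], axis=1).astype(float)


def to_world(points_2d, probe_pose, pixel_spacing=(0.1, 0.1)):
    """Lift 2D image points (pixels) into 3D world space.

    probe_pose is a 4x4 matrix mapping the image plane (x right, y down,
    z = 0, in millimetres) into tracker/world coordinates.
    """
    sx, sy = pixel_spacing
    plane = np.zeros((len(points_2d), 4))
    plane[:, 0] = points_2d[:, 0] * sx
    plane[:, 1] = points_2d[:, 1] * sy
    plane[:, 3] = 1.0
    return (plane @ probe_pose.T)[:, :3]


if __name__ == "__main__":
    # Toy frame: bright tissue with one dark circular "vessel" cross-section.
    frame = np.full((256, 256), 180, dtype=np.uint8)
    yy, xx = np.mgrid[:256, :256]
    frame[(xx - 128) ** 2 + (yy - 128) ** 2 < 20 ** 2] = 10

    pose = np.eye(4)                      # identity pose for the illustration
    contour = outline_points(hypoechoic_mask(frame))
    cloud = to_world(contour, pose)       # accumulate over frames in practice
    print(cloud.shape)                    # 3D outline points for triangulation

In the paper's setting, the points accumulated over many frames would be triangulated in real time and edited with the probe; branching detection, tracking and those editing steps are omitted from this sketch.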

Published

Proceedings of Vision, Modeling, and Visualization (VMV 2010)

  • Pages: 163–170
  • Location: Siegen, Germany
  • Project: IllustraSound, MedViz, Illustrative Visualization


BibTeX

@inproceedings{birkeland10USpainting,
  title = {Ultrasound Painting of Vascular Tree},
  author = {{\AA}smund Birkeland and Ivan Viola},
  year = {2010},
  booktitle = {Proceedings of Vision, Modeling, and Visualization (VMV 2010)},
  abstract = {In treatment planning and surgical interventions, physicians and
surgeons need information about the spatial extent of specific features and the
surrounding structures. Previous techniques for extracting features, based on
magnetic resonance imaging and computed tomography scans, can be slow and
cumbersome and are rarely used by doctors. In this paper we present a novel
approach to extracting features from tracked 2D ultrasound, in particular
hypo-echoic regions such as blood vessels. Features are extracted during live
examinations, removing the need for slow and cumbersome post-scan processing,
and interaction is based on the natural techniques doctors already use during
an examination. The ultrasound probe is utilized as a 3D brush, painting
features in a 3D environment. The painting occurs during a regular examination,
requiring little extra interaction from the doctor. We introduce a novel
approach to extract hypo-echoic regions from an ultrasound image and track the
regions from frame to frame. 3D models are then generated by storing the
outline of each region as a 3D point cloud. By automatically detecting
branching, the technique can handle complex structures, such as liver vessel
trees, and track multiple regions simultaneously. During the examination, the
point cloud is triangulated in real time, enabling the doctor to examine the
results live and discard areas that are unsatisfactory. To enable modifications
of the extracted 3D models, we show how the ultrasound probe can be used as an
interaction tool for fast point cloud editing.},
  location = {Siegen, Germany},
  pages = {163--170},
}





