Volume Haptics Toolkit

From H3D.org




The Volume Haptics Toolkit, VHTK for short, was developed during a research project aimed at bringing haptics into volume data exploration interfaces and the volume data understanding process. During this project the algorithms needed for both simple and effective volume haptics were designed, and the primary interface of VHTK took shape. The toolkit extends the H3D API with the scene-graph nodes necessary for loading volumetric data, for handling and processing the data, and for using the data to produce both visual and haptic feedback.

While developed primarily for haptic scientific visualization, the toolkit has matured to be useful also for other applications in which the haptic or visual feedback does not come from explicit geometric surfaces. Examples include simple tissue simulation (without dynamics) in needle insertion or biopsy, where volumetric data can specify material properties, and guidance fields that assist a user in performing a certain task.

VHTK has been designed from the ground up to make full use of the features provided by the H3D API. Thus, the toolkit can be used at all three design levels of H3D API: structural design using X3D, interactive dynamics using Python scripting, and low-level setup and extensions in C++. The event-handling system is also highly integrated into the functionality of VHTK. A change in a field controlling, for example, a filter propagates an event to every node involved, forming a conceptual multi-modal data pipeline similar to those of the Visualization Toolkit (VTK) and AVS/Express.


The first version of VHTK was released in 2003. At that time the core technology for calculating the volume-haptic force feedback was yielding constraints, only a few visualization nodes were available, and the data loading and filtering capabilities were basic. With that haptics technology only one haptic mode could be used at a time. In 2004 the core was replaced by the haptic-primitives technology, which allows several modes to be used simultaneously. At the same time the number of haptic modes supported by the toolkit increased. The toolkit was ported to H3D API in 2005, and the first version, VHTK 1.0 for H3D API, was released at the end of that year.

Specific Features

  • Multi-level programming using X3D for structure definition, Python scripting for simple behaviour and setup, and C++ for extending the functionality.
  • Volume haptics based on haptic primitives, which allows for a free combination of both active and passive haptic modes.
  • Eight predefined haptic modes for haptic rendering of features in scalar and vector data.
  • A range of volume visualization methods including direct volume rendering (DVR), stream-tubes, hedge-hogs and iso-surfaces.
  • Highly configurable visualization pipeline, both for haptics and graphics.
  • A wide range of transfer functions, both for specifying haptic material properties and colours.
  • Volume data processing, both rasterized (regular grid) volumes and continuous data fields.
  • Fast local reconstruction of iso-surfaces of updated data.
  • Explicit handling of dynamic transforms and animated volumetric data, rendering the haptic feedback consistent with dynamics.

Typical Applications

VHTK has been designed to directly provide or assist in the implementation of most types of haptic feedback not associated with geometric surfaces. This means that it is not very useful for many conventional VR applications, but there are many areas where such feedback can be invaluable.

Scientific/Medical Visualization 
In scientific or medical visualization, haptic feedback can provide information about the local structure and other data features, as well as guidance in certain tasks. The volumetric nature of many such datasets makes volume haptics very effective.
Needle or Biopsy Simulation 
The volumetric nature of real tissues makes volume haptics suitable for simulation of simple interaction with tissues, for example pushing a needle through different tissue layers to perform a biopsy.
Physical Guidance 
VHTK provides an arsenal of both passive and active modes of haptic feedback. If the best or a suggested way to move the haptic instrument is known for every position in the virtual environment, then this knowledge is in effect a volumetric vector field. A set of modes from VHTK can be applied to such a guidance field to provide physical support in complicated or time-critical tasks.

Python Scripts

Some Python scripts accompany the distribution of VHTK. They provide tools useful for interactive multi-modal visualization of volumetric data. Some of these scripts require the position of the haptics device, and this position must typically be expressed in the local coordinates of the space in which the script operates. It is most easily obtained through the following X3D code:

  <IMPORT inlineDEF="H3D_EXPORTS" exportedDEF="HDEV" AS="HDEV" />
  <LocalInfo DEF="INFO" />
  <PythonScript DEF="SCRIPT" />
  <ROUTE fromNode="HDEV" fromField="trackerPosition"
         toNode="INFO" toField="position" />
  <ROUTE fromNode="INFO" fromField="position"
         toNode="SCRIPT" toField="position" />

Clip Planes

  • urn:vhtk:python/ClipPlanes.py

ClipPlanes allows the user to interactively control clip planes in a scene. A button click adds a clip plane at the current position and orientation. Previously added planes can be interactively moved, rotated, and removed.

The following must be provided through the "references" field:

  • the group node which should be clipped

The following may also be provided through the "references" field:

  • an identically transformed group for clip-plane icons,
  • an identically transformed LocalInfo node,
  • an icon for the clip planes.

The following must be routed:

  • to the field "button", the button to control the clip planes with,
  • to the field "position", the position to control the clip planes, and
  • to the field "orientation", the orientation to control the clip planes.
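As a sketch of how these pieces fit together, the following X3D fragment connects ClipPlanes to a device button and a LocalInfo node. The names VOLUME_GROUP and CLIP, the mainButton device field, and the direct routing of the device orientation are illustrative assumptions, not taken from the VHTK documentation; the orientation may need transforming into local coordinates in the same way as the position.

    <Group DEF="VOLUME_GROUP">
      <!-- the volume geometry to be clipped goes here -->
    </Group>
    <PythonScript DEF="CLIP" url="urn:vhtk:python/ClipPlanes.py">
      <Group USE="VOLUME_GROUP" containerField="references" />
    </PythonScript>
    <ROUTE fromNode="HDEV" fromField="mainButton"
           toNode="CLIP" toField="button" />
    <ROUTE fromNode="INFO" fromField="position"
           toNode="CLIP" toField="position" />
    <!-- assumed here: device orientation used directly as local orientation -->
    <ROUTE fromNode="HDEV" fromField="trackerOrientation"
           toNode="CLIP" toField="orientation" />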

Put Seeded

  • urn:vhtk:python/PutSeeded.py

PutSeeded allows the user to interactively seed a seeded visualization type, such as stream ribbons.

The following must be provided through the "references" field:

  • the group node in which the visualization nodes should be put,
  • an appearance to use, and
  • a template node of the type that should be seeded

The following must be routed:

  • To the "button" field, the button to put the seeds with
  • To the "position" field, the seed position in the same coordinate system as the volume and the group node

The script responds to the following commands:

  • z undo
  • Z undo all
  • 1-9 release multiple seeds by drawing a line in the air
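A minimal setup along these lines might look as follows. The node names SEED_GROUP and SEEDER, the mainButton field, and in particular the StreamRibbons template node name are assumptions for illustration; substitute the actual VHTK node type you want to seed.

    <Group DEF="SEED_GROUP" />
    <PythonScript DEF="SEEDER" url="urn:vhtk:python/PutSeeded.py">
      <Group USE="SEED_GROUP" containerField="references" />
      <Appearance containerField="references">
        <Material diffuseColor="0.8 0.2 0.2" />
      </Appearance>
      <!-- "StreamRibbons" is a hypothetical name for the template node -->
      <StreamRibbons containerField="references" />
    </PythonScript>
    <ROUTE fromNode="HDEV" fromField="mainButton"
           toNode="SEEDER" toField="button" />
    <ROUTE fromNode="INFO" fromField="position"
           toNode="SEEDER" toField="position" />

Note that the seed position is routed from a LocalInfo node so that it is expressed in the same coordinate system as the volume and the group node.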

Probe Display

  • urn:vhtk:python/ProbeDisplay.py

ProbeDisplay opens a Tk window and displays information from a specified VolumeProbe.

The following must be provided through the "references" field:

  • the VolumeProbe instance from which to extract the data

This script only reads data from the VolumeProbe node and displays it. For the VolumeProbe to update the values some probe position must be routed to the "probe" field of the VolumeProbe node. See the documentation for VolumeProbe for more information.
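The setup can be sketched as below. The VolumeProbe node's own configuration (its data source and other fields) is omitted here, and the node names PROBE and DISPLAY are assumptions for illustration.

    <VolumeProbe DEF="PROBE" />
    <PythonScript DEF="DISPLAY" url="urn:vhtk:python/ProbeDisplay.py">
      <VolumeProbe USE="PROBE" containerField="references" />
    </PythonScript>
    <!-- the probe position drives the VolumeProbe, not the script itself -->
    <ROUTE fromNode="INFO" fromField="position"
           toNode="PROBE" toField="probe" />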

Points Editor

  • urn:vhtk:python/PointsEditor.py

PointsEditor makes it possible to add, edit, and remove points in 3D space, for example for specifying the distribution of glyphs in space. Pushing the button in free space adds a new point, pressing on a point icon and moving the haptic instrument moves that point, and pressing on a point and then releasing the button removes that point.

The following must be provided through the "references" field:

  • the group node for the point icons.

The following may also be provided through the "references" field:

  • an identically transformed LocalInfo node, and
  • an icon for the points.

The following must be routed:

  • To the "button" field, the button to control the points with
  • To the "position" field, the position to control the points

The resulting points reside in the "point" field.
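A minimal setup might look as follows; the node names POINT_ICONS and EDITOR and the mainButton field are assumptions for illustration. The edited points can then be read from the script's "point" field, for example by routing it to the node that distributes glyphs.

    <Group DEF="POINT_ICONS" />
    <PythonScript DEF="EDITOR" url="urn:vhtk:python/PointsEditor.py">
      <Group USE="POINT_ICONS" containerField="references" />
    </PythonScript>
    <ROUTE fromNode="HDEV" fromField="mainButton"
           toNode="EDITOR" toField="button" />
    <ROUTE fromNode="INFO" fromField="position"
           toNode="EDITOR" toField="position" />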


Each example below is programmed using only VHTK, H3D and an individual X3D setup file. The X3D file specifies the visualization nodes and haptic nodes and their transforms, parameters, transfer functions and volumetric data. A Python script available in the VHTK package provides the interactive stream ribbons and stream tubes.

See also YouTube for video examples.

SHARC in Virtual Windtunnel Visualization

A screen shot from the SHARC Windtunnel example.

The SHARC aircraft is an experimental unmanned aerial vehicle (UAV). In this example the air flow from a computational fluid dynamics simulation (CFD) is explored using multi-modal interaction. While only simple properties can be rendered visually without cluttering the display, interactive stream tubes and haptic feedback can be used to freely explore the full 3D volume.

Two different haptic modes have been shown to work well with this kind of data: the Vector Follow Mode and the Vector Vortex Mode. The follow mode conveys the flow orientation and its strength. The vortex mode, on the other hand, produces a haptic guidance surface where the vorticity of the vector field is strong. This plane guides the haptic instrument so that it is easy to find and follow vortices and areas of strong turning winds.

Femur Bone Drilling Simulation

A screen shot from the femur bone drilling simulation.

When repairing cervical hip fractures, long nails are used to fixate the bone. The surgeon first drills a pilot hole with a narrow drill that can be steered through the soft marrow during drilling. This is done mostly blind with only occasional images through a fluoroscope. In this simulator VHTK has been extended with a haptic bone drilling simulation mode that provides haptic feedback from the volumetric CT data of a real patient. The feedback allows the surgeon to feel structures in the marrow and bone features to control the drill and complete the task.

This application can be used for training on problem or patient specific data, during education or before surgery.

Heart Blood-flow Visualization

A screen shot from the blood-flow exploration example.

Modern MRI scanners are capable of acquiring animated blood-flow data from within a beating human heart. The phase-contrast pulse sequence used to acquire flow information produces poor tissue contrast when used to scan full 3D data, so the visual quality of the dataset is low. The poor tissue contrast of this kind of data, together with the fact that the noisiness of MRI data makes automatic feature extraction difficult, makes it an interesting target for multi-modal methods.

The task of exploring this kind of data is effectively guided using the Vector Follow Mode, applied to the blood-flow data. The follow mode provides guidance and information about the local flow. The characteristics of flow can easily be recognized and also be distinguished from noise, which shows the effectiveness of human perception. By also adding the Vector Force Mode that pushes the haptic probe in the direction of the flow, an extra channel of information about the anatomy of the heart is provided.

The Scalar Gradient Mode can also be applied to the flow magnitude. This produces a push towards high flow which makes it easier to find main blood flows and follow the major streams. However, the pushing effect also obfuscates detailed information from the other modes, which makes it unsuitable for close examination of the identified flow.

Dichloroethane Molecule Electropotential Visualization

A screen shot from the dichloroethane example.

This setup is a simple example of how a molecule's electropotential field can be explored using a multi-modal display. The visual rendering provides a general overview using iso-surfaces at zero potential and at one positive and one negative potential. Volume rendering is also added to show the electropotential throughout the volume.

Two different haptic modes are appropriate for effective haptic rendering of this kind of data: the Vector Follow Mode and the Vector Front Shape Mode. These vector modes are applied to the gradient of the scalar field. The follow mode thus conveys the orientation of the gradient and guides the user along the highest-gradient path between high and low potential. The front shape mode, on the other hand, renders a haptic iso-surface at every position in the field.
