Cutting Effect Demo


A screenshot of the Cutting Effect Demo Tutorial application.



This tutorial describes how to create the illusion of cutting with a knife or scalpel using some simple graphic and haptic effects. Instead of attempting to simulate every detail of cutting using complex algorithms, this tutorial explains how we can create a couple of relatively simple custom H3D nodes to produce a rough visual and haptic approximation of cutting (see screenshot, right). It's important to note that this example does not actually change the geometry of the shape during cutting, but simply produces the haptic and graphic effects that create the illusion of cutting.

Note: The following tutorial will make a lot more sense if you first download and try the demo application. You can download the source code here. Please read on for compilation and usage instructions.
Note: This tutorial does not assume much previous experience of H3DAPI, but a basic understanding of 3D geometry and a very basic idea of how the God Object haptic rendering method works will be helpful (see Chapter 4 of the HAPI Manual).


Subjects discussed in this tutorial include:

  • Defining a new subclass of H3DSurfaceNode using C++, to customize the feel of a shape.
  • Creating a custom "paintable" texture node using C++, to graphically render the path traced by the haptic device (an extension to this previous tutorial).


Note: To get the most out of this tutorial you will need a haptic device that supports rotation (i.e. has rotation encoders), so that you can control the angle of the scalpel in the demo. You can still learn from this tutorial if you use another type of haptic device, but the functionality of the demo application will be slightly limited. Also note that this tutorial is designed to work with H3D's built-in haptic renderers (i.e. GodObjectRenderer and RuspiniRenderer); it will not work with OpenHapticsRenderer or Chai3DRenderer.

There are two main parts to this tutorial: haptics and graphics.

  • Haptics
First we describe how to create the haptic effect of cutting. To do this we create a custom haptic surface node (a subclass of H3DSurfaceNode) to simulate the feel of interacting with a shape using a knife or scalpel.
  • Graphics
Secondly, we take care of the graphic simulation of cutting. Here we use a DeformableShape node and create a custom X3DTextureNode to provide the visual effect of cutting by "painting" the path of the cut onto the texture. We describe the necessary extensions to the PaintableTexture node (described in this previous tutorial) to achieve this.

Source Code

You can download the source code for this example here. The following tutorial will make much more sense if you first download and use the demo application.


Because we define some new nodes using C++, you will need to compile these in order to use the demo application. Use CMake to generate the project files for your chosen compiler, then build and install the project.

e.g. To build with Unix Makefiles, unzip the downloaded source code. Then, from a command prompt, change directory to the newly unzipped folder and type:

cd CuttingNodes/build
mkdir unix-makefiles
cd unix-makefiles
cmake ..
make install

Note: If using Visual Studio, ensure that you select the Release build configuration.

After running "make install" (or building the "INSTALL" target in Visual Studio, or similar), you will be able to open the CuttingSimulation.x3d file using H3DLoad or H3DViewer. Note that H3DLoad or H3DViewer will need to have been compiled using the same compiler that you used to compile our new nodes, otherwise you will receive warning messages such as:

WARNING: Could not create "CuttingPaintableTexture" node. It does not exist in the H3DNodeDatabase.

The versions of H3DLoad and H3DViewer installed by the H3D 2.1 Windows installer are compiled with Visual Studio 9. If this is not the same as your compiler, then recompile H3DLoad or H3DViewer so that the compilers match.

Using the Demo

Start the demo using H3DLoad or H3DViewer. E.g.

H3DViewer CuttingSimulation.x3d
Note: The demo requires the H3D UI component. Ensure that the UI plugin is enabled in H3DViewer (File->Plugins) and has been compiled with the same compiler as H3DViewer (if you recompiled H3DViewer earlier, you probably need to recompile UI too). If using H3DLoad, ensure that the UI shared library is in the system path (or is copied into the same directory as CuttingSimulation.x3d). On Windows you may need to change the ImportLibrary "library" field in CuttingSimulation.x3d to point to the correct location of the UI library, and/or change the name to match the compiler (e.g. UI_vc8, UI_vc9, etc.).

Notice how it's easier to move across the cutting surface in the same direction as the scalpel blade than it is to move in other directions. Try holding the scalpel at different angles and notice how this affects the level of friction you feel as you attempt to move the scalpel over the surface.

Using the mouse and the slider bars at the top of the screen, you can independently adjust the level of friction with the direction of cutting (right two slider bars) and against it (left two slider bars). Experiment with changing these friction parameters and see how they affect the simulation.


In the following sections we explore how the simulation is implemented, starting with an overview of the source files, then an in-depth look at the C++ nodes that implement the haptic effects, and then the graphic effects. Finally, we'll look at how to pull together everything that we've created in C++, using X3D and Python.

Overview of Source Files

The directory structure and source files of the project are described below. The project consists of a combination of X3D, Python and C++ sources. Each of the files listed here is described in more detail later in the tutorial; the aim here is just to give a general overview of the role of each.

Directory or Source File Purpose
CuttingSimulation.x3d This is the main X3D file that is loaded by the viewer to start the simulation. It defines the scene graph, which includes some of our own nodes implemented in C++ (described below). It also includes the Python script nodes described below.
CuttingNodes/ This directory contains the C++ source code and CMake project to build the shared library that will contain our additional scene graph nodes (described immediately below).
DirectionalFrictionSurfaceNode.h/.cpp A subclass of H3DSurfaceNode used to define the friction effects experienced when interacting with a surface using a scalpel.
DirectionalFrictionSurface.h/.cpp A subclass of HAPISurfaceObject used to actually implement the friction effect described above, using HAPI.
PaintableTexture.h/.cpp A texture which allows the surface of a shape to be drawn on by the haptic device. Based on the node described in this tutorial.
CuttingPaintableTexture.h/.cpp A specialized subclass of PaintableTexture (described above), which draws a visual cutting effect based on the position and force of contact with the haptic device.
images/ This directory contains some images used as textures in the simulation.
python/ A directory that contains some simple python scripts (described below).
python/ This script generates the flat cutting surface geometry, used to render the shape graphically.
python/ This script sends force and contact information about the haptic device, from the shape's geometry node, to the CuttingPaintableTexture node, in order to create the visual effect of cutting.
python/ This script updates some aspects of the user interface. Specifically, it sends the slider bars? current values to the labels displayed next to them.
x3d/ This directory contains additional X3D files used in the simulation (see below).
x3d/Scalpel.x3d This is the X3D model of the scalpel, used to represent the stylus of the haptic device. It was created in Blender, and then exported as X3D.
x3d/UI.x3d This X3D file defines the scene graph representing the user interface. It uses nodes from the H3DAPI UI component.


Simulating the feel of the scalpel-surface interaction

When cutting with a knife or scalpel, the shape of the blade means that the friction is not the same in every direction. For example, it's easy to cut along the direction of the blade, but if we try to move the knife sideways or backwards it's much harder to move. It is as though we are constrained to move only in the direction of the blade. To simulate this effect we will create a custom haptic surface friction effect by defining a subclass of H3DSurfaceNode.


Each shape in the scene that can be felt using the haptic device contains, in its Appearance node, a subclass of H3DSurfaceNode. The H3DSurfaceNode, together with the haptic renderer node, determines the forces that should be sent to the haptic device when the shape is touched. By defining our own subclass of H3DSurfaceNode we are able to customize the haptic rendering to a degree and control how a particular shape should feel. H3DAPI provides a number of existing H3DSurfaceNode subclasses, which allow you to define shapes with smooth surfaces (SmoothSurface), frictional surfaces (FrictionalSurface) and even surfaces whose depth or friction is controlled by an image texture (DepthMapSurface and HapticTexturesSurface). However, nothing quite fits our requirement of a frictional surface that is dependent on the angle of the haptic device, so we'll create one ourselves.

Like many nodes in H3D, the hierarchy of H3DSurfaceNodes has an equivalent in HAPI, H3D's haptics rendering library. An H3DSurfaceNode affects the actual haptics rendering by creating an instance of HAPI's HAPISurfaceObject, which is then used by HAPI during the haptics rendering process. So when we create a new subclass of H3DSurfaceNode, we also need to create a corresponding subclass of HAPISurfaceObject. The H3DSurfaceNode takes care of creating the X3D interface to the surface, defining the fields that can be accessed from the graphics thread. The HAPISurfaceObject turns these parameters into something you can feel, by implementing virtual functions that the haptic renderer calls in the haptics thread to help determine the forces to send to the haptic device.

Specifically, the HAPISurfaceObject is responsible for two tasks (see below), which are performed by overriding two virtual functions. Both functions use a ContactInfo object to pass parameters in and out of the function.

1. Deciding how the position of the god object (or proxy) should move.
virtual void getProxyMovement( ContactInfo &contact_info );
By controlling the movement of the proxy, we can create various friction effects. The less movement that we permit, the higher the friction of the surface will feel.

2. Calculating the reaction force for the surface.
virtual void getForces( ContactInfo &contact_info );
Generally, the reaction force is calculated by applying the spring equation to the distance between the proxy and the device position (as in FrictionSurface). However, if we wanted, we could use some more exciting method for calculating the reaction force.
You can find more information about implementing custom surface effects here.


We will derive our DirectionalFrictionSurfaceNode from H3DFrictionalSurfaceNode (see class diagram, below). This will provide an interface for setting some of the frictional properties of the surface. As mentioned previously, the amount of friction will differ depending on the direction and the angle of the scalpel, so there will be two distinct sets of frictional properties: one set will control the friction in the direction of cutting, and the other will define the friction in all other directions (away from the cutting direction). We will use the frictional properties that we inherit from H3DFrictionalSurfaceNode to control the friction in the direction of cutting. To make our surface node flexible, we will delegate the task of defining the friction away from the direction of cutting to another H3DSurfaceNode, wrapped inside our own. As mentioned before, as well as defining the surface friction, the other role of an H3DSurfaceNode is to calculate the reaction force, and we want to keep that level of customization too. So we will also delegate the task of calculating the reaction force to our wrapped surface. That way we might decide to use our DirectionalFrictionSurfaceNode in conjunction with another H3DSurfaceNode that does some special force calculations (e.g. a non-linear stiffness surface or a DepthMapSurface).

A class diagram showing our two new DirectionalFrictionSurface classes: The H3DSurfaceNode on the left hand side, and the HAPISurfaceObject on the right hand side.


The DirectionalFrictionSurfaceNode class really only needs to perform two tasks. First, when the node is initialized (i.e. gets its first reference count), it creates an instance of DirectionalFrictionSurface to influence the haptic rendering. Notice how we pass the surface's parameters to the DirectionalFrictionSurface when it is constructed.

Source Code: DirectionalFrictionSurfaceNode.cpp

// Override to create the HAPI surface
void DirectionalFrictionSurfaceNode::initialize() {
  hapi_surface.reset(
    new HAPI::DirectionalFrictionSurface(
      wrappedSurface->getValue() ? wrappedSurface->getValue()->getSurface() : NULL,
      useRelativeValues->getValue() ) );
}

The second task that the DirectionalFrictionSurfaceNode must perform, after creating HAPI's DirectionalFrictionSurface instance, is to keep updating that instance as its own field values change. For example, if the stiffness field of DirectionalFrictionSurfaceNode is changed, we need the stiffness variable in the DirectionalFrictionSurface instance to be updated too. Fortunately, because we derived from H3DFrictionalSurfaceNode, this parent class will take care of updating all the standard friction surface parameters (stiffness, damping, staticFriction and dynamicFriction). All that remains for our DirectionalFrictionSurfaceNode class to do is to keep the "wrappedSurface" parameter up to date. We do this by specializing the update function of DirectionalFrictionSurfaceNode's "wrappedSurface" field. Then, when the field value is changed, we register a blocking callback in the haptic thread in order to update the "wrappedSurface" variable in the DirectionalFrictionSurface instance. Using a blocking callback means that the "wrappedSurface" value is guaranteed not to change during an execution of the getProxyMovement() or getForces() functions. Here is the callback that is executed; it simply copies the new HAPISurfaceObject into the DirectionalFrictionSurface's "wrappedSurface" variable.

Source Code: DirectionalFrictionSurfaceNode.cpp

/// Callback to copy wrappedSurface to the HAPI directional surface implementation
H3DUtil::PeriodicThreadBase::CallbackCode DirectionalFrictionSurfaceNode::syncWrappedSurface ( void* data ) {
  CallbackData* callbackData= static_cast<CallbackData*>(data);
  static_cast<HAPI::DirectionalFrictionSurface*>(callbackData->directionalSurfaceNode->hapi_surface.get())->wrappedSurface.reset (
    callbackData->wrappedSurface );
  return H3DUtil::PeriodicThreadBase::CALLBACK_DONE;
}


Illustration of the directional friction algorithm implemented by DirectionalFrictionSurface in getProxyMovement(). The total proxy movement is from the Previous Proxy position (blue dot), to the Next Proxy position.

Now we'll examine how to implement the getProxyMovement() function of DirectionalFrictionSurface in order to create the direction-dependent friction effect. This involves some basic geometry calculations, for which we will rely on the vector and matrix operations readily available from H3DUtil.

The diagram (above) illustrates how to calculate the proxy movement for our surface. The final proxy movement will be a vector from the previous proxy position (the blue dot) to the next proxy position (green dot). The proxy movement is calculated in the method getProxyMovement() of DirectionalFrictionSurface. We are passed a ContactInfo object by the haptic renderer, which provides details of the contact between the shape's geometry and the haptic device, and contains a pointer to the haptic device (used to access information about the device, such as orientation). The local coordinate system of the ContactInfo object is relative to the surface being contacted (with the y-axis aligned with the surface normal). This greatly simplifies our friction calculations.

The information we need from the ContactInfo object is the previous proxy position (the origin of the local coordinate system) and the current position of the haptic device relative to the surface (i.e. in the local coordinates of the ContactInfo object), which is accessed via localProbePosition(). Our task in getProxyMovement() is to calculate the movement that will take the proxy to its next position. In the simple case, where there is no friction at all, the proxy movement will take the previous proxy all the way to the device position (but just above the surface).

For an example of this, see the following code from HAPI's FrictionalSurface.cpp. Note how the calculated proxy movement is passed back out of the function via the ContactInfo object, by calling setLocalProxyMovement():

Source Code (HAPI): FrictionalSurface.cpp

// SmoothSurface
Vec3 local_probe = contact_info.localProbePosition();
contact_info.setLocalProxyMovement( Vec2( local_probe.x , local_probe.z ) );

The Friction Algorithm

Calculating the total proxy movement for our directional friction effect consists of four steps, which are described below and labeled in the diagram:

1. First the cutting direction is determined. To do this we project a unit vector representing the angle of the scalpel/device onto the surface. To achieve this, a unit vector along the z-axis is rotated by the orientation of the haptic device. This vector is then transformed to the local coordinates of the surface contact using ContactInfo::vectorToLocal(). Once transformed to the local coordinates of the surface, we can simply zero the y component to yield the projection of the vector onto the surface. Finally, the result is normalized. Imagine this vector as the shadow of the scalpel on the surface, if it were lit from directly above the surface.

Source Code: DirectionalFrictionSurface.cpp

  // A unit vector oriented down the z-axis of the device stylus in world coords
  Vec3 zDeviceGlobal= contact_info.hapticsDevice()->getOrientation() * Vec3 ( 0.0, 0.0, 1.0 );
  // Transform the z-axis of the haptic device to local coordinates of contact_info
  Vec3 zDeviceLocal= contact_info.vectorToLocal ( zDeviceGlobal );
  // Because the y-axis of the local coordinate system is aligned with the normal at the contact point,
  // simply zeroing the y component of zDeviceLocal gives the projection of zDeviceLocal onto the surface
  Vec2 cuttingDirection ( zDeviceLocal.x, zDeviceLocal.z );
  // Normalize, so that cuttingDirection is a unit vector along the cut
  cuttingDirection.normalizeSafe ();
2. Next we work out the proxy movement allowed by the friction parameters representing friction along the cutting direction. Recall that the frictional properties along the cutting direction are defined by the parameters that our DirectionalFrictionSurface class inherits from FrictionSurface. Therefore we can simply call the parent class method FrictionSurface::getProxyMovement() to obtain this. The movement of the proxy so far, after performing this step, is labeled 2 on the diagram. Note that the proxy moves from the previous proxy position (blue dot) towards the device position (red dot). If the friction parameters were zero (i.e. no friction), then the proxy movement would take the proxy all the way along the dotted line, to the point on the surface closest to the device position.

Source Code: DirectionalFrictionSurface.cpp

  // Calculate proxy movement using standard friction parameters
  FrictionSurface::getProxyMovement ( contact_info );
3. So we have calculated the proxy movement permitted by friction along the line of cutting. But the movement we calculated is always towards the device position (in the local x-z plane of the surface) and is not yet constrained to the direction of cutting. So the next step is to project the proxy movement onto the cutting direction, effectively restricting movement to the line of cutting. The resulting projection is labeled 3 on the diagram. As well as constraining movement to the cutting line, we also want to restrict movement to the forward direction of cutting, i.e. we don't want to allow backwards movement. Therefore, we check the angle between the proxy movement and the cutting direction before doing the projection, and if the proxy movement would be backwards, we zero it.

Source Code: DirectionalFrictionSurface.cpp

  // Project the calculated proxy movement vector onto the cutting direction vector
  // i.e. Constrain movement to the cutting direction
  HAPIFloat zDeviceDotProxyMovement= cuttingDirection.dotProduct ( contact_info.localProxyMovement() );
  // Only allow forwards movement
  Vec2 constrainedProxyMovement;
  if ( zDeviceDotProxyMovement > 0 )
    constrainedProxyMovement= zDeviceDotProxyMovement * cuttingDirection;
4. Now we have movement that is constrained to the cutting direction and affected by friction. However, currently there is no possibility of any movement away from the cutting direction. We would like to permit some movement away from the cutting line, controlled by separate friction parameters. This is where our "wrappedSurface" member variable comes in. We will use this surface to calculate the proxy movement from the proxy position on the cutting line, which we calculated previously (labeled 3 on the diagram), towards the device position (red dot). This allows the next proxy position (green dot) to slip away from the cutting direction by an amount that can be controlled by adjusting the friction parameters of the "wrappedSurface".
The first line in the code below calculates the constrained proxy position (labeled 3 on the diagram) from the proxy movement that we calculated previously. Then we need to provide this proxy position to the getProxyMovement() method of the "wrappedSurface" member variable, in order to calculate the proxy movement away from the cutting line (from 3 to 4 in the diagram). To do this we need to call setGlobalOrigin() on the ContactInfo object that we pass to getProxyMovement(), just like the haptic renderer will have done before it called our own getProxyMovement() method. At this point we create a copy of the ContactInfo, and it is this copy that we modify and pass to the "wrappedSurface". We do this because the haptic renderer is probably only expecting us to modify the "proxy_movement_local" variable of ContactInfo, so to avoid any potential issues we take a copy and modify that instead.
Once we've made the call to wrappedSurface->getProxyMovement(), it simply remains to calculate the total proxy movement and return. Recall that the total proxy movement is from the blue dot to the green dot on the diagram. This vector is simply the sum of the two proxy movements that we calculated: movement along the cutting line + movement away from the cutting line.

Source Code: DirectionalFrictionSurface.cpp

    // Estimate the new proxy position after applying the constrained proxy movement
    Vec3 constrainedProxyPos = 
      contact_info.globalContactPoint() + 
      contact_info.yAxis() * min_distance +
      constrainedProxyMovement.x * contact_info.xAxis() + 
      constrainedProxyMovement.y * contact_info.zAxis();
    // To calculate proxy movement from the wrapped surface we will set properties of the ContactInfo structure
    // intended for use by renderers only. Therefore, create a copy of contact_info in case the renderer relies on 
    // these properties being unchanged in contact_info.
    ContactInfo myContactInfo ( contact_info );
    // Update the proxy position to that calculated above
    myContactInfo.setGlobalOrigin ( constrainedProxyPos );
    // Get non-constrained proxy movement from wrapped surface
    wrappedSurface->getProxyMovement ( myContactInfo );
    // Total proxy movement is: movement constrained along direction of cutting + movement not constrained
    contact_info.setLocalProxyMovement ( constrainedProxyMovement + 
      myContactInfo.localProxyMovement() );


Now that we've created our haptic effects by defining a new H3DSurfaceNode subclass and corresponding HAPISurfaceObject, it's time to move on to the graphics. What we would like to achieve is to have the path "cut" by the scalpel rendered visually on the surface of the shape. This problem is similar to that solved by the PaintableTexture tutorial, so we'll use that as our starting point. The PaintableTexture tutorial presents a custom node derived from X3DTexture3DNode, which colors the pixels of the texture that are touched by the haptic device. In this tutorial we'll only discuss the additional functionality required to produce a more realistic cut effect, and we shall assume that you are already familiar with the original PaintableTexture node presented in the earlier tutorial.

We will add the following extra functionality:

  • The line drawn will be made from solid line segments, rather than single pixels. This means that unlike the standard PaintableTexture, our node will always produce a solid line, no matter how fast the haptic device moves across the surface.
  • The width of the line will be proportional to the force with which the haptic device contacts the geometry. This will produce a more realistic cut, starting as a narrow line, then widening into a "deeper" cut, before tapering off again.
  • The color (and transparency) of the line will also vary with the contact force, helping to create a smoother start and end to the cut effect than could be achieved by changing the width alone. This feature will be used to give the areas of the cut where gentle force was used a lighter/paler color and areas where more force was used a darker/deeper red.
  • It will be possible to reset the texture and erase the entire rendering.

The implementation of our texture node will be split across two classes. First we shall define a fairly abstract, general-purpose node: PaintableTexture. This node provides the basic functionality described in the PaintableTexture tutorial, and can be used independently if desired. Secondly, we will create a subclass of PaintableTexture, called CuttingPaintableTexture. This subclass will provide the specialized functionality that creates the visual effect of cutting.


Let's examine our PaintableTexture class first. There are various modifications that separate it from the node described in the previous tutorial. For example, we define the following two additional fields:

Source Code: PaintableTexture.h

/// Field to track contact with the surface so as to begin and end drawing correctly
auto_ptr< UpdateContact > contact;
/// If sent a value of true, this field will reset the image to the background color
auto_ptr< UpdateClearAll > clearAll;

The "contact" field is intended to receive a Boolean value indicating whether or not the haptic device is in contact with the shape's geometry. It is used to call two virtual functions defined by PaintableTexture, listed in the following code segment. This simplifies the task of implementing our specialized subclass of PaintableTexture, CuttingPaintableTexture, which is described later. We need to know when the device breaks contact with the geometry because our CuttingPaintableTexture class will render the cutting path as a series of line segments. So, unlike the previous implementation of PaintableTexture, which colors only discrete pixels (one per frame), it's important to track the contact status of the device so that we can introduce a break in the line if the haptic device is lifted up from the surface and then put down again in a different location.

Source Code: PaintableTexture.h

/// Override to customise drawing to texture. Called once on contact with surface.
virtual void beginPainting () {}
/// Override to customise drawing to texture. Called once on loss of contact with surface.
virtual void endPainting () {}

The functions are called by creating a custom SFBool field type using the OnValueChangeSField<> template and using this as the PaintableTexture's "contact" field:

Source Code: PaintableTexture.h

/// Field to track contact with the surface so as to begin and end drawing correctly
class CUTTING_API UpdateContact: public OnValueChangeSField< AutoUpdate< SFBool > > {
  virtual void onValueChange( const bool &v );
};

The onValueChange() function is then implemented to call beginPainting() and endPainting() appropriately, when the value changes:

Source Code: PaintableTexture.cpp

// Field to track contact with the surface so as to begin and end drawing correctly
void PaintableTexture::UpdateContact::onValueChange( const bool &v ) {
  PaintableTexture* owner= static_cast< PaintableTexture * >( getOwner() );
  if ( v )
    owner->beginPainting ();
  else
    owner->endPainting ();
}
Note: Although we've defined two functions here to track the contact status, we'll see later that to implement CuttingPaintableTexture it is only necessary to use endPainting(). The function beginPainting() is provided for symmetry, and in case it might be useful to as-yet-unforeseen subclasses.

In a similar way to that described for the "contact" field, the "clearAll" field is used to call PaintableTexture's clearPainting() member function, in order to reset the texture and erase all drawings. In this case, however, the OnNewValueSField<> template is used, because we want clearPainting() to be called every time a value of true is received, even if the previous value was also true.

As in the previous tutorial example, our PaintableTexture node defines a "paintAtTexCoord" field, to which the current texture coordinate of the haptic device/geometry contact will be sent. The only difference here is that we define a virtual member function paintToTexture() to perform the actual drawing, which slightly simplifies the task of customizing the drawing later in subclasses. The default implementation is to color a single pixel at the point of contact with the haptic device (as in the previous tutorial), but we will override this later, in CuttingPaintableTexture, to provide a more sophisticated implementation.

Source Code: PaintableTexture.h

/// Override to customise drawing to texture
virtual void paintToTexture ( const Vec3f& textureCoord );


Having implemented the basic functionality in PaintableTexture, we now move on to the subclass CuttingPaintableTexture, where we extend the rendering to produce a more realistic visual representation of the cutting path. CuttingPaintableTexture adds a few additional fields, described below:

Source Code: CuttingPaintableTexture.h

/// The current contact force
auto_ptr< SFVec3f > force;
/// The weighting that determines how much the contact force affects the color of the cutting line
auto_ptr < SFFloat > colorForceWeight;
/// The weighting that determines how much the contact force affects the width of the cutting line
auto_ptr < SFFloat > widthForceWeight;
/// The maximum allowable width of the rendered cutting line
auto_ptr < SFInt32 > maxWidth;

CuttingPaintableTexture defines an additional field called 'force', to which the contact force of the haptic device on the geometry will be routed. The value will be used to help calculate the color and width of the line to render to the texture.

The fields 'colorForceWeight' and 'widthForceWeight' contain float values that control how much the contact force (the 'force' field) affects the color change of the line, and how much it affects the width of the line, respectively. Essentially, the weight value is multiplied by the force magnitude to give the current line width or color shift. The 'maxWidth' field provides an upper limit on the width of the cutting line. In order to implement the rendering of the cutting line we override two virtual functions defined earlier in PaintableTexture:

Source Code: CuttingPaintableTexture.h

/// Override to customise drawing to texture
virtual void paintToTexture ( const Vec3f& textureCoord );
/// Override to customise drawing to texture. Called once on loss of contact with surface.
virtual void endPainting ();

The paintToTexture() function performs the actual rendering, whereas endPainting() notifies our class when the haptic device breaks contact with our shape's geometry, enabling us to insert a break in the cutting line. In order to implement the rendering we define a few helper functions, shown below:

Line drawing helper functions

Source Code: CuttingPaintableTexture.h

/// Draws a solid line of the specified width between the two texture coordinates specified
void drawLine2D ( const Vec3f& a, const Vec3f& b, const RGBA& color, unsigned int size );
/// Draws a solid line of the specified width between the two pixel coordinates specified
void drawLine2D ( unsigned int& x1, unsigned int& y1,
                  unsigned int& x2, unsigned int& y2, const RGBA& color, unsigned int size );
/// Draws a square of pixels of the specified size
void setPixels ( unsigned int x, unsigned int y, const RGBA& color, unsigned int size );
/// Convert texture coord to pixel in the image
void textureCoordToPixel ( const Vec3f& tc, unsigned int& x, unsigned int& y, unsigned int& z );

The first function, drawLine2D(), is the only one we will use directly to implement the drawing of the cutting line. It draws a line segment on the texture between texture coordinates a and b, of the specified color and size (i.e. width). The other three functions are employed by the first so that it can perform its task. The implementation of these three functions is described briefly below; for more detail refer to the source file CuttingPaintableTexture.cpp.

The function textureCoordToPixel() converts the normalized texture coordinates that describe the haptic device contact position into the pixel coordinates used to draw to the texture. This is a very simple operation: the x, y and z components of the normalized texture coordinate are multiplied by the number of pixels in the texture along the x, y and z axes, respectively. The first drawLine2D() function uses textureCoordToPixel() to convert the start and end points of the line segment to pixel coordinates and then calls the second drawLine2D() function.
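The conversion can be sketched in a few lines of plain Python. This is an illustrative stand-in for the C++ textureCoordToPixel(), not the actual H3D implementation; the clamping step is our addition, to keep a coordinate of exactly 1.0 inside the image.

```python
# Sketch of the texture-coordinate-to-pixel conversion described above.
# Illustrative only, not the actual H3D implementation.
def texture_coord_to_pixel(tc, width, height, depth=1):
    """Map a normalized (0..1) texture coordinate to integer pixel indices."""
    x = min(int(tc[0] * width),  width  - 1)  # clamp so tc == 1.0 stays in range
    y = min(int(tc[1] * height), height - 1)
    z = min(int(tc[2] * depth),  depth  - 1)
    return x, y, z

# Contact at the center of a 512x512 texture maps to pixel (256, 256)
print(texture_coord_to_pixel((0.5, 0.5, 0.0), 512, 512))
```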

The second drawLine2D() function implements a simple straight line rendering algorithm (adapted from here). At each point on the line returned by the rendering algorithm, we call the setPixels() function, which draws a square on the texture. The square will be centered on the point on the line returned by the line rendering algorithm and the size of the square will be equal to the desired width of the line. This drawing method is not optimal, because the same pixels will be set multiple times as the drawing of the line progresses, but it is a simple way of ensuring that we have an unbroken line segment, whose width can be controlled.
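The line-drawing scheme described above can be sketched in plain Python. This is a stand-in for the C++ implementation: the texture is modelled as a simple 2D list, the line algorithm is a standard Bresenham walk, and a square "brush" is stamped at each point along the line, just as setPixels() does.

```python
def set_pixels(image, x, y, color, size):
    """Stamp a size x size square of pixels centered on (x, y)."""
    half = size // 2
    h, w = len(image), len(image[0])
    for py in range(y - half, y - half + size):
        for px in range(x - half, x - half + size):
            if 0 <= px < w and 0 <= py < h:   # clip to the image bounds
                image[py][px] = color

def draw_line_2d(image, x1, y1, x2, y2, color, size):
    """Bresenham line walk; stamps a square brush at every point on the line."""
    dx, dy = abs(x2 - x1), abs(y2 - y1)
    sx = 1 if x1 < x2 else -1
    sy = 1 if y1 < y2 else -1
    err = dx - dy
    while True:
        set_pixels(image, x1, y1, color, size)
        if x1 == x2 and y1 == y2:
            break
        e2 = 2 * err
        if e2 > -dy:
            err -= dy
            x1 += sx
        if e2 < dx:
            err += dx
            y1 += sy

# Draw a diagonal line of width 3 on a 10x10 "texture"
image = [[0] * 10 for _ in range(10)]
draw_line_2d(image, 1, 1, 8, 8, 1, 3)
```

As the text notes, some pixels are set more than once as the brush advances, but the result is an unbroken line of controllable width.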

Now that our helper function drawLine2D() is in place, the task of implementing the virtual function paintToTexture(), which renders the cutting line, is quite simple. However, we need to add two additional member variables to the class definition to store some state between calls to paintToTexture(). The function paintToTexture() receives a Vec3f parameter named 'textureCoord', which indicates the texture coordinate of the current point of contact with the shape's geometry. But to draw a line segment we need two points: a start and an end coordinate. Therefore we add a Vec3f member variable 'previousTexCoord' to the CuttingPaintableTexture class definition, which will contain the texture coordinate at the time of the previous call to paintToTexture().

Of course, on the first call to paintToTexture() there will be no previous value, so we also add the Boolean member variable 'validPreviousTexCoord', which is initially set to false and only becomes true after the first execution of paintToTexture(), when a valid texture coordinate will have been assigned to 'previousTexCoord'. Here are the additional member variables declared in CuttingPaintableTexture.h:

Source Code: CuttingPaintableTexture.h

/// The previous texture coordinate
Vec3f previousTexCoord;
/// Is the previous texture coordinate valid
bool validPreviousTexCoord;

With these variables in place, and 'validPreviousTexCoord' initialized to false in the constructor, we can implement paintToTexture() as shown below:

Line/Cut drawing function

Source Code: CuttingPaintableTexture.cpp

// Override to customise drawing to texture
void CuttingPaintableTexture::paintToTexture ( const Vec3f& textureCoord ) {
  // Is the haptic device in contact with the geometry?
  if ( contact->getValue () ) {
    // Is there a previous contact texture coordinate available from the last iteration?
    if ( validPreviousTexCoord ) {
      // Work out the line width
      H3DInt32 width= (H3DInt32)(force->getValue().length() * widthForceWeight->getValue ());
      if ( width > maxWidth->getValue() )
        width= maxWidth->getValue();
      // Work out the line color
      H3DFloat blend= force->getValue().length() * colorForceWeight->getValue();
      if ( blend > 1 )
        blend= 1;
      RGBA color= backgroundColor->getValue()*(1.0-blend) + paintColor->getValue()*blend;
      // Draw the line to the texture
      drawLine2D ( previousTexCoord, textureCoord, color, width );
    }
    // Store the current contact texture coordinate for the next iteration
    previousTexCoord= textureCoord;
    validPreviousTexCoord= true;
  }
}

If the haptic device is in contact with the geometry of the shape and we already have a valid value stored for the previous texture coordinate, then we proceed to draw a line segment between 'previousTexCoord' and 'textureCoord' (the texture coordinates at the previous and current points of contact). The drawing of the line is performed by drawLine2D(), described above, so all we need to do is provide appropriate parameters to this function. 'previousTexCoord' and 'textureCoord' are used directly as the start and end coordinates passed to drawLine2D(). But we also need to calculate the current line width and color based on the current haptic device contact force. To calculate the current line width, we multiply the magnitude of the contact force by the 'widthForceWeight' field, and cap it at the maximum given by the 'maxWidth' field:

Working out line width

Source Code: CuttingPaintableTexture.cpp

// Work out the line width
H3DInt32 width= (H3DInt32)(force->getValue().length() * widthForceWeight->getValue ());
if ( width > maxWidth->getValue() )
  width= maxWidth->getValue();

To calculate the current color of the line, we multiply the magnitude of the contact force by 'colorForceWeight', and cap the result at a maximum of 1. The result is then used to blend between the 'backgroundColor' and 'paintColor' fields, both defined in PaintableTexture, to produce the final line color. If the value is 0, the line color will equal 'backgroundColor'; if the value is 1, the line color will be 'paintColor'; and values in between are linearly blended to produce a smooth transition from one color to the other:

Working out line color

Source Code: CuttingPaintableTexture.cpp

// Work out the line color
H3DFloat blend= force->getValue().length() * colorForceWeight->getValue();
if ( blend > 1 ) 
  blend= 1;
RGBA color= backgroundColor->getValue()*(1.0-blend) + paintColor->getValue()*blend;
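The width and color calculations can be checked numerically with plain Python floats in place of the H3D field values. The particular weights, maximum width and RGBA values below are arbitrary placeholders, not values from the demo:

```python
# Numeric check of the width/color calculations above (placeholder values).
force_magnitude    = 4.0   # |force| at the contact point
width_force_weight = 0.5
color_force_weight = 0.3
max_width          = 5

# Line width: weight * |force|, capped at max_width
width = int(force_magnitude * width_force_weight)   # 4.0 * 0.5 = 2
width = min(width, max_width)

# Blend factor: weight * |force|, capped at 1
blend = min(force_magnitude * color_force_weight, 1.0)  # 4.0 * 0.3 = 1.2 -> 1.0

# Linear blend between background and paint colors (RGBA tuples)
background = (1.0, 1.0, 1.0, 1.0)
paint      = (1.0, 0.0, 0.0, 1.0)
color = tuple(b * (1.0 - blend) + p * blend for b, p in zip(background, paint))

print(width, blend, color)
```

Here the force is strong enough to saturate the blend, so the line is drawn in the pure paint color at width 2.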

The implementation of CuttingPaintableTexture is almost complete, but for one remaining issue. When the haptic device is lifted up from the geometry of the shape and then later placed back down again in a new location, we do not want to draw a line segment between the current and previous texture coordinates. Instead, we want to start drawing a completely new line. To achieve this behavior, CuttingPaintableTexture overrides the endPainting() virtual function defined previously in PaintableTexture, in order to reset the 'validPreviousTexCoord' variable to false. The endPainting() function is called whenever the haptic device breaks contact with the shape's geometry. The result is that when the device makes contact with the geometry again, there will be no valid previous texture coordinate, and nothing will be drawn until the next iteration, meaning that a break is introduced into the path of the rendered cutting line:

Source Code: CuttingPaintableTexture.cpp

// Override to customise drawing to texture. Called once on loss of contact with surface.
void CuttingPaintableTexture::endPainting () {
  // Invalidate the previous texture coordinate so that a new line is started
  validPreviousTexCoord= false;
}

Putting it all Together

All the necessary specialized scene graph nodes to create our simulation have now been coded in C++. Our final task is to 'slot' those nodes together using X3D, and 'glue' them together using Python. More specifically, we'll define the structure of the scene graph (which will include our custom nodes) using X3D, and use Python to help set up some routes between the nodes, and to perform various other miscellaneous tasks. We'll begin by taking a look at the main X3D file, which is loaded by the viewer, CuttingSimulation.x3d. This defines the root scene graph representing our simulation.

Our scene contains one Shape node, which provides a surface to cut with the scalpel. To create a slightly more realistic appearance, we will use H3D's DeformableShape node to provide the visual effect of deformation when the scalpel pushes against the shape. To add our own visual and haptic cutting effects to the DeformableShape, we will create instances of the new nodes that we defined above, and assign them to fields of the DeformableShape's Appearance node. But for the moment, here is the X3D code for the DeformableShape representing the cutting surface, without any of our custom nodes added:

Source Code: CuttingSimulation.x3d

<!-- The cutting/skin area -->
<DeformableShape DEF="CuttingSurface" >
  <GaussianFunction width="0.03" containerField="distanceToDepth"/>
  <Rectangle2D size="0.4 0.3" DEF="HapticGeometry" solid="false"
               containerField="hapticGeometry" />

The CoordinateDeformer node, and the H3DFunctionNode contained within it, define how the coordinates of the shape's geometry will deform when touched by the haptic device. Here we have used a GaussianFunction node to produce a 'bell-shaped' deformation at the location of the contact point.

To finalize the deformation effect there are two further steps required, which will be performed by a Python script. The Python code for this script is adapted from the DeformableShape demo shipped with H3DAPI. You will notice that a Rectangle2D node has been used to specify the geometry of this shape that can be felt by the haptic device (the 'hapticGeometry' field of the Shape node). This makes the haptic rendering process simple and efficient because the geometry only contains 4 vertices.

However, for the very same reason the Rectangle2D node is not suitable for use as our graphical geometry as well (the 'geometry' field of the Shape node). For the CoordinateDeformer of the DeformableShape to produce a realistic deformation, more vertices are required. The majority of the code in the script is dedicated to generating a flat plane used for the graphical geometry of the shape, which matches the haptic geometry provided by Rectangle2D, but consists of many more vertices. The flat graphical plane is constructed using an IndexedTriangleSet.

Using the Python Script

To use the script, from X3D we create a PythonScript node and pass it the DeformableShape instance via its 'references' field, so that the script can assign the geometry that it generates. With the USE attribute, we can create a reference to another X3D node, by specifying the name that the node was originally given using DEF. So in the code below, we are not passing a new instance to the PythonScript, but the DeformableShape instance already declared above. The 'references' field is an MField, so we could pass in as many nodes as we like in this way, and they will all be added to the list.

Source Code: CuttingSimulation.x3d

<!-- This python script generates the visual cutting surface geometry -->
<!-- and sets properties of the haptic devices in the scene           -->
<PythonScript url="python/">
 <DeformableShape USE="CuttingSurface" containerField="references" />

From within the script we can then access the 'references' field of the PythonScript node. This contains a list of all the node instances that we passed into the PythonScript node from X3D, in the order in which they were listed. In this case we expect just one node, the DeformableShape:

Source Code:

# Get the node references passed in to the script
deform_node, = references.getValue()

Generating the Graphic Geometry

The following Python code is taken from the DeformableShape demo shipped with H3DAPI, and simply creates a 31x31 grid of coordinates (and texture coordinates) to use in the graphical geometry of the shape. Notice that we first get the size of the haptic geometry from the DeformableShape node, so that we can generate graphic geometry of a size that matches.

Source Code:

# This python script creates a rectangular IndexedTriangleSet with
# columns X rows coordinates.
# Get the size of the graphic geometry from the size of the haptic geometry
size = deform_node.hapticGeometry.getValue().size.getValue()
columns = 31
rows = 31
coords = []
tex_coords = []
index = []
step_c = size.x / (columns-1)
step_r = size.y / (rows-1)
tc_step_c = 1.0/ (columns-1)
tc_step_r = 1.0/ (rows-1)
for c in range( columns ):
  for r in range( rows ):
    coords.append( Vec3f( step_c * c - size.x / 2, step_r * r - size.y/2, 0 ) )
    tex_coords.append( Vec2f( tc_step_c * c, tc_step_r * r ) )
for c in range( columns - 1 ):
  for r in range( rows - 1 ):
    v0 = r * columns + c 
    v1 = r * columns + c+1
    v2 = (r+1) * columns + c+1
    v3 = (r+1) * columns + c
    index = index + [v0, v1, v2, v0, v2, v3 ]
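As a quick standalone sanity check of the triangulation indexing used above, we can run the same index-generation loop on a small 3x3 grid (instead of 31x31) and verify that each grid cell contributes exactly two triangles and that every index refers to a valid vertex:

```python
# Standalone check of the grid triangulation scheme from the script above,
# run on a small 3x3 grid for readability.
columns = rows = 3
index = []
for c in range(columns - 1):
    for r in range(rows - 1):
        v0 = r * columns + c          # bottom-left corner of the cell
        v1 = r * columns + c + 1      # bottom-right
        v2 = (r + 1) * columns + c + 1  # top-right
        v3 = (r + 1) * columns + c    # top-left
        index += [v0, v1, v2, v0, v2, v3]  # two triangles per cell

# Each of the (columns-1)*(rows-1) cells contributes six indices
assert len(index) == (columns - 1) * (rows - 1) * 6
# Every index refers to one of the columns*rows grid vertices
assert max(index) == columns * rows - 1
print(index[:6])   # first cell: [0, 1, 4, 0, 4, 3]
```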

Now that the coordinates have been created, the script creates a Coordinate node and a TextureCoordinate node that use them. An IndexedTriangleSet node is created to represent the geometry of the shape, and the two coordinate nodes are assigned to it:

Source Code:

# Now create the geometry node and its coordinate nodes
its =  createX3DNodeFromString( "<IndexedTriangleSet DEF=\"CUTTING_GEOM\" solid=\"FALSE\" />" )[0]
coord = createX3DNodeFromString( "<Coordinate />" )[0]
coord.point.setValue( coords )
tex_coord = createX3DNodeFromString( "<TextureCoordinate />" )[0]
tex_coord.point.setValue( tex_coords )
its.index.setValue( index )
its.coord.setValue( coord ) 
its.texCoord.setValue( tex_coord )

Finally, the geometry is applied to the DeformableShape originally passed into the script via the 'references' field:

Source Code:

# Apply the geometry node to the DeformableShape node
deform_node.geometry.setValue( its )

Proxy Weighting

As well as creating the geometry for the DeformableShape, the script performs one other task. The amount that the DeformableShape's geometry deforms depends, in part, on the position at which the graphical representation of the haptic device stylus is rendered. This is because it would look odd to have the surface deform by a large amount while the haptic device stylus only appears to move a small distance into the surface. Therefore the two are linked, and we can control the magnitude of the deformation by controlling the position at which the haptic device stylus is rendered.

The device stylus is rendered at the position given by the 'weightedProxyPosition' field of the H3DHapticsDevice node. This position lies on a line somewhere between the actual device position and the proxy position, which gives the stylus the appearance of being able to push slightly into a surface, whilst still appearing to be constrained by the surface. The distance at which the stylus is rendered between the device position and the proxy is controlled by the 'proxyWeighting' field of the H3DHapticsDevice. If this value is 0, the stylus will be rendered at the device position in world coordinates (the 'trackerPosition' field); if it is 1, the stylus position will be equal to the proxy position (the 'proxyPosition' field). The following single-line extract from H3DHapticsDevice.h shows how the weightedProxyPosition is calculated:

Source Code (H3DAPI): H3DHapticsDevice.h

value =  tracker_pos + weighting * ( proxy_pos - tracker_pos );
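The same one-line formula can be sketched in Python to compare the effect of different proxyWeighting values. The positions below are made-up example values, with the device pushed 1 cm below a surface at z = 0:

```python
# The weightedProxyPosition formula from H3DHapticsDevice.h, in Python:
# a linear interpolation between the device (tracker) and proxy positions.
def weighted_proxy_position(tracker_pos, proxy_pos, weighting):
    return tuple(t + weighting * (p - t) for t, p in zip(tracker_pos, proxy_pos))

tracker = (0.0, 0.0, -0.01)   # device pushed 1 cm below the surface
proxy   = (0.0, 0.0,  0.0)    # proxy constrained to the surface

# weighting = 0.95 (the default): stylus drawn very close to the surface
print(weighted_proxy_position(tracker, proxy, 0.95))
# weighting = 0.8 (this demo): stylus sinks 4x further in, so the
# surface appears to deform more
print(weighted_proxy_position(tracker, proxy, 0.8))
```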

The following code segment from the script is used to loop over all haptic devices in the scene and set the 'proxyWeighting' field to 0.8. This value allows the surface to deform more visibly than the default value for the field, which is 0.95.

Source Code:

# Loop over all haptic devices in the scene
di = getActiveDeviceInfo()
if( di ):
  devices = di.device.getValue()
  for d in devices:
    # Define the position of the stylus between the device and
    # the proxy. Controls how much the surface deforms visually.
    # Closer to 0 = More deformable
    d.proxyWeighting.setValue( 0.8 )

The Scalpel Stylus Model

The scalpel stylus model being designed using Blender, an open source 3D modeling application.

The 3D model that gives the stylus of the haptic device the graphical appearance of a scalpel was created using Blender, an open source 3D modeling tool. The model was exported to the X3D file Scalpel.x3d. When creating a model to use as the stylus of the haptic device, it is important to remember that the origin (0, 0, 0) of the stylus model will correspond to the point at which the stylus appears to interact with (touch) the haptic shapes in the scene. So for our scalpel model, the origin should be somewhere on the tip of the blade. This can always be fine-tuned after the stylus model is exported from the 3D modeling program, by wrapping a Transform node around the entire model and adjusting the 'translation' field until the point of haptic interaction is aligned with the desired point on the stylus model. To actually use the model contained in Scalpel.x3d as the haptic device stylus, we use the following Python code:

Source Code:

# Loop over all haptic devices in the scene
di = getActiveDeviceInfo()
if( di ):
  devices = di.device.getValue()
  for d in devices:
    # Specify a node to use as the graphical representation of the haptic device
    d.stylus.setValue ( createX3DNodeFromURL ( "x3d/Scalpel.x3d" )[0] )

This code loops over all the haptic devices in the scene and creates the scalpel model from file using createX3DNodeFromURL(). The resulting node is then assigned to the 'stylus' field of the haptic device, ensuring that it is used as the graphical representation of the haptic device in the simulation.

Now we have a flat, deformable surface, and a scalpel model representing the stylus of the haptic device. Next we'll use the nodes we created earlier in C++ to add the graphical and haptic effects of cutting to the DeformableShape.

Importing our Nodes

Before we can use any of our custom nodes defined in C++, we need to import them using the ImportLibrary node. The shared library containing our nodes is automatically copied into the 'bin' and/or 'lib' subdirectories when the 'install' target is built (e.g. when 'make install' is executed as described above). The exact name and location of the library varies depending on the platform, so we'll include three lines (for Windows, Mac and Linux) and hope that one succeeds:

Source Code: CuttingSimulation.x3d

    <!-- Import additional H3D nodes from CuttingNodes project -->
    <ImportLibrary library="bin\\CuttingNodes" />            <!-- Windows -->
    <ImportLibrary library="lib/libCuttingNodes.dylib" />    <!-- MacOSX -->
    <ImportLibrary library="lib/" />       <!-- Linux/Unix -->

Using the DirectionalFrictionSurfaceNode

To use our DirectionalFrictionSurfaceNode in our simulation, we simply add a few lines to CuttingSimulation.x3d, shown in the next code segment. The highlighted lines specify that we'd like to create a DirectionalFrictionSurfaceNode and assign it to the 'surface' field of the Appearance node of the DeformableShape. This means that when we touch the DeformableShape with the haptic device, our custom DirectionalFrictionSurfaceNode will be used to calculate proxy movement and forces, and so we will feel the directional friction effect that we created above.

Embedded within our DirectionalFrictionSurfaceNode we have specified a standard FrictionalSurface to use as the 'wrappedSurface' value. We have named the node instances with DEF so that later we can set up routes to control the friction parameters from our user interface. Alternatively, we could have specified the friction parameters directly in the X3D code (e.g. <DirectionalFrictionSurfaceNode dynamicFriction="1.2"> ...). And that's all that's required to use our custom friction surface.

Source Code: CuttingSimulation.x3d

<!-- The cutting/skin area -->
<DeformableShape DEF="CuttingSurface" >
 <!-- The directional friction surface (and wrappedSurface) -->
 <DirectionalFrictionSurfaceNode DEF="DirectionalFrictionSurface">
  <FrictionalSurface DEF="FrictionSurface" containerField="wrappedSurface" />
 </DirectionalFrictionSurfaceNode>

Using the CuttingPaintableTexture node

To use our CuttingPaintableTexture on the DeformableShape in our simulation, we could use X3D to create a CuttingPaintableTexture node and assign it to the 'texture' field of the DeformableShape's Appearance node. However, as well as the texture representing the cut, we would also like to display an image representing the skin. To show both textures, we will use a MultiTexture node, which allows us to layer several textures to create a single texture that can be applied to a shape. For the skin, we will use an ImageTexture, and place it on the bottom layer of the MultiTexture. For the cut, we will use a CuttingPaintableTexture placed on the next layer up in the MultiTexture. The result is that we see the cut superimposed on the skin texture.
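The layering idea can be illustrated with a per-pixel sketch in Python. This is a rough stand-in using standard "over" alpha blending, not a reproduction of H3D's actual MultiTexture blend modes; the pixel values are placeholders:

```python
# Rough sketch of layering a cut texel over a skin texel ("over" blending).
# Illustrative only; not H3D's actual MultiTexture pipeline.
def over(top, bottom):
    """Composite an RGBA 'top' pixel over an opaque RGB 'bottom' pixel."""
    r, g, b, a = top
    return tuple(a * c + (1.0 - a) * d for c, d in zip((r, g, b), bottom))

skin_pixel = (0.9, 0.7, 0.6)        # bottom layer: skin texture (RGB)

cut_pixel = (1.0, 0.0, 0.0, 0.0)    # top layer: untouched texel, alpha 0
print(over(cut_pixel, skin_pixel))  # skin shows through unchanged

cut_pixel = (1.0, 0.0, 0.0, 1.0)    # fully painted cut texel
print(over(cut_pixel, skin_pixel))  # pure red cut line
```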

Source Code: CuttingSimulation.x3d

<!-- The cutting/skin area -->
<DeformableShape DEF="CuttingSurface" >
  <!-- Layered texture containing cut effect on top of skin texture -->
   <CuttingPaintableTexture DEF="CuttingTexture" 
        width="512" height="512"
        paintColor="1 0 0 1" colorForceWeight="0.3" 
        widthForceWeight="0.5" maxWidth="5" />
   <ImageTexture url="images/skin.jpg" />

The final step in using the CuttingPaintableTexture node is to provide the texture node with the values it requires to render the cutting line: namely, the texture coordinates of the current device/surface contact position, and the contact force. These values are available from the X3DGeometryNode contained in the Shape's 'hapticGeometry' field. The X3DGeometryNode has a number of MFields containing arrays of values that relate information about the X3DGeometryNode to a specific haptic device instance.

For example, the MFVec3f field 'force' contains an array of vectors describing the contact force on the geometry for each haptic device in the scene. Each force is indexed by the index of the haptic device, which is defined by the order in which the device appears in the DeviceInfo node. Similarly, the X3DGeometryNode contains a 'contactTexCoord' field, which contains the texture coordinate at the point of contact with the geometry for each device. An MFBool field 'isTouched' indicates which of the haptic devices is currently in contact with the geometry.

Here is the X3D code which creates the PythonScript node and passes in references to the X3DGeometryNode and CuttingPaintableTexture instances, so that the script can set up the routes between them:

Source Code: CuttingSimulation.x3d

<!-- This python script routes the contact information from the haptic geometry -->
<!-- to the cutting texture in order to draw the visual feedback of the cutting -->
<PythonScript url="python/">
 <CuttingPaintableTexture USE="CuttingTexture" containerField="references" />
 <Rectangle2D USE="HapticGeometry" containerField="references" />

The script itself is shown in the next code segment. First, we declare the new field types by subclassing TypedField and overriding the update() function. Note the parameters given to the TypedField class (which acts like a template class in C++). The first parameter is the type that this field will have. The second parameter is the type of field that can be routed to it. In both examples below, our field is an SField type that can have an MField type routed to it. After the field types are declared, we create instances of them to perform the conversions:

Source Code:

from H3DInterface import *

# A field type which takes the value of the first bool in the MFBool routed to it
class FirstBool( TypedField(SFBool, MFBool) ):
  def update( self, event ):
    contacts= event.getValue()
    if ( len(contacts) > 0 ):
      return contacts[0]
    return False

# A field type which takes the value of the first Vec3f in the MFVec3f routed to it
class FirstVec3f( TypedField(SFVec3f, MFVec3f) ):
  def update( self, event ):
    forces= event.getValue()
    if ( len(forces) > 0 ):
      return forces[0]
    return Vec3f(0, 0, 0)

# Create field instances
firstContact= FirstBool()
firstTexCoord= FirstVec3f()
firstContactForce= FirstVec3f()

Next we set up the actual routes. This could have been done in X3D too, but we'll do it in Python. First we need the references to the X3DGeometryNode and CuttingPaintableTexture that we passed in earlier:

Source Code:

# Get references to nodes passed in
paintableTexture, geometry = references.getValue()

Then for each of the three fields ('isTouched', 'contactTexCoord' and 'force') we route the values from 'geometry' to 'paintableTexture':

Source Code:

# Set up routes from geometry to paintable texture
geometry.isTouched.route ( firstContact )
firstContact.route ( )
geometry.contactTexCoord.route ( firstTexCoord )
firstTexCoord.route ( paintableTexture.paintAtTexCoord )
geometry.force.route ( firstContactForce )
firstContactForce.route ( paintableTexture.force )

So now, when our X3DGeometryNode is touched by the haptic device, the contact status, force and texture coordinate will all be routed to the CuttingPaintableTexture and so generate the visual appearance of cutting.

The User Interface

Next we'll describe how the user interface (the slider bars etc. at the top of the screen) is implemented. The sliders, labels and buttons are nodes provided by the H3D UI component. The runtime part of this component (everything needed to load X3D files containing UI nodes) is installed by the Windows H3DAPI installer, and the full source code can be downloaded here.

Defining the User Interface

Since the user interface elements are implemented as X3D nodes, creating the user interface is just a matter of writing some X3D code. To keep the main X3D file clean and readable, the code for the user interface is contained in a separate file, x3d/UI.x3d. The code segment below shows just the X3D code for one slider bar and its labels. Notice that the entire user interface is inside a Billboard node, which ensures that it always faces the viewer, even when the viewpoint is changed. Setting the 'axisOfRotation' to (0, 0, 0) indicates that the Billboard should keep its y-axis aligned with the viewer as well. There is also a DirectionalLight node to illuminate the user interface. Because the 'direction' field of the light is in the local coordinate system of the Billboard, the controls will always be lit from directly in front, even when the viewpoint is changed.

Source Code: UI.x3d

<!-- Billboard containing UI controls - always faces viewer -->
<Billboard axisOfRotation='0 0 0'>
  <!-- Light to illuminate the UI controls -->
  <DirectionalLight direction='0 0 -1' />
  <Frame DEF="SliderFrame">
    <!-- Slider name label -->
    <Label text="Dynamic">
      <FontStyle DEF="FS" justify='"MIDDLE" "MIDDLE"' size="0.01"
                 family='"Verdana" "FreeSerif"' />
      <ImageTexture DEF="ButtonTexture" url="../images/button.png" />
      <GridInfo columnSpan="3" row="0" padding="0.0001 0.001" />
    </Label>
    <!-- The Slider -->
    <SliderBar DEF="SB_AgainstDynFric" valueRange="0 2" value="1.5" >
      <ImageTexture DEF="SliderTexture" url="../images/slider.png" />
      <GridInfo columnSpan="10" column="3" row="0"
                padding="0.0001 0.001" sticky="W+E+N+S"/>
    </SliderBar>
    <EXPORT localDEF="SB_AgainstDynFric" AS="SB_AgainstDynFric" />
    <!-- Slider value label -->
    <Label DEF="L_AgainstDynFric">
      <FontStyle USE="FS" />
      <ImageTexture USE="ButtonTexture" />
      <GridInfo columnSpan="3" column="13" row="0"
                padding="0.0009 0.001" sticky="W+E+N+S"/>
    </Label>
  </Frame>
</Billboard>
Note: Because we will be embedding this file (UI.x3d) within another (CuttingSimulation.x3d), in order to access the nodes defined in UI.x3d from CuttingSimulation.x3d we need to add an EXPORT line for each node that we want to access. In the above code we only need to export the SliderBar node.

There are three visible user interface nodes in the code above. First, there is a Label to indicate to the user what the slider bar is for. Here the 'text' field is set to the string "Dynamic", which is displayed on the Label to indicate that the slider controls the dynamic friction.

Then comes the SliderBar itself. The 'valueRange' and 'stepLength' fields control the values that the user can specify using the slider, and the 'value' field is the current value indicated by the slider. We give the SliderBar a name with DEF="SB_AgainstDynFric" so that we can reference it later.

Finally, there is another label, this time to display the current numerical value of the slider bar. Since the text on this label will change dynamically, there is no point specifying any text for it here, so we just give it a name with DEF="L_AgainstDynFric", and will reference this later.

So the slider's value will need to be routed both to the 'dynamicFriction' field of the FrictionSurface that it controls, and to the label displayed next to it, to update the value shown there. Let's deal with the label first.

The SliderBar's 'value' field (which is of type SFFloat) cannot be routed directly to the Label's 'text' field (which is an MFString), because the types do not match. Therefore we'll define a short Python script with a field type that converts an SFFloat into an MFString. The field will be an MFString containing a single formatted string representation of the SFFloat value routed to it:

Source Code:

from H3DInterface import *

# A field type whose value will be an MFString containing one string:
# the formatted string representation of the SFFloat field routed to it.
class FloatToString ( TypedField ( MFString, SFFloat ) ):
  def update ( self, event ):
    return ["%.2f" % event.getValue()]
# Create one FloatToString instance for each SliderBar/Label
againstDynFric= FloatToString()
againstStaticFric= FloatToString()
withDynFric= FloatToString()
withStaticFric= FloatToString()
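The formatting step itself is ordinary Python. As a standalone sketch (with H3D's event object stood in for by a plain float), the conversion behaves like this:

```python
# Standalone sketch of the conversion FloatToString performs:
# an SFFloat value becomes a one-element list (an MFString in
# H3D terms) holding the value formatted to two decimal places.
def float_to_mfstring(value):
    return ["%.2f" % value]

print(float_to_mfstring(0.25))  # prints ['0.25']
print(float_to_mfstring(1.5))   # prints ['1.50']
```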

We create as many instances of the FloatToString field as there are slider bars. Then, in UI.x3d, for each SliderBar and its associated Label, we set up a route from the SliderBar's "value" field to the appropriate FloatToString field in the PythonScript node, and from there to the Label's "text" field:

Source Code: UI.x3d

<!-- Use python script to update slider labels with values -->
<PythonScript DEF="UI_PS" url="../python/" />
<ROUTE fromNode="SB_AgainstDynFric" fromField="value" toNode="UI_PS" toField="againstDynFric" />
<ROUTE fromNode="UI_PS" fromField="againstDynFric" toNode="L_AgainstDynFric" toField="text" />
Using the User Interface

To actually display the user interface defined in UI.x3d, the nodes created by UI.x3d are embedded in CuttingSimulation.x3d using the Inline node:

Source Code: CuttingSimulation.x3d

<!-- Import User Interface -->
<Inline DEF="UI" url="x3d/UI.x3d" />

In order to make the user interface affect the model, we set up routes from the user interface to the fields of nodes in the model. To reference the nodes embedded within the Inline node (by their DEF names), we need to add an IMPORT statement for each node we'll refer to. This is the counterpart to the EXPORT statement used previously in UI.x3d. In the following code segment we set up a route to control the dynamic friction parameter using the slider bar declared earlier. This is repeated for each SliderBar in the interface.

Source Code: CuttingSimulation.x3d

<!-- Setup routes between model and UI -->
<!-- Friction against the line of cutting -->
<IMPORT inlineDEF="UI" exportedDEF="SB_AgainstDynFric" AS="SB_AgainstDynFric"/>
<ROUTE fromNode="SB_AgainstDynFric" fromField="value" toNode="FrictionSurface" toField="dynamicFriction" />
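The same IMPORT/ROUTE pair is repeated for each of the other three sliders. Assuming the naming convention above carries over (the node names below are assumptions for illustration; check the demo source for the actual DEF names), the pair for static friction against the cutting direction might look like:

```xml
<!-- Static friction against the line of cutting (node names assumed) -->
<IMPORT inlineDEF="UI" exportedDEF="SB_AgainstStaticFric" AS="SB_AgainstStaticFric"/>
<ROUTE fromNode="SB_AgainstStaticFric" fromField="value" toNode="FrictionSurface" toField="staticFriction" />
```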

You can learn more about the H3D UI component here.


We have described how to create a reasonable illusion of cutting with a scalpel by combining some simple haptic and graphic effects. Hopefully, in the process, we have learned some general points about how to create custom H3D nodes, such as H3DSurfaceNode subclasses, and how to use them in the scene graph to customize the haptic and graphic rendering.

Please use the forums to post any queries about this tutorial or the demo application, or contact me (neilf).
