
Programmable shaders are programs that users may add to modify the graphics rendering pipeline. Instead of proceeding through the default fixed rendering pipeline, the programmable shaders will be used instead. This gives the developer flexibility in defining visual effects that may not be possible with existing nodes.

H3DAPI supports both the Cg and GLSL shading languages. The examples here demonstrate the use of these two languages in H3D programs. Note also that some common shaders are implemented as nodes. For example the PhongShader node. When creating a new shader, if it is very common then consider implementing it as node.

Tip: This tutorial refers to the source code. You can download it from SVN at the H3D release branch, or find it in H3D/H3DAPI/examples/Shader nodes/.

Shading with nVidia Cg

<!-- cg_glass.x3d -->
  <Background backUrl  ="../textures/b_arch_00000.png"
              frontUrl ="../textures/f_arch_00000.png"
              leftUrl  ="../textures/l_arch_00000.png"
              rightUrl ="../textures/r_arch_00000.png"
              topUrl   ="../textures/t_arch_00000.png"
              bottomUrl="../textures/d_arch_00000.png" />
  <Viewpoint position="0 0 0.6" />
  <NavigationInfo type="NONE"/>
  <DynamicTransform angularMomentum="-0.40 0.22 0.5">
    <FitToBoxTransform boxCenter="0 0 0" boxSize="0.25 0.25 0.25"
                       uniformScalingOnly="true" active="true">
      <Shape>
        <Appearance>
          <Material />
          <!-- Programmable shaders here -->
        </Appearance>
        <SuperShape DEF="SS" resolution="128"
                    ss1_m="1" ss1_a="1.1" ss1_b="1.88"
                    ss1_n1="3.41" ss1_n2="-0.24" ss1_n3="19.07"
                    ss2_m="4" ss2_a="1" ss2_b="1"
                    ss2_n1="110" ss2_n2="100" ss2_n3="70" />
      </Shape>
    </FitToBoxTransform>
  </DynamicTransform>

The code excerpt above shows the scene setup; the shader nodes themselves are omitted for now. As can be seen, the programmable shader is applied to the scene as part of the Appearance node. Otherwise, scene setup is as usual. We define the background with a Background node and the images that form its front, back, left, right, top and bottom faces. A Viewpoint is then set so that the viewer observes the scene from position (0, 0, 0.6). The NavigationInfo type is set to NONE to disable viewpoint changes.

The Shape uses a SuperShape geometry and is transformed with the FitToBoxTransform node, which fits the Shape into a box of the specified boxSize at location boxCenter. uniformScalingOnly is set to true so that the shape is scaled uniformly, and active is set to true so that the transformation matrix is updated.
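Conceptually, a uniform fit amounts to picking the largest single scale factor that keeps the shape's bounding box inside the target box. A minimal Python sketch of that calculation (illustrative only, not H3DAPI source; the shape dimensions are made up):

```python
# Sketch of the uniform scale factor FitToBoxTransform needs when
# uniformScalingOnly="true": the tightest axis limits the scale.

def uniform_fit_scale(shape_size, box_size):
    """Largest uniform scale that keeps the shape inside the box."""
    return min(b / s for s, b in zip(shape_size, box_size))

# A hypothetical 0.5 x 1.0 x 0.25 shape fitted into the 0.25^3 box
# from the example scene; the 1.0 dimension limits the scale:
scale = uniform_fit_scale((0.5, 1.0, 0.25), (0.25, 0.25, 0.25))
print(scale)  # 0.25
```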

Now that the scene is set, we can get to the meat of this tutorial: the shaders.

<!-- cg_glass.x3d -->
          <ProgramShader DEF="SHADER" language="CG" >
            <ShaderProgram DEF="FRAGMENT_SHADER" type="FRAGMENT"
                           url="Shaders/cg_Refraction.frag" >
              <field name="EnvMap" type="SFNode" accessType="inputOutput">
                <ComposedCubeMapTexture>
                  <ImageTexture url="../textures/b_arch_00000rot.png"
                                containerField="back" />
                  <ImageTexture url="../textures/f_arch_00000rot.png"
                                containerField="front" />
                  <ImageTexture url="../textures/l_arch_00000rot.png"
                                containerField="left" />
                  <ImageTexture url="../textures/r_arch_00000rot.png"
                                containerField="right" />
                  <ImageTexture url="../textures/t_arch_00000.png"
                                containerField="top" />
                  <ImageTexture url="../textures/d_arch_00000.png"
                                containerField="bottom" />
                </ComposedCubeMapTexture>
              </field>
              <field value="1" name="enableRefraction" type="SFFloat"
                     accessType="inputOutput"/>
              <field value="1" name="enableFresnel"
                     type="SFFloat" accessType="inputOutput"/>
            </ShaderProgram>

We specify the use of shaders by adding an X3DProgrammableShaderNode. Here, ProgramShader is used and its language field is set to CG to indicate that we will be using the Cg shading language. The ProgramShader node defines a shader that can consist of one or more individually programmable, self-contained pieces. These pieces are defined inside ProgramShader as ShaderProgram nodes. We use two ShaderProgram nodes in this example: one as the fragment shader and one as the vertex shader. Together they give the SuperShape a refractive, glass-like appearance. The vertex shader modifies the values of the SuperShape vertices and the fragment shader defines the pixels drawn to the screen.

In the excerpt above, the type field of the ShaderProgram is set to FRAGMENT to indicate that this object should be compiled as a fragment shader. The url field specifies the code for the fragment shader program.

Fields in a ShaderProgram are specified using the field tag. Three fields are specified for this fragment shader: EnvMap, with a ComposedCubeMapTexture as its value, and enableRefraction and enableFresnel, both of type SFFloat with value 1. These fields are available to be manipulated in the fragment shader program.
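The Fresnel weighting that enableFresnel switches on is standard glass-shading math: reflections dominate at grazing angles, refraction dominates head-on. A plain-Python sketch of Schlick's approximation (illustrative only, not the actual contents of cg_Refraction.frag):

```python
# Illustrative only: the Fresnel reflectance weight a glass shader
# typically computes (Schlick's approximation). The refractive
# index value 1.5 (air -> glass) is an assumption.

def schlick_fresnel(cos_theta, eta=1.5):
    """Fraction of light reflected, given the cosine of the angle
    between the view direction and the surface normal."""
    r0 = ((1.0 - eta) / (1.0 + eta)) ** 2
    return r0 + (1.0 - r0) * (1.0 - cos_theta) ** 5

# Head-on views mostly refract; grazing views mostly reflect:
print(schlick_fresnel(1.0))  # close to 0.04
print(schlick_fresnel(0.0))  # close to 1.0
```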

<!-- cg_glass.x3d -->
            <ShaderProgram DEF="VERTEX_SHADER" type="VERTEX"
                           url="Shaders/cg_Refraction.vert" />
          </ProgramShader>

A vertex shader is then defined by adding another ShaderProgram with type VERTEX. The code in cg_Refraction.vert will be compiled as a vertex shader program.

Glass effect modelled with shaders written in Cg.

Shading with OpenGL GLSLang

This example shows the use of GLSL as the shading language in our program. The process is the same: add an X3DProgrammableShaderNode in Appearance, define the fields and include the vertex and fragment shader program files, only now the shaders are written in GLSL.

<!-- glsl_earthshader.x3d -->
  <IMPORT inlineDEF='H3D_EXPORTS' exportedDEF='HDEV' AS='HDEV' />
  <Shape>
    <Appearance>
      <Material />
      <!-- Programmable shader here -->
    </Appearance>
    <Sphere radius="0.1" />
  </Shape>

As usual, the relevant nodes for programmable shaders will reside within Appearance.

<!-- glsl_earthshader.x3d -->
      <ComposedShader DEF="SHADER" language="GLSL" >
        <field name="EarthDay" type="SFNode" accessType="inputOutput">
          <ImageTexture DEF="EARTH_DAY" url="../textures/Day.jpg" />
        </field>
        <field name="EarthNight" type="SFNode" accessType="inputOutput">
          <ImageTexture DEF="EARTH_NIGHT" url="../textures/Night.jpg" />
        </field>
        <field name="EarthCloudGloss" type="SFNode" accessType="inputOutput">
          <ImageTexture DEF="EARTH_CLOUDS" url="../textures/Clouds.jpg" />
        </field>
        <field name="lightPosition" type="SFVec3f" value="0.45 0 -0.45"
               accessType="inputOutput" />
        <field name="viewpointPosition" type="SFVec3f" value="0 0 0.6"
               accessType="inputOutput" />
        <field name="viewpointOrn" type="SFMatrix4f" value="1 0 0 0
                                                            0 1 0 0
                                                            0 0 1 0
                                                            0 0 0 1"
               accessType="inputOutput" />
        <ShaderPart type="FRAGMENT" url="Shaders/glsl_Earth.frag" />
        <ShaderPart type="VERTEX" url="Shaders/glsl_Earth.vert" />
      </ComposedShader>

The ComposedShader node is used this time instead of ProgramShader. ComposedShader defines a shader whose individual source files are not individually programmable, which is the case for GLSL programs. With ComposedShader, all access to the shading capabilities is defined through a single interface that applies to all parts. The fields are defined in the same way as in the Cg example above. Six fields are defined and accessible in both the vertex and fragment shaders:

  • EarthDay, EarthNight and EarthCloudGloss with ImageTextures as values
  • lightPosition, which we will later define as the position of the haptics device tracker
  • viewpointPosition and viewpointOrn

In the vertex shader program we use the last three fields to determine the colours of the Sphere vertices, while the fragment shader defines the texturing of the globe.
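The day/night mix that such an Earth shader performs reduces to a clamped Lambert term: how directly each point on the sphere faces the light decides how much of the day texture shows. A plain-Python sketch with made-up colour values (not the actual contents of glsl_Earth.frag):

```python
# Illustrative only: blending a day and a night texel by a clamped
# Lambert term, as a day/night Earth shader typically does.

def blend_day_night(day_rgb, night_rgb, normal, light_dir):
    """Mix day and night colours by how directly the surface
    faces the light; both vectors are assumed unit length."""
    intensity = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return tuple(d * intensity + n * (1.0 - intensity)
                 for d, n in zip(day_rgb, night_rgb))

# A fully lit point shows the day colour, the far side shows night:
print(blend_day_night((0.2, 0.5, 0.9), (0.05, 0.05, 0.1),
                      (0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))
```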

<!-- glsl_earthshader.x3d -->
  <PythonScript DEF="PS" url="" >
    <ComposedShader USE="SHADER" containerField="references"/>
  </PythonScript>
  <ROUTE fromNode="HDEV" fromField="trackerPosition"
         toNode="SHADER" toField="lightPosition" />

The only work left in this example is the routing. The PythonScript takes the shader as a reference and performs a type conversion before setting up the routes. One route is defined directly in X3D: the device's trackerPosition is routed to the shader's lightPosition.

from H3DInterface import *

class SFRotation2SFMatrix4f( TypedField( SFMatrix4f, (SFRotation,) ) ):
  def __init__( self, inverse ):
    TypedField( SFMatrix4f, (SFRotation,) ).__init__( self )
    self.inverse = inverse
  def update( self, event ):
    if self.inverse:
      return Matrix4f( event.getValue() ).inverse()
    return Matrix4f( event.getValue() )

vertex_shader, = references.getValue()
rotation2Matrix = SFRotation2SFMatrix4f( 0 )
rotation2MatrixInverse = SFRotation2SFMatrix4f( 1 )

The SFRotation2SFMatrix4f field class takes an input (incoming route) of type SFRotation, converts and outputs it as an SFMatrix4f. It generates an inverse matrix if indicated.
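The conversion itself is standard axis-angle math (an SFRotation holds an axis and an angle). For reference, the rotation part of such a matrix can be built as follows (pure-Python sketch of Rodrigues' formula; H3D's Matrix4f constructor performs the equivalent internally):

```python
import math

def rotation_to_matrix3(x, y, z, angle):
    """3x3 rotation matrix from an axis-angle rotation
    (Rodrigues' formula); the axis (x, y, z) must be unit length."""
    c, s, t = math.cos(angle), math.sin(angle), 1.0 - math.cos(angle)
    return [
        [t*x*x + c,   t*x*y - s*z, t*x*z + s*y],
        [t*x*y + s*z, t*y*y + c,   t*y*z - s*x],
        [t*x*z - s*y, t*y*z + s*x, t*z*z + c  ],
    ]

# A 90-degree turn about the y axis maps the +z axis to +x:
m = rotation_to_matrix3(0.0, 1.0, 0.0, math.pi / 2)
```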

Two fields are created from the class, rotation2Matrix and rotation2MatrixInverse.

We also get the reference to the shader and store it in vertex_shader.
flag = 1

def traverseSG():
  vp = getActiveViewpoint()
  global flag
  if vp and flag:
    rotation2Matrix.route( vertex_shader.viewpointOrn )
    vp.totalOrientation.route( rotation2Matrix )
    vp.totalPosition.route( vertex_shader.viewpointPosition )
    flag = 0
    try:
      # Go to the except block in case the viewpointOrnInv field
      # does not exist in the shader.
      rotation2MatrixInverse.route( vertex_shader.viewpointOrnInv )
      vp.totalOrientation.route( rotation2MatrixInverse )
    except:
      # Do nothing, since the field is not needed.
      pass

This code excerpt merely sets up necessary routes between the fields that we have created. Our intention is to route totalOrientation and totalPosition fields of the scene's active viewpoint to viewpointOrn and viewpointPosition of ComposedShader. Direct routing between these fields cannot be done as the field types are different. As such, rotation2Matrix and rotation2MatrixInverse are used as intermediary routes.
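Stripped of H3D specifics, the intermediary-field pattern looks like this. The Field and ConverterField classes below are a hypothetical miniature of the route mechanism, written only to illustrate why a converter sits between two fields of different types:

```python
class Field:
    """Tiny stand-in for an H3D field: stores a value and pushes
    updates along outgoing routes. Illustration only."""
    def __init__(self, value=None):
        self.value = value
        self.routes_out = []

    def route(self, dest):
        self.routes_out.append(dest)

    def set(self, value):
        self.value = value
        for dest in self.routes_out:
            dest.receive(value)

    def receive(self, value):
        self.set(value)

class ConverterField(Field):
    """Intermediary that converts incoming values before forwarding
    them, playing the role of SFRotation2SFMatrix4f."""
    def __init__(self, convert):
        super().__init__()
        self.convert = convert

    def receive(self, value):
        self.set(self.convert(value))

orientation = Field()                # stands in for totalOrientation
shader_orn = Field()                 # stands in for viewpointOrn
to_matrix = ConverterField(lambda rot: ("matrix-of", rot))
orientation.route(to_matrix)         # rotation -> converter
to_matrix.route(shader_orn)          # converter -> shader field
orientation.set("some-rotation")
print(shader_orn.value)              # -> ('matrix-of', 'some-rotation')
```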

Texturing and lighting effects modelled with shaders written in GLSL.

