This month's code builds on what we put together last month; if you need it, you can go back to last month's issue of EDM/2 and get it, or download it here.

More Menus

Last month, we added code that let us show the model in either wireframe or point mode. To flip between them, however, we had to recompile. The first thing we will do this month is enhance the menuing system so that we can choose how we want our model shown.

If you remember, when we set up our menus we already added four possible display modes: points, wireframe, solid and texture. Selecting one of these choices calls the function menuFuncDisplay(), which is currently empty. What we need to add here is code that performs any one-time mode switching for each of the display types and then informs the draw routine which method to use when drawing.

If you look at our draw code that draws a triangle mesh, you will notice that there is a line in there like this one:

  glPolygonMode( GL_FRONT_AND_BACK, GL_LINE );

This command tells OpenGL that we want our polygons drawn as outlines only, and that we want both the front and back faces drawn this way. This is an OpenGL mode command: once it is specified, all polygon drawing uses this mode until we issue another such command. Placing it in the draw function is therefore not only unnecessary, but will actually slow things down, as OpenGL processes it needlessly every time we draw. So we will move this command to the menuFuncDisplay() function.

For solid display, we will change the glPolygonMode() setting from GL_LINE to GL_FILL. Since at this point this is the only difference between wireframe and solid, these two modes can now share the same chunk of code in our draw routine. All that is left for our menuFuncDisplay() routine to do is store the selected mode so that the draw routine knows what it is. We end up with the following:

 /* change settings based on what we want to show */
 switch( id )
 {
    case MENU_SHOW_POINTS:
       break;
    case MENU_SHOW_WIREFRAME:
       glPolygonMode( GL_FRONT_AND_BACK, GL_LINE );
       break;
    case MENU_SHOW_SOLID:
       glPolygonMode( GL_FRONT_AND_BACK, GL_FILL );
       break;
    case MENU_SHOW_TEXTURE:
       break;
 }
 /* store the display mode selected for the draw routine */
 selectedDisplay = id;

In our draw routine, a simple switch will choose which code we use to draw our model:

  int i;
  /* draw model based on menu selection */
  switch( selectedDisplay )
  {
     case MENU_SHOW_POINTS:
        /* show model as a point cloud */
        glBegin( GL_POINTS );
           for( i=0; i<model->numPoints; i++ )
              glVertex3fv( model->pointList[i].pt );
        glEnd();
        break;

     case MENU_SHOW_WIREFRAME:
     case MENU_SHOW_SOLID:
        /* show as a solid or wireframe polygon mesh */
        glBegin( GL_TRIANGLES );
           for( i=0; i<model->numTriangles; i++ )
           {
              glVertex3fv( model->pointList[ model->triIndex[i].index[0] ].pt );
              glVertex3fv( model->pointList[ model->triIndex[i].index[1] ].pt );
              glVertex3fv( model->pointList[ model->triIndex[i].index[2] ].pt );
           }
        glEnd();
        break;
  }

Now, via the menu popup, we can easily flip amongst our display choices. If you try this, you will note that solid mode doesn't really show much, because all of the polygons are drawn in the same colour; no features except the outline are distinguishable. We will correct this problem by adding lighting when we display in solid mode. When we add the textures, this problem is bypassed, as the textures help us determine where the features are.

Lighting

To see features in a 3D solid model that does not have any colour information, we need a way to distinguish between the different polygons in the model. Lighting does this by modifying the colour of each polygon, based on the orientation of that polygon with respect to the viewer and the light source. (For an in-depth discussion of lighting, see the 'Let there be light!' column.)
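
I won't repeat the full lighting setup from that column here, but as a reminder, the one-time part of it looks something like the sketch below. Take the specific light position with a grain of salt -- it's just an arbitrary value for illustration -- and note that enabling GL_LIGHTING itself is best done in the MENU_SHOW_SOLID case of menuFuncDisplay(), with a matching glDisable() in the other cases (you will see exactly that in the texture case later on).

/* a minimal lighting sketch -- the light position is an arbitrary value
   chosen for illustration, not something taken from the model file */
void initializeLighting( void )
{
   GLfloat lightPos[] = { 1.0f, 1.0f, 1.0f, 0.0f };   /* directional light */

   glLightfv( GL_LIGHT0, GL_POSITION, lightPos );
   glEnable( GL_LIGHT0 );
   glEnable( GL_NORMALIZE );   /* guards against any scaling of the model */
}

With lighting switched on in solid mode, the default white light of GL_LIGHT0 combined with the normals we calculate below is enough to make the faces of the model stand out.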

For lighting to be calculated correctly, we need to specify normals for each of the polygons. The normals tell OpenGL which way the polygon is facing so that the lighting variations can be calculated as the polygon changes direction with respect to the viewer. To calculate the normal, take any two connecting sides of the polygon, convert them to vectors and take their cross product. The resultant vector should then be normalized by dividing by its length. A function that does this is given below.

/* calculate the normal of a triangle polygon and hand it to OpenGL;
   coord1, coord2 and coord3 point to the three vertices
   (sqrt() requires math.h) */
void calculateNormal( float *coord1, float *coord2, float *coord3 )
{
   float va[3], vb[3], vr[3], val;

   /* build two edge vectors that share the first vertex */
   va[0] = coord1[0] - coord2[0];
   va[1] = coord1[1] - coord2[1];
   va[2] = coord1[2] - coord2[2];

   vb[0] = coord1[0] - coord3[0];
   vb[1] = coord1[1] - coord3[1];
   vb[2] = coord1[2] - coord3[2];

   /* cross product of the two edge vectors */
   vr[0] = va[1] * vb[2] - vb[1] * va[2];
   vr[1] = vb[0] * va[2] - va[0] * vb[2];
   vr[2] = va[0] * vb[1] - vb[0] * va[1];

   /* get length of normal */
   val = sqrt( vr[0]*vr[0] + vr[1]*vr[1] + vr[2]*vr[2] );

   /* specify normalized normal */
   glNormal3f( vr[0]/val, vr[1]/val, vr[2]/val );
}

coord1, coord2 and coord3 are the three points of the triangle polygon. They are pointers, so coord1[0] is the x coordinate of coord1, coord1[1] is the y coordinate, and coord1[2] is the z coordinate.

The paintWindow() section now becomes:

 case MENU_SHOW_WIREFRAME:
 case MENU_SHOW_SOLID:
    /* show as a solid or wireframe polygon mesh */
    glBegin( GL_TRIANGLES );
       for( i=0; i<model->numTriangles; i++ )
       {
          calculateNormal( model->pointList[ model->triIndex[i].index[0] ].pt,
                                 model->pointList[ model->triIndex[i].index[2] ].pt,
                                 model->pointList[ model->triIndex[i].index[1] ].pt );
          glVertex3fv( model->pointList[ model->triIndex[i].index[0] ].pt );
          glVertex3fv( model->pointList[ model->triIndex[i].index[2] ].pt );
          glVertex3fv( model->pointList[ model->triIndex[i].index[1] ].pt );
       }
    glEnd();
    break;

You may have noticed that the vertex order has been reversed from what was listed earlier. This is because the face of the polygon was oriented in the wrong direction (inwards), resulting in no light being reflected back to the viewer. To reverse the direction of the polygon, we could either alter the order in which we specify the polygon vertices, or we could call the function glFrontFace() with the argument GL_CW. This states that polygons whose vertices are ordered clockwise are front facing (by default, front faces are those whose vertices are ordered counterclockwise, GL_CCW).
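
If you prefer the second option, it is a one-time call; a sketch of it, placed with the rest of the one-time setup, would let you keep the original vertex order instead:

   /* alternative: keep the original vertex order and instead declare
      that clockwise-wound polygons are the front faces */
   glFrontFace( GL_CW );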

Running this on the file tris.md2 (you can download the source up to this point here), you should now see a solid image that looks like the following:

Texture Skins

Another method of showing the model as a solid, instead of using artificial lighting, is to add colour texture maps to each polygon. While the colour information itself is not stored in this file, the name of the associated texture file, and all the information we need to attach it, are.

Note: The models and textures that I am working on can be found in the players\female directory under the main Quake directory. You are free to use any directory or Quake models you wish, as the techniques described here should apply to all of the Quake II models. You may need to run the game first so that it extracts these files from the pack file in which the models and textures are originally stored.

Texture 'skins', as they are called for Quake II models, are stored in standard .PCX files. To see these files you can simply show them in an OS/2 folder and double-click on them; OS/2 understands the format and will display them in the default image browser. By clicking on a variety of different skins you will notice that most of them follow the same organization and are interchangeable. Some .MD2 files have specific textures associated with them; for example, the 'weapon.md2' file has the specific skin file 'weapon.pcx'. This file name is listed in the .MD2 structure as a skin name. Other .MD2 files, such as 'tris.md2', have no specific texture associated with them. This model is a generic one (there is one for male and one for female characters), and different characters are created by laying the appropriate texture on top of the model. In both cases, the mapping of how to place the texture on the model exists in the .MD2 structure.

The first thing we need to do is load in the .PCX file and convert it to an OpenGL-compatible texture. The actual format of the .PCX file is not really that important to our discussion here, so I will just give a brief overview and show you the code. What is important is how to convert it into a texture that OpenGL can use and how to map that texture onto the model.

Note: The code that I'm using to read in the .PCX is just the bare minimum required for our purposes. It's enough to read in Quake II skins but I'm sure that it won't do much with other .PCX files.

The first thing we need to do to decode the .PCX file is read in the 128-byte header, although we only need the first six shorts. The first two shorts are used to identify the file as .PCX, so we'll ignore them. The next two shorts are the image's start pixel, followed by two more shorts that are the image's end pixel. To find the dimensions of the image, we subtract the start pixel from the end pixel and add 1.
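
I haven't listed the pcxHeader structure itself; a minimal layout along the following lines (only an assumption, covering just the six shorts we care about) is what the code below expects:

  /* assumed minimal layout of the PCX header -- only the first six shorts
     matter to us; the full header is 128 bytes long */
  typedef struct
  {
     short ident[2];     /* identifies the file as .PCX -- ignored */
     short offset[2];    /* start pixel of the image (x, y)        */
     short size[2];      /* end pixel of the image (x, y)          */
  } pcxHeader;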

 /* get the image dimensions */
 pcxPtr = (pcxHeader *)texBuffer;
 imgWidth = pcxPtr->size[0] - pcxPtr->offset[0] + 1;
 imgHeight = pcxPtr->size[1] - pcxPtr->offset[1] + 1;

PCX files are 8-bit palettized images that are also slightly compressed. I say slightly because the only compression occurs when a run of pixels in a row share the same value [This is also known as run-length encoding. Ed.]. In that case the file stores the number of repetitions followed by the pixel value to repeat. To make things easier, the first step is to uncompress the PCX into a regular image array:

 /* image starts at 128 from the beginning of the buffer */
 imgBuffer = malloc( imgWidth * imgHeight );
 imgBufferPtr = 0;
 pcxBufferPtr = &texBuffer[128];
 /* decode the pcx image */
 while( imgBufferPtr < (imgWidth * imgHeight) )
 {
    if( *pcxBufferPtr > 0xbf )
    {
       int repeat = *pcxBufferPtr++ & 0x3f;
       for( i=0; i<repeat; i++ )
          imgBuffer[imgBufferPtr++] = *pcxBufferPtr;
    } else {
       imgBuffer[imgBufferPtr++] = *pcxBufferPtr;
    }
    pcxBufferPtr++;
 }
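
One thing to watch: the declarations for this loop aren't listed, and the comparison against 0xbf only behaves as intended if the buffer pointers are unsigned. These are the sorts of declarations the loop assumes (the names match the code above; the types are my assumption):

  /* assumed declarations for the decode loop -- unsigned chars so that
     the test against 0xbf works for byte values above 127 */
  int            i;
  int            imgWidth, imgHeight;
  int            imgBufferPtr;       /* write position into imgBuffer */
  unsigned char *texBuffer;          /* the raw .PCX file in memory   */
  unsigned char *pcxBufferPtr;
  unsigned char *imgBuffer;          /* the uncompressed 8-bit image  */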

The values have been palettized, so we need to convert them back out to 24-bit values before we can use them. The image palette is stored as the last 768 bytes in the file (256 palette entries * 3 bytes per 24-bit RGB entry = 768 bytes).

 /* read in the image palette */
 paletteBuffer = malloc( 768 );
 for( i=0; i<768; i++ )
    paletteBuffer[i] = texBuffer[ texFileLen-768+i ];

We will do the final conversion out to 24bit below when we create the OpenGL texture.

Skins to Texture

The previous column that dealt with textures was 'Let there be texture!'. If at any time during the next few paragraphs, you get confused and think to yourself "What the heck is this guy doing?", simply go back to the texturing column and all of your questions will be answered. And remember, "Hakuna Matata" [As far as I remember, this is a Lion King reference :) Ed.]

One of the restrictions of OpenGL textures is that their dimensions must be a power of two. Since the image has to be converted to 24 bit before we give it to OpenGL, we can change the dimensions at the same time. In the following code imgWidth and imgHeight are the dimensions of the PCX image, and texWidth and texHeight are those dimensions rounded up to the next power of two that OpenGL will accept.
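
I don't show the rounding itself; a small helper along these lines (the name nextPowerOfTwo is just my placeholder) is one way to do it:

/* round a dimension up to the next power of two, e.g. 300 -> 512;
   the function name is just a placeholder for whatever you use */
int nextPowerOfTwo( int n )
{
   int p = 1;

   while( p < n )
      p <<= 1;
   return p;
}

With that, texWidth and texHeight are simply nextPowerOfTwo( imgWidth ) and nextPowerOfTwo( imgHeight ).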

  int i, j;
  model->texture = malloc( model->texWidth * model->texHeight * 3 );
  for (j = 0; j < imgHeight; j++)
  {
     for (i = 0; i < imgWidth; i++)
     {
        model->texture[3*(j * model->texWidth + i)+0]
              = paletteBuffer[ 3*imgBuffer[j*imgWidth+i]+0 ];
        model->texture[3*(j * model->texWidth + i)+1]
              = paletteBuffer[ 3*imgBuffer[j*imgWidth+i]+1 ];
        model->texture[3*(j * model->texWidth + i)+2]
              = paletteBuffer[ 3*imgBuffer[j*imgWidth+i]+2 ];
     }
  }

Now that we have the texture, we need the information that maps portions of it onto each polygon. Just as every point has a coordinate in 3D space, it also has a texture coordinate. In 3-space the coordinates are referred to as x, y and z; in texture space they are generally called s and t. In OpenGL, s and t are floating point values ranging from 0.0 to 1.0, with 0.0 being the leftmost or bottommost edge of the texture and 1.0 being the rightmost or topmost edge (so a point a quarter of the way across the texture, for example, has an s value of 0.25).

Texture coordinates in the Quake II file are not floating point but short values that correspond to pixel positions. We have to convert these values to floating point for OpenGL, but first we have to read them in. In the modelHeader (the header at the start of the .MD2 file) there is an entry called numST. This is the number of s,t pairs in the .MD2 file, and you can find them starting at offsetST bytes from the start of the file. As we read these values in, we will convert them to floating point by dividing by the texture dimensions.

 /* create the texture list */
 st = (stTexCoord *)malloc( sizeof( stTexCoord ) * mdh->numST );
 model->numST = mdh->numST;
 model->st = st;

 stPtr = (stTexture *)&buffer[mdh->offsetST];
 for( i=0; i<mdh->numST; i++ )
 {
    st[i].s = (float)stPtr[i].s / (float)model->texWidth;
     st[i].t = (float)stPtr[i].t / (float)model->texHeight;
 }
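
I haven't listed the two s,t structures; layouts like these (an assumption, but consistent with how the code above uses them -- shorts as stored in the file, floats once converted) are what the loop expects:

  /* assumed layouts: on-disk s,t values are shorts (pixel positions),
     the converted in-memory values are floats in the range 0.0 - 1.0 */
  typedef struct
  {
     short s, t;
  } stTexture;

  typedef struct
  {
     float s, t;
  } stTexCoord;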

These values are a list of all of the texture mappings used by the model. When we created our mesh earlier, each point of each polygon in the mesh was actually an index into the point list. The same is true for the texture coordinates: each polygon point has an index into the texture coordinate list created above. We can expand our mesh creation code to the following:

  /* create a mesh list */
  ...
  /* point to indexes in buffers */
  bufIndexPtr = (mesh *)&buffer[mdh->offsetTris];

  for( i=0; i<mdh->numTris; i++ )
  {
     triIndex[i].meshIndex[0] = bufIndexPtr[i].meshIndex[0];
     triIndex[i].meshIndex[1] = bufIndexPtr[i].meshIndex[1];
     triIndex[i].meshIndex[2] = bufIndexPtr[i].meshIndex[2];
     triIndex[i].stIndex[0] = bufIndexPtr[i].stIndex[0];
     triIndex[i].stIndex[1] = bufIndexPtr[i].stIndex[1];
     triIndex[i].stIndex[2] = bufIndexPtr[i].stIndex[2];
  }
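
Again, the mesh structure that the file buffer is cast to isn't listed; a layout like the following (an assumption that matches the fields copied above) is what the cast at offsetTris implies:

  /* assumed layout of one triangle record in the .MD2 file: three vertex
     indices followed by three texture coordinate indices */
  typedef struct
  {
     short meshIndex[3];   /* indices into the point list     */
     short stIndex[3];     /* indices into the s,t list above */
  } mesh;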

This now gives us all of the information we need to load and apply the texture. If the model has a specific skin associated with it, the numSkins value in the header will indicate this: the value will be 1, and the name of the required texture will be a text string at offsetSkins bytes from the beginning of the file. If no skin name is present, we can apply any texture we want. In this modeller I read the name of the texture you want applied from the program parameters.
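
Reading that name out of the file buffer is straightforward; a sketch of it (assuming the skin names are stored as fixed-length, NUL-terminated 64-byte strings, which is my reading of the format rather than something spelled out here) could look like this:

  /* sketch: copy the first skin name out of the file buffer; assumes
     64-byte, NUL-terminated name records at offsetSkins (needs string.h) */
  char skinName[64];

  if( mdh->numSkins > 0 )
  {
     strncpy( skinName, (char *)&buffer[mdh->offsetSkins], 64 );
     skinName[63] = '\0';
  }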

The only thing left to do is to set up the OpenGL texture settings, and to enable the textures when we select that mode from the menu. In our main() function, we can call a function that sets up our texture settings:

void initializeTextureMapping()
{
   /* specify texture parameters */
   glTexParameterf( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP );
   glTexParameterf( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP );
   glTexParameterf( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST );
   glTexParameterf( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST );
   glTexEnvf( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_DECAL );
}

After we load our model and call the above function, we need to tell OpenGL to use the texture we created:

 /* now load in the model from the parameter line */
 if( argc == 2 )
    model = loadModel( argv[1], 0 );
 else if( argc >= 3 )
    model = loadModel( argv[1], argv[2] );
 /* specify the texture loaded if any */
 initializeTextureMapping();
 if( model->texture )
 {
    glTexImage2D( GL_TEXTURE_2D, 0, 3, model->texWidth, model->texHeight, 0,
                               GL_RGB, GL_UNSIGNED_BYTE, (void *)model->texture );
 }

That's it! Now we just have to enable texturing when we select it from the menu (in menuFuncDisplay(), add another case statement for MENU_SHOW_TEXTURE):

  case MENU_SHOW_TEXTURE:
     glDisable( GL_LIGHTING );
     glEnable( GL_TEXTURE_2D );
     glPolygonMode( GL_FRONT_AND_BACK, GL_FILL );
     break;

And when we draw the model, we specify the texture coordinates. In the paintWindow() function, the MENU_SHOW_TEXTURE case is almost identical to MENU_SHOW_SOLID, except that we don't calculate a normal. In addition, we have to issue a glTexCoord*() call before every glVertex*() call. Remember that, like the vertices, the texture coordinates are indexed. Our MENU_SHOW_TEXTURE section then becomes:

 case MENU_SHOW_TEXTURE:
    /* show as a solid with texture mapped on */
    glBegin( GL_TRIANGLES );
       for( i=0; i<model->numTriangles; i++ )
       {
          glTexCoord2f( model->st[ model->triIndex[i].stIndex[0] ].s,
                          model->st[ model->triIndex[i].stIndex[0] ].t );
          glVertex3fv( model->pointList[ model->triIndex[i].meshIndex[0] ].pt );
          glTexCoord2f( model->st[ model->triIndex[i].stIndex[2] ].s,
                          model->st[ model->triIndex[i].stIndex[2] ].t );
          glVertex3fv( model->pointList[ model->triIndex[i].meshIndex[2] ].pt );
          glTexCoord2f( model->st[ model->triIndex[i].stIndex[1] ].s,
                          model->st[ model->triIndex[i].stIndex[1] ].t );
          glVertex3fv( model->pointList[ model->triIndex[i].meshIndex[1] ].pt );
       }
    glEnd();
    break;

Run this (you can get the finished source and executable here) and you should be able to get textures on your model.

The following snapshots were taken with the voodoo.pcx skin mapped onto the female tris.md2 mesh:

One note about program placement: it is best to place the executable in the 'baseq2' directory under the Quake II install directory. This is because, when a skin name is explicitly given, it includes the full pathname starting from that directory (for example, a valid skin name would be 'players\female\weapon.pcx'). To view the images above, you would therefore type on the command line:

  glDemo players\female\tris.md2 players\female\voodoo.pcx

(The first parameter is the model name and the second is an optional texture)

Things to try:

When I enabled textures in the code above, I disabled lighting. Having them both together is not just a matter of enabling lighting while textures are enabled; some other setup must be performed. Try to get both lighting and textures active at once. Hint: if you can't get it, go back and review the column that discusses it, Let There be Lit Things.

Conclusion

Well that's another OpenGL column for this month! I hope you have fun playing around with all of the Quake II characters. Next month I will show you how to animate the characters by running through the frames stored in the .MD2 file.

Unless anyone has an OpenGL topic (this series was suggested by a reader), next month will be the last OpenGL column as I'm out of ideas of what to cover. If you have an idea (it can be anything, even basic stuff) just click on my signature above and send it to me at the EDM/2 address.

See you next month!

 
