I spent the second half of the day coding. The first half of the day I took my wife and my dog to Cumberland Mountain State Park for a great day of hiking.
One of my issues today was properly rendering OBJ exports out of Poser. After tracking down and repairing a non-null-terminated string bug that was segfaulting my vertex array, and after some minor format tweaks, I was able to get the following up and running:
This was an exciting step in that it showed my engine could take almost whatever OBJ format you throw at it and everything would show up correctly. That is, except for the cold, staring alien eyes.
The issue was that Poser does not use alpha channels, and the textures were simple uncompressed JPEGs.
I opened up the JPEGs in Photoshop and tried to manipulate the alpha channels myself and export to PNG. I didn't have any immediate luck doing this, however... so time to roll up my sleeves and wade through the code.
I finally found one problem. It was the choice of data structure I was using to traverse the OBJ in the rendering loop. Basically, I was doing a straight traversal of a red-black tree keyed on materials. This traversal is linear, just like walking a doubly-linked list, but the red-black tree saved some computation cycles during data-structure construction. The issue, however, was that the order the materials were listed in the .mtl file was not the order they were being traversed in the rendering loop, due to the rebalancing that takes place on insertion.
So, since the tree was "almost in order" and n will probably never be that large, I did a simple merge sort into a doubly-linked list at the end of my OBJ data construction. This allowed me to render all faces in the order that the materials appear in the .mtl file.
And here is what I was rendering:
Nice eyelashes, but the blank alien eyes were not working for me. At this point I was enabling blending and setting glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
I changed the textures in Photoshop to see if this revealed anything. I changed the material properties. Disabled lighting. Nothing revealing so far. Then I noticed that one of the materials in the .mtl was named EyeTrans. Hmmm, could the textures actually be sitting inside the eyeball? I changed the blending factors to glBlendFunc(GL_ONE, GL_SRC_ALPHA); to take a look:
Ah, that's it! The eye textures are actually set back slightly into the eyeball. It was downhill from here. A simple material check in the rendering loop let me blend the destination and source fragments correctly.
I think the texture offset into the material looks really nice. It looks a lot more like a human eyeball and much less like a sphere texture-mapped into a skull-socket mesh.
As you can see from the screenshot, I am getting around 12 frames per second on my old 2 GHz Pentium 4 with a 9800 XT. I can still shift the bottleneck toward the graphics hardware with some optimizations, but for over 100k polygons with hi-res textures I am happy with the results for the moment.
I have some interesting ideas on how to implement rapid eye-movement in a vertex shader that I will flesh out in the near future; pun intended.