Although I have known how this is done for a while, this was the first time I have explicitly implemented normal generation myself. The idea is to:
1) For each face, form the two edge vectors that point from the face's first vertex to its other two vertices.
2) Normalize these two vectors, then take their cross product.
3) This produces the FACE normal. I add this value to the vertex normal of each of the face's three vertices.
4) Since most (if not all) vertices are shared by more than one face, accumulating multiple face normals yields a vector longer than unit length. But the sum of all the face normals points in the average direction of those normals, so all I have to do is normalize the result.
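The four steps above can be sketched roughly as follows. This is a minimal illustration in Python/NumPy, assuming a flat vertex array plus triangle index triples; the function name and mesh layout are my own, not from the original code.

```python
import numpy as np

def compute_vertex_normals(vertices, faces):
    """vertices: (V, 3) float array; faces: (F, 3) int index array."""
    vertices = np.asarray(vertices, dtype=float)
    normals = np.zeros_like(vertices)
    for i0, i1, i2 in faces:
        # Step 1: two edge vectors from the face's first vertex.
        e1 = vertices[i1] - vertices[i0]
        e2 = vertices[i2] - vertices[i0]
        # Step 2: normalize the edges, then cross them.
        e1 = e1 / np.linalg.norm(e1)
        e2 = e2 / np.linalg.norm(e2)
        face_normal = np.cross(e1, e2)
        # Step 3: accumulate the face normal into each face vertex.
        normals[i0] += face_normal
        normals[i1] += face_normal
        normals[i2] += face_normal
    # Step 4: renormalize; the sum points in the average direction
    # of the surrounding face normals.
    lengths = np.linalg.norm(normals, axis=1, keepdims=True)
    return normals / np.where(lengths > 0, lengths, 1.0)
```

One consequence of normalizing the edges first is that every face contributes with equal weight; leaving the edges unnormalized would instead weight each face's contribution by its area, which is another common choice.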
And that's it: now I have per-vertex normals that are nice and smooth across the model. There are actually a couple of small gotchas when doing something like this, though. You have to make sure that the winding of your polygons matches the direction from which you view your models; otherwise you will end up with a set of normals pointing to the inside of the model! It is surprisingly difficult to just look at the results and see that the normals are inside-out. The best check I can think of is to generate a list of lines that start at each vertex's position and end at that position plus the normal vector, then render the model together with the line list. If you see the lines, the normals point the right way; if you don't see the lines, the winding is probably reversed.
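Building that debug line list is a one-liner per vertex. A small sketch (the `scale` parameter is a hypothetical knob I've added so the lines are sized relative to the model, not something from the original code):

```python
import numpy as np

def normal_debug_lines(vertices, normals, scale=0.05):
    """Return an (V, 2, 3) array of (start, end) points, one line segment
    per vertex: from the vertex to the vertex plus its scaled normal."""
    vertices = np.asarray(vertices, dtype=float)
    normals = np.asarray(normals, dtype=float)
    return np.stack([vertices, vertices + scale * normals], axis=1)
```

Each pair can then be submitted to the renderer as a line-list primitive; if the normals are inside-out, the segments disappear into the model.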
So, to show the results I got for the bunny, I wrote a quick effect file that simply outputs the world-space normals as an RGB color. Here is how it turned out:
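The heart of such a visualization is just a component remap: a unit normal's components lie in [-1, 1], while color channels want [0, 1]. One common convention is `n * 0.5 + 0.5` (the original effect file may output the normal directly instead); sketched here in Python rather than shader code for consistency with the snippets above:

```python
import numpy as np

def normal_to_rgb(normal):
    """Remap a unit normal's components from [-1, 1] to [0, 1],
    so each axis of the normal maps to one color channel."""
    n = np.asarray(normal, dtype=float)
    return n * 0.5 + 0.5
```

With this mapping, a surface facing straight up the z axis comes out bluish, and small changes in orientation show up as smooth color gradients across the model.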
I am pretty happy with the results; the coloring seems to bring out the subtle variations in the model's surface. So now I can move on and start working on a couple of more interesting rendering techniques...