I've added a sprite loading method named LoadPairedSprites that takes one atlas but two textures and treats the sprites in the file as pairs, putting each pair in the same place on the two textures.
I can then build sprite files that contain the diffuse map followed by the normal map for each graphic, so I end up with two texture atlases: one for the diffuse maps and one for the normals. I can then set these as texture stages 0 and 1 and sample the diffuse and normal from each using the same texture co-ordinates.
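The pairing idea can be sketched roughly like this (all names here are my own placeholders, not the actual engine code): a single atlas definition maps sprite names to UV rects, and the same rect is applied to both the diffuse and the normal-map texture, so one set of co-ordinates addresses both.

```cpp
#include <cassert>
#include <map>
#include <string>

// Hypothetical sketch of the LoadPairedSprites idea: one atlas
// definition (name -> UV rect) is shared by two textures, so a
// sprite's rect is identical in the diffuse atlas and the normal
// atlas and the same texcoords work for both texture stages.
struct Rect { float u0, v0, u1, v1; };

struct PairedSprite {
    Rect rect;        // identical UV rect into both textures
    int  diffuseTex;  // stage 0 texture id (stand-in for IDirect3DTexture9*)
    int  normalTex;   // stage 1 texture id
};

std::map<std::string, PairedSprite> LoadPairedSprites(
    const std::map<std::string, Rect>& atlas, int diffuseTex, int normalTex)
{
    std::map<std::string, PairedSprite> out;
    for (const auto& entry : atlas)
        out[entry.first] = PairedSprite{ entry.second, diffuseTex, normalTex };
    return out;
}
```

At draw time the two textures would then be bound to stages 0 and 1 (e.g. `SetTexture(0, diffuse)` and `SetTexture(1, normal)` in D3D9) and the one rect used for both lookups.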
I haven't actually implemented this in the game yet, but the idea is that an object can optionally be tagged to be drawn with normal mapping. This will then be done as a separate batch in the rendering system.
Something I am wondering about is whether I could create a vertex shader that takes in a custom vertex format containing a Z rotation matrix, then somehow output this matrix along with the transformed vertex to the pixel shader. I'm not too sure how all this works, but it would mean I could batch render normal mapped objects. Otherwise I would have to render them one at a time and set each object's rotation matrix through a shader variable, which would suck.
Hmm. Off all day again tomorrow so plenty of time to play about. Getting a bit too tired for all this complicated stuff now.
Sounds are working in the menus now as well, and I've put together a little sound effect to play over the logo (just a gust of wind and some chimes).
Hmm, well that seems to work. It feels a bit odd, but I've basically made a custom vertex declaration with a three-component texture coord instead of two. I can now set the rotation of my two-dimensional object as the third value when I construct the four indexed vertices of the quad.
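A minimal sketch of that vertex layout, with names I've made up for illustration: the texcoord carries (u, v) as usual plus the sprite's rotation in its third component. In a D3D9 declaration this would mean using `D3DDECLTYPE_FLOAT3` for the `TEXCOORD0` element instead of `FLOAT2`.

```cpp
#include <array>
#include <cassert>

// Sketch of a quad vertex carrying a three-component texture
// coordinate: UV plus the sprite's Z rotation angle. Because every
// vertex of the quad gets the same rotation value, interpolation
// across the triangles leaves it unchanged by the time it reaches
// the pixel shader.
struct SpriteVertex {
    float x, y, z;    // position
    float u, v, rot;  // TEXCOORD0: UV plus rotation (radians)
};

std::array<SpriteVertex, 4> MakeQuad(float x, float y, float w, float h,
                                     float rotation)
{
    return {{
        { x,     y,     0.0f, 0.0f, 0.0f, rotation },
        { x + w, y,     0.0f, 1.0f, 0.0f, rotation },
        { x,     y + h, 0.0f, 0.0f, 1.0f, rotation },
        { x + w, y + h, 0.0f, 1.0f, 1.0f, rotation },
    }};
}
```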
In the pixel shader I can then read the input's XY as a float2 to get the U,V, and query the Z to find the rotation, in order to transform the normal map sample by the correct (inverse) amount.
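The inverse transform itself is just a 2D rotation of the sampled tangent-space normal by the negated angle. Here is a CPU-side C++ mirror of that step (my own sketch, not the actual shader code), which in HLSL would operate on the `.xy` of the sampled normal:

```cpp
#include <cassert>
#include <cmath>

// CPU-side sketch of the pixel-shader step: given the sprite's
// rotation (read from texcoord Z), rotate the sampled tangent-space
// normal about Z by the inverse (negated) angle so the lighting stays
// correct when the quad itself is rotated.
struct Float3 { float x, y, z; };

Float3 RotateNormalInverse(Float3 n, float rotation)
{
    const float c = std::cos(-rotation);
    const float s = std::sin(-rotation);
    return { n.x * c - n.y * s,   // standard 2D rotation about Z by -rotation
             n.x * s + n.y * c,
             n.z };               // Z (out of the sprite plane) is untouched
}
```

For example, a normal pointing along +X on a sprite rotated 90 degrees counter-clockwise comes back pointing along -Y, which is what the unrotated light direction expects.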
The long and short of it is that I can now batch several normal mapped quads into a single DrawPrimitive call as each vertex is carrying the required information.
I did originally try adding a custom single float value to the vertex declaration so that (I assumed) no attempt to lerp this dummy value would take place, but for some reason that crapped out my frame rate in a big way.
I hope the above is considered creative use of, rather than abuse of, vertex declarations. [smile]
Well, it was an interesting experiment and I'm glad I got it working but I've decided that normal mapped sprites are not right for this project.
I can't normal map everything - the level blocks for example are out of the question - so the sprites that are normal mapped look out of place with the rest of the graphics.
There is some significant overhead to using the normal mapped sprites as well. The batching idea above was not possible in the end: I need to render the main view with at least one level of multisampling to avoid pixel shudder, so I was having to render the sprites with their normal maps to an offscreen texture, then render that to the main view with no shader enabled.
Never mind. It was fun and at least I fully get normal mapping now.
2D in 3D is a pain in the backside, regardless of what I might advise beginners about the joys of D3D over DirectDraw on the forums.