My water rendering demo was running just fine at around 120fps. Dropped in a couple of bits of code and it plummeted to 18-20fps. I really, really do not understand why [headshake]:
LPDIRECT3DSURFACE9 pOriginalRT = NULL;
LPDIRECT3DSURFACE9 pOriginalDS = NULL;
pd3dDevice->GetRenderTarget( 0, &pOriginalRT );
pd3dDevice->GetDepthStencilSurface( &pOriginalDS );
// ... some stuff here
pd3dDevice->SetRenderTarget( 0, pOriginalRT );
SAFE_RELEASE( pOriginalRT );
pd3dDevice->SetDepthStencilSurface( pOriginalDS );
SAFE_RELEASE( pOriginalDS );
I've got tens if not hundreds of other projects that use an identical construct! Making a defensive copy of the implicit swap chain before doing any render-to-texture work isn't uncommon.
So after scratching my head and wondering why it was killing my performance, I tracked it down to the two SAFE_RELEASE() calls. Comment those out and performance jumps back up to the original 120fps; but obviously I then end up with a ridiculous number of memory leaks.
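For anyone wondering why the releases have to be there at all: GetRenderTarget() and GetDepthStencilSurface() AddRef() the surfaces they hand back, so the caller owns a reference and must Release() it once the original targets are restored. A minimal sketch of that contract with a mock ref-counted object (MockSurface and defensiveCopyRefCount are made up for illustration; they're not the real D3D9 interfaces, and SAFE_RELEASE is the usual DXUT-style macro):

```cpp
#include <cstddef>

// Hypothetical stand-in for a COM object such as IDirect3DSurface9:
// AddRef() bumps the reference count, Release() decrements it (real
// COM would destroy the object when it hits zero).
struct MockSurface {
    int refs;
    MockSurface() : refs(1) {}            // the device holds one reference
    void AddRef()  { ++refs; }
    int  Release() { return --refs; }
};

// The DXUT-style macro used in the snippet above.
#define SAFE_RELEASE(p) { if (p) { (p)->Release(); (p) = NULL; } }

// Walks the defensive-copy pattern and returns the surface's final
// reference count; a balanced Get/Release pair leaves it at 1.
int defensiveCopyRefCount() {
    MockSurface surface;                  // refcount == 1 (device's ref)

    // GetRenderTarget() AddRefs on the caller's behalf...
    MockSurface* pOriginalRT = &surface;
    pOriginalRT->AddRef();                // refcount == 2

    // ...so once SetRenderTarget() has restored it, the caller must
    // Release() its copy or the surface leaks.
    SAFE_RELEASE(pOriginalRT);            // refcount back to 1

    return surface.refs;
}
```

That's the whole point of the pattern: the Release() pairs off the hidden AddRef(), which is why skipping it leaks even though you never explicitly created anything.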
I fired up a couple of other projects that use the same construct and they don't seem to show the same characteristics. They're also line-for-line identical.
A real "WTF?!?!" moment.
I'm very sure I've done something silly and just not noticed, but for now I'm gonna let it be and drink beers whilst watching Copland. I'm very sceptical, but rumour has it the film contains proof that Sylvester Stallone can act. Either way a film with Robert De Niro, Ray Liotta and Harvey Keitel in support can't be that bad...