_Flecko

Figuring out what's in memory



I'm having a hard time debugging my device loss/reset handling. DirectX throws an exception on Reset because there are resources left in memory, but I can't figure out what I've forgotten to dispose of. Is there any way to see which resources still exist and are preventing the device from resetting? Thanks, Max

Yes, there is. Run your program under the debug runtime: the debug output will list AllocID numbers for the resources still in memory. Then go to Control Panel -> DirectX and, on the Direct3D tab, enter one of those numbers in the "Break on AllocID" edit box. The next time you debug your program, the debugger will break on the line where that resource was allocated, so you'll know exactly what you still need to release.

I see how that works; unfortunately I can't use it. I'm using Managed DirectX, and I don't have C++ .NET, so I can't get the unmanaged debug messages. If there were another way to locate the AllocID numbers I could use it, but I can't find one.

I've tracked down the particular problem I'm having through guess-and-check, though: for some reason, my render-to-surface scene is causing the Reset call to fail. If I don't render that scene, Reset goes fine, but this code breaks everything:


rts.BeginScene(texture.GetSurfaceLevel(0), view);
device.Clear(ClearFlags.Target, System.Drawing.Color.FromArgb(0, 0, 0, 0), 0.0f, 0);

//Draw with the effect
int passes = effect.Effect.Begin(FX.None);
for (int i = 0; i < passes; i++)
{
    effect.Effect.BeginPass(i);
    source.Draw();
    effect.Effect.EndPass();
}
effect.Effect.End();

//End the RTS scene
rts.EndScene(Filter.None);



Even if I render the same thing in a normal device scene it works fine; I just don't know what's up with this.
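For anyone who finds this later: the render-to-surface helper keeps device-bound objects of its own, and in the native API ID3DXRenderToSurface exposes lost-device hooks for exactly this situation, so the Managed RenderToSurface wrapper presumably needs equivalent treatment around Reset. A minimal sketch in native C++ terms, assuming a D3DPOOL_DEFAULT render-target texture and helper named g_pTexture and g_pRTS (hypothetical names, not from the post above):

```cpp
#include <d3d9.h>
#include <d3dx9.h>

// Hypothetical globals standing in for the objects in the post above.
extern LPDIRECT3DDEVICE9     g_pd3dDevice;
extern LPDIRECT3DTEXTURE9    g_pTexture;   // created in D3DPOOL_DEFAULT
extern LPD3DXRENDERTOSURFACE g_pRTS;
extern D3DPRESENT_PARAMETERS g_d3dpp;

HRESULT ResetWithRTS()
{
    // D3DPOOL_DEFAULT resources block Reset, so release them first.
    g_pRTS->OnLostDevice();    // drops the helper's internal surfaces
    g_pTexture->Release();
    g_pTexture = NULL;

    HRESULT hr = g_pd3dDevice->Reset(&g_d3dpp);
    if (FAILED(hr))
        return hr;

    // Recreate what was released, then tell the helper about it.
    hr = D3DXCreateTexture(g_pd3dDevice, 256, 256, 1, D3DUSAGE_RENDERTARGET,
                           D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &g_pTexture);
    if (FAILED(hr))
        return hr;
    return g_pRTS->OnResetDevice();
}
```

The key point is that render-target textures always live in the default pool, so both the texture and the RTS helper have to be torn down before Reset and rebuilt afterwards.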

Hi! I use CPP, and I don't know a thing about managed anything. I presume you're trying to create a 'lost focus/gained focus' or 'video restart' routine? Hopefully you can get something out of this...

In my game, I do a video reset like this:
Device Loss Routine
1) Release my objects' vertex buffers (LPDIRECT3DVERTEXBUFFER9).
2) Release those objects' index buffers (IDirect3DIndexBuffer9).
3) Release sprite objects (ID3DXSprite).
4) Release font objects (ID3DXFont).
5) DON'T render anymore.
(I don't worry about releasing LPDIRECT3DTEXTURE9 texture objects, sound objects, etc.)
Device Gain Routine
6) Init my presentation parameters (if doing a video change; eg. resolution change, windowed/fullscreen change, etc.)
7) Restart the D3D device (g_pd3dDevice->Reset(&d3dpp)).
8) Recreate the vertex buffers and index buffers.
9) Recreate my sprite and font objects.

You can see in Step 5 that I don't render anymore after I've lost my D3D object because my render routine will crash every time. Instead, I break out of the Render() function before I render anything. I also don't use my render function until I've successfully restarted my D3D device and recreated all of my vertex and index buffers.

Note: If your D3D device is using software vertex processing (you used the flag D3DCREATE_SOFTWARE_VERTEXPROCESSING when you created your D3D object), then you may not have to do steps 1, 2 and 8, because those buffers aren't kept in D3DPOOL_DEFAULT and so survive the Reset (pretty cool!). In general, only D3DPOOL_DEFAULT resources must be released before Reset.
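The routine above can be sketched like this (ReleaseBuffers, RecreateBuffers, and the globals are placeholder names for the steps described, not code from a real project):

```cpp
#include <d3d9.h>
#include <d3dx9.h>

// Placeholder globals and helpers standing in for the numbered steps above.
extern LPDIRECT3DDEVICE9     g_pd3dDevice;
extern LPD3DXSPRITE          g_pSprite;
extern LPD3DXFONT            g_pFont;
extern D3DPRESENT_PARAMETERS g_d3dpp;
void ReleaseBuffers();    // steps 1-2: release vertex/index buffers
void RecreateBuffers();   // step 8
void Render();

void OnFrame()
{
    HRESULT hr = g_pd3dDevice->TestCooperativeLevel();
    if (hr == D3DERR_DEVICELOST)
        return;                                     // step 5: don't render while lost
    if (hr == D3DERR_DEVICENOTRESET)
    {
        ReleaseBuffers();                           // steps 1-2
        g_pSprite->OnLostDevice();                  // step 3
        g_pFont->OnLostDevice();                    // step 4
        if (FAILED(g_pd3dDevice->Reset(&g_d3dpp)))  // steps 6-7
            return;                                 // try again next frame
        RecreateBuffers();                          // step 8
        g_pSprite->OnResetDevice();                 // step 9
        g_pFont->OnResetDevice();
    }
    Render();                                       // safe: device is operational
}
```

TestCooperativeLevel is what tells you which phase you're in: D3DERR_DEVICELOST means wait, D3DERR_DEVICENOTRESET means it's time to run the gain routine.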

Hope that helps a little.

By using GameDev.net, you agree to our community Guidelines, Terms of Use, and Privacy Policy.

Participate in the game development conversation and more when you create an account on GameDev.net!

Sign me up!