Gilzu

DirectX 9.0c strange, strange memory leak


Hi all, I'm having a weird memory leak. In my application (C#.NET 2005 / Managed DirectX 9.0c / 2D with Direct3D using the Sprite object) I have a selection screen that launches whichever option the user chose; when it's done, the user returns to the main selection screen. All was fine until I moved my app to its target machine, a Windows XP Embedded box (which is essentially Windows XP Pro), where it suddenly started throwing strange low memory / out of virtual memory errors. Thing is, on my own computer everything runs smoothly. I have tried ruling out the following:

1. I use the latest build of Windows XP Embedded (2007) and the DirectX components.
2. I drew the available memory on my viewport: on my own computer, once I return to the main selection screen (thus releasing/disposing all of the textures) memory goes back to normal, but the *same* code on the XPe machine shows a leak, i.e. ~20 MB of textures that were never freed.
3. I tried to rule out other memory leaks by not using textures at all - everything works just fine, so it must be this problem.
4. For every texture, after leaving for the selection screen I .Dispose() the texture and the Sprite object I use to display it, then set both to null.
5. I tried the dbmon.exe utility from the DX SDK, and it reports no memory leak - presumably that memory is released after the program quits, when there are no references I might have missed.
6. After I quit the app on the XPe machine, memory does free up...
7. I tried calling GC.Collect after quitting the module and returning to the selection screen.
8. I tried GC.WaitForPendingFinalizers after GC.Collect.
9. I also tried putting Thread.Sleep(1000) before GC.Collect.
10. In a desperate attempt, I tried disposing the D3D Device and recreating it. On my computer it worked just fine; on the target XPe machine... well, you know, memory ran out after a couple of rounds.
11. .Reset() on the device doesn't work either.
12. EvictManagedResources() - same effect.
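To be concrete, the cleanup described in steps 4 and 7-8 looks roughly like this (the class and field names are just for illustration, not my actual code):

```csharp
using System;
using Microsoft.DirectX.Direct3D;

class ModuleScreen
{
    private Sprite sprite;
    private Texture[] textures;

    // Step 4 above: dispose the Sprite and every Texture, drop the
    // references, then (steps 7-8) push the GC to run finalizers now.
    public void ReleaseResources()
    {
        if (sprite != null) { sprite.Dispose(); sprite = null; }

        if (textures != null)
        {
            foreach (Texture t in textures)
                if (t != null) t.Dispose();
            textures = null;
        }

        GC.Collect();
        GC.WaitForPendingFinalizers();
    }
}
```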
I'm running out of ideas and I really want to put this to work. Your kind help will be most appreciated, -Gil

Your steps for tracking down the memory leak sound reasonable and fairly exhaustive to me.

Can you reproduce this sort of pattern with any other code? SDK samples or benchmarks for example?

Have you tried running PIX for Windows against the XPe environment (assuming that's possible!)? You should be able to step through the call stream to see which resources are alive as far as the underlying native API is concerned. This should give you some insight into whether it's a more general problem or something to do with the .NET layer (e.g. GC'ing).

The next thing to look into is the driver layer. Can you get the RefRast up and running on XPe? If so, that bypasses any hardware-vendor code and you can prove or eliminate the possibility of this being a hardware/driver bug.
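In Managed DirectX, switching to the reference rasterizer is just a matter of the DeviceType passed at device creation; a minimal sketch (renderForm stands in for whatever Control you render into):

```csharp
using System.Windows.Forms;
using Microsoft.DirectX.Direct3D;

Form renderForm = new Form();

PresentParameters pp = new PresentParameters();
pp.Windowed = true;
pp.SwapEffect = SwapEffect.Discard;

// DeviceType.Reference selects Microsoft's software reference rasterizer,
// bypassing the vendor driver entirely (very slow, but deterministic).
Device device = new Device(0, DeviceType.Reference, renderForm,
                           CreateFlags.SoftwareVertexProcessing, pp);
```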

It's been a while since I played with XPe devices, but I remember some of them suffered the same driver hell as laptops, where IHVs didn't update drivers for years at a time.

hth
Jack

The chronicles of my ever continuing effort to resolve this problem:

13. Reference Rast. won't work on Windows Embedded.
14. Same with PIX
15. Software surfaces produce the same result.
16. Tried allocating a new Texture and then releasing it - WORKED OUT JUST FINE!
17. Tried allocating a new Texture, rendering a Bitmap onto it and then releasing it, to rule out other memory problems - WORKS.

It appears that TextureLoader.LoadFromFile() does load the texture, but renders it (oh, the pun in that) impossible to Dispose().

Does anyone have a good snippet to bypass TextureLoader.LoadFromFile()?

I've tried using the Bitmap class, but it is horrendously slow in comparison to TextureLoader.LoadFromFile().
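For reference, the Bitmap-based path I tried was something like the following - decode the file with GDI+, then build the texture from the Bitmap so TextureLoader is never involved (the Texture constructor overload taking a Bitmap is the one I mean; the helper name is made up):

```csharp
using System.Drawing;
using Microsoft.DirectX.Direct3D;

static class TextureUtil
{
    // Decode the image with GDI+, then create the texture from the
    // Bitmap, avoiding TextureLoader entirely. The Bitmap can be
    // disposed immediately; the Texture keeps its own copy of the
    // pixels. Correct, but the GDI+ decode is what makes it slow.
    public static Texture Load(Device device, string path)
    {
        using (Bitmap bitmap = new Bitmap(path))
        {
            return new Texture(device, bitmap, Usage.None, Pool.Managed);
        }
    }
}
```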
