teo2006

weird memory leak with rendertargets


Recommended Posts

Hello. For some days now I've been struggling with a big problem. I'm working with C++/DX8.1. In my engine I use quite a lot of render targets (which are not allocated/released every frame). It works fine on most of the computers I tested, but on some I have a big problem with memory usage: on those PCs, at a certain point I can see in the WinXP task manager that a lot of RAM is allocated for no apparent reason. Furthermore, this mis-allocated RAM is not released when I close the application. Stranger still, to actually release this memory I have to start another DirectX application, which somehow manages to free the leftover memory.

I tried to track the problem down by emptying my engine as much as possible. Currently it loads some textures and vertex/index buffers as managed, then creates some render targets with:

myDevice->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET, D3DFMT_X8R8G8B8, D3DPOOL_DEFAULT, surface);

I don't process these render targets any further and use them as they are (getting an interesting colour-blur texture effect ;) ) with:

myDevice->SetTexture(0, surface);

I really think this is not a *classic* memory leak, because if I do exactly the same thing with D3DUSAGE_DEFAULT instead of D3DUSAGE_RENDERTARGET, I don't get this trouble. I also found that by calling

myDevice->ResourceManagerDiscardBytes((DWORD)0);

every frame, the problem goes away... but since I assume that is rather expensive, I'd like to find another solution (by the way, I already take care to call this when creating a default-pool resource right after a managed one).

Does anyone have any idea of:
- what is happening? (memory fragmentation, a limited number of render targets, ...?)
- how to solve it?
- why this other DirectX application manages afterwards to free the memory my app has leaked?

Thanks a lot!
tEo.

PS: Sorry for the length of this post and its terribly poor English :$ ...
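For clarity, here is a minimal compilable sketch of the calls described above, assuming an already-created IDirect3DDevice8* named myDevice; the 256x256 size and the name pBlurTarget are placeholders, not values from the actual engine.

// Minimal sketch of the calls described above (C++ / DX8.1).
// Assumes a valid IDirect3DDevice8* named myDevice; the dimensions and the
// name pBlurTarget are placeholders.
#include <d3d8.h>

IDirect3DTexture8* pBlurTarget = NULL;

// Created once at startup, not every frame; render targets must live in D3DPOOL_DEFAULT.
HRESULT hr = myDevice->CreateTexture(
    256, 256,                 // width, height (placeholders)
    1,                        // one mip level
    D3DUSAGE_RENDERTARGET,    // the usage flag that triggers the problem
    D3DFMT_X8R8G8B8,
    D3DPOOL_DEFAULT,
    &pBlurTarget);

// Per frame: bind the render target as an ordinary texture on stage 0.
if (SUCCEEDED(hr))
    myDevice->SetTexture(0, pBlurTarget);

// The workaround mentioned above: discard managed resources every frame.
// It hides the symptom but is presumably expensive.
myDevice->ResourceManagerDiscardBytes(0);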

If it works fine on some computers and not others and your app is executing the same code on all of them (i.e. it isn't changing behavior based on caps), then that would seemingly point to a driver issue.

You could always enable the D3D debug runtime and watch the debugger output pane when your app exits to see if you've made a mistake and haven't released something properly. You could also try to run against the reference rasterizer on one of the computers that exhibits the problem in an attempt to see if it's driver related.

What's D3DUSAGE_DEFAULT? That isn't defined in any of my D3D headers.

I made a mistake in my post: what I'm actually using in place of D3DUSAGE_RENDERTARGET is 0, not a nonexistent D3DUSAGE_DEFAULT... sorry :$

I already tried running my app on one of the problematic PCs with the DirectX debug runtime and don't get any feedback telling me that I forgot to release something. Even with "Break on memory leak" checked, nothing happens.

I also already updated all the drivers.

Don, what is the "reference rasterizer" you're speaking about?

By the way, I forgot two things in my previous post:
-if I never use the created RenderTargets (by never assigning them to a texture stage), I don't get the problem.
-in most cases, the unexplained, unreleased memory gets allocated when I bring the mouse into the area where I render (the engine runs in windowed mode).

> -if I never use the created RenderTargets (by never assigning them to a texture stage), I don't get the problem.

Trace the reference count of your render target on exit (::Release returns the remaining count after decrementing). And try SetTexture(stage # of the RT, NULL).
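A rough sketch of that suggestion, assuming the render target is an IDirect3DTexture8* named pBlurTarget bound on stage 0 (adjust the stage number and name to your engine); needs <windows.h> and <stdio.h>:

// Unbind the render target before releasing it, then log the count
// that Release() returns (the count left after decrementing).
myDevice->SetTexture(0, NULL);

if (pBlurTarget)
{
    ULONG refs = pBlurTarget->Release();
    char msg[64];
    sprintf(msg, "RT refcount after Release: %lu\n", refs);
    OutputDebugStringA(msg);
    pBlurTarget = NULL;
}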

Just a guess...

Thanks again for the reply... but it still does not help me :'(..

As I wrote, I don't have any *classic* memory leak. I'm already checking the reference count when releasing my renderTargets... and *oooh surprise* it returns 0 as it should...

I'm also setting all the texture stages to NULL after using the render target...

The Reference Rasterizer is a driver supplied in the DXSDK that is designed to generate correct results so that you can compare its output to a hardware driver. It's horribly slow but sometimes you can use it to determine if an actual hardware driver is doing something odd.

To use the Reference Rasterizer, specify D3DDEVTYPE_REF in the call to CreateDevice.
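For example, a hedged sketch, assuming pD3D, hWnd and d3dpp are already set up the same way as for your normal HAL device:

// Only the device type changes compared to the usual D3DDEVTYPE_HAL call.
// The REF device has no hardware vertex processing, so request software VP.
IDirect3DDevice8* pRefDevice = NULL;
HRESULT hr = pD3D->CreateDevice(
    D3DADAPTER_DEFAULT,
    D3DDEVTYPE_REF,
    hWnd,
    D3DCREATE_SOFTWARE_VERTEXPROCESSING,
    &d3dpp,
    &pRefDevice);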

Are all of the computers that exhibit this problem using the same brand of video card?


