
ATI and memory leaks


Old topic!
Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

6 replies to this topic

#1 Naruto-kun   Members   -  Reputation: 333

0 Likes

Posted 25 April 2014 - 08:48 AM

Hi guys

 

So a while back I was trying to work out why some users of a DX11 module I created were experiencing severe memory leaks, with textures not being released, while I and many others had no such problems.

 

I have narrowed it down to one common denominator. They are all running ATI video cards.

 

So my question is: is there anything special that can be done when setting up the DX device etc. to play nice with ATI?

 

Thanks

JB




#2 mhagain   Crossbones+   -  Reputation: 7812

2 Likes

Posted 25 April 2014 - 09:10 AM

The only thing that springs to mind is that all of the D3D11 methods that bind objects to the pipeline will hold references to those objects, so unless you're unbinding them before destruction you're going to leak those references.  Maybe other drivers detect this happening and silently release them behind your back, whereas ATI are less forgiving?

 

The simplest way to unbind everything is to call ID3D11DeviceContext::ClearState in the appropriate place (e.g. before shutdown or when loading a new map), so if you're not already doing that, I'd suggest doing it first and seeing if the problem still reproduces.


It appears that the gentleman thought C++ was extremely difficult and he was overjoyed that the machine was absorbing it; he understood that good C++ is difficult but the best C++ is well-nigh unintelligible.


#3 Hodgman   Moderators   -  Reputation: 29383

0 Likes

Posted 25 April 2014 - 09:14 AM

How are you diagnosing/detecting the leaks in the first place?



#4 Naruto-kun   Members   -  Reputation: 333

1 Like

Posted 25 April 2014 - 01:21 PM

My DX11 modules inject graphics into a DX9 program. They only do so on command, not every frame (it is a render-to-texture cockpit lighting system). The users who are experiencing this problem are all ATI users, and they have given me video and screenshot evidence of the program's VAS usage climbing as they change light intensity settings (Release is called on the resources within a few seconds of stopping at a particular setting, since the program is 32-bit and running close to the limits of 32-bit VAS). ClearState has been tried, but to no avail. None of the NVIDIA users have this problem, however, and my own tests showed only a momentary spike in VAS, within the limits of what I would expect given the amount of resources I am handling, which drops back down almost immediately after the unloading is complete.

 

I found through Google that this occurs in several DX11 games on ATI cards as well, so it's definitely not an isolated issue.


Edited by Naruto-kun, 25 April 2014 - 01:23 PM.


#5 Jason Z   Crossbones+   -  Reputation: 4905

1 Like

Posted 25 April 2014 - 02:25 PM

If the reference count on your textures is 1 and you call Release on your COM pointer, then that's the best you can do. If there is an internal driver bug that keeps the memory floating around, then that should be handled/corrected by AMD... Have you reported the behavior? Or have you tried to capture some details with the graphics debugger / PIX? You will have to provide a repro case to them for there to be any chance of getting the issue fixed.

 

One other point - just because it works on NVidia cards doesn't mean that you are in the clear.  Drivers can sometimes be more lenient than they are supposed to be, so it is possible that you are actually doing something incorrectly but the NV drivers happen to handle it in a desired manner.

 

Even so, if you are using shared resources between D3D9 and D3D11, then my guess is that you are probably right that there is a driver issue in handling the memory.



#6 Naruto-kun   Members   -  Reputation: 333

0 Likes

Posted 25 April 2014 - 02:33 PM

It happens with non-shared resources as well. They are actually the worst offenders, because they take the form of several textures per render target (they get blended together). Releasing them doesn't free the memory on the ATI side.



#7 alek314??   Members   -  Reputation: 276

0 Likes

Posted 28 April 2014 - 05:26 AM

Our project uses multi-threaded rendering with DX9. When threading is not handled properly, DX9 would sometimes give us an E_OUTOFMEMORY error, or it would just work, or the computer would crash and reboot. The debug runtime would always print warning messages when these things happened. BTW, we use an AMD card.


Edited by alek314??, 28 April 2014 - 05:27 AM.




