graphics card memory leak

Started by
6 comments, last by codetemplar 13 years ago
Hi all,

I am monitoring my graphics card memory usage with an ATI tool called Memory Viewer. I see that the amount of memory in use gradually increases after running my game about 30 times (I don't mean running 30 instances at once; I mean running it once, exiting, then running it again, repeated 30 times). I was unaware it was even possible for a Windows DirectX game to leak graphics memory, because I thought Windows guaranteed to release the resources when the game process exits, just like it does with system RAM. Am I correct? If so, it must be another process chomping away at the memory, but when I end all processes the graphics memory never falls back down to where it started.

Thanks
First thing that comes to mind is a driver issue, a bugged tool, or your program continuing to run hidden in the background.
That is just a curiosity, though; the real issue is whether your application leaks memory and, if so, removing those leaks.
This isn't really proof that your game is causing a memory leak in VRam (in the same way that Windows Task Manager shouldn't be used to demonstrate regular memory leaks).

It's possible that the driver is resizing internal allocations to meet the demands of the applications using it - "warming up" in a way.

...Or there could be a bug in the ATI driver... in which case, it's not your fault and there's not much you can do.

Which OS are you using? How much does the reported value increase by each time you run your game?
First of all, thank you both for your responses.

It increases by about 2% every 10 loads, although it does fluctuate. Looking at my code I cannot see how I could possibly be leaking resources, because I have a texture wrapper class which creates the DirectX texture in its constructor and releases it in its destructor. The constructor is private, so the wrapper can only be created through a static create function which returns a shared pointer to it. So, upon exiting the game, when the shared pointer's reference count reaches zero, the texture it refers to is released. I have confirmed this works correctly.
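For reference, here is a minimal sketch of the pattern I'm describing (names are illustrative, not my actual code; assuming Direct3D 9 since this is on Windows XP, and std::shared_ptr standing in for whatever shared pointer your toolchain provides):

    // Hypothetical RAII texture wrapper: private constructor, static factory
    // returning a shared pointer, texture released in the destructor.
    #include <d3d9.h>
    #include <memory>

    class Texture
    {
    public:
        // Factory is the only way to create one; the underlying D3D texture
        // is released when the last shared_ptr reference goes away.
        static std::shared_ptr<Texture> Create(IDirect3DDevice9* device,
                                               UINT width, UINT height)
        {
            return std::shared_ptr<Texture>(new Texture(device, width, height));
        }

        ~Texture()
        {
            if (m_texture)
                m_texture->Release();   // frees the VRAM allocation
        }

        IDirect3DTexture9* Get() const { return m_texture; }

    private:
        Texture(IDirect3DDevice9* device, UINT width, UINT height)
            : m_texture(NULL)
        {
            device->CreateTexture(width, height, 1, 0, D3DFMT_A8R8G8B8,
                                  D3DPOOL_MANAGED, &m_texture, NULL);
        }

        Texture(const Texture&);            // non-copyable
        Texture& operator=(const Texture&);

        IDirect3DTexture9* m_texture;
    };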

All of this is part of a gaming library that I have written. I have made similar games using the same library and they do not seem to affect the amount of memory left when I constantly load and exit them.

The reason this is a problem is that if I load and exit for long enough, the game eventually starts running very slowly due to the lack of available VRAM, which causes thrashing.

I am using Windows XP.

I am really out of ideas because I was convinced that the operating system should release the VRAM resources when the process has exited (I am certain the process has exited cleanly). However, none of my other games seem to suffer from this problem even though they are based on the same libraries.

Just out of curiosity, could this be a contributing factor:

There is another application running that I wrote which also uses DirectX. Every time I start the game, this other application releases its DirectX device and sits and waits until the game has exited, at which point it exits itself and is then loaded again by another process. I know this seems an odd thing to do, but there is good reason: its purpose is to display attract screens for the games I have written, and when a game is played it releases the device.
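For what it's worth, "releasing the device" here is along these lines (a rough sketch with hypothetical names, assuming Direct3D 9; the real attract-screen app may differ):

    // Hypothetical teardown the attract-screen app runs when it detects a game starting.
    // Every COM interface must be released; note that in D3D9, resources created by the
    // device (textures, vertex buffers, ...) hold their own reference to the device, so
    // the device is not actually destroyed until those resources are released as well.
    #include <d3d9.h>

    void ShutdownD3D(IDirect3DDevice9*& device, IDirect3D9*& d3d)
    {
        if (device)
        {
            device->Release();   // drops our reference to the device
            device = NULL;
        }
        if (d3d)
        {
            d3d->Release();      // drops the IDirect3D9 interface itself
            d3d = NULL;
        }
    }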

Thanks for any help
Try it out: disable this additional process stuff and see if you still get the leak.
The leak does disappear when this other process is not running.
However, when running this process alongside my other games there still seems to be no leak. It only happens with this process plus the one specific game.
Also, every time I quit the game, this other process also quits and is restarted, so Windows should be freeing all of its resources too; it should not be leaking.

thanks
This sounds very complex. Allow me to summarise my understanding: you have three processes, a monitor, a game, and an "advertising" process.


  1. The monitor restarts this "advertising" process when one of your games finishes.
  2. Your "advertising" process detects when your games aren't running and relinquishes its device. It quits when the game finishes.
  3. The game is a standard process.

I don't see why both the monitor and advert program need to detect the game closing.

Does the monitor allocate any memory or other resources? Are you 100% sure that the ad process finishes completely and cleanly?

Are you using the most up to date graphics card drivers?

We can but guess now. Unless you want to post this troublesome process we can't be more specific really.
Everything you said is correct apart from number two. The advertising process releases its device when it detects that a game is running, not when the game has finished running.

I'm not 100% convinced that the advertising process does shut down cleanly, but I don't understand why that should make a difference, because when it does exit, shouldn't Windows clean up after it whether it shut down cleanly or not? Don't get me wrong, I do want everything to close down cleanly; it's just that if it isn't closing down cleanly, I don't believe it should make a difference. Please correct me if I'm wrong.

Someone mentioned earlier that the driver may allocate more memory to meet the demands of the applications using it. Could someone please explain this further?

Drivers are up to date.
Thanks all for your help

