I am monitoring my graphics card memory usage with an ATI tool called Memory Viewer. I can see that the amount of memory in use gradually increases after running my game about 30 times (I don't mean running 30 instances at the same time; I mean running it once, exiting, then running it again, repeated 30 times). I was unaware that it was possible for a Windows DirectX game to leak graphics memory, because I thought Windows guaranteed to release the resources when the game process exits, just as it does with system RAM. Am I correct? If so, it must be another process chomping away at the memory, but when I end all processes the graphics memory never falls back down to where it started.
The first things that come to mind are driver issues, a bugged tool, or your program continuing to run hidden in the background.
This is, however, just a curiosity; the real issue is your application's memory leaks and the removal of them (if any).
It increases by roughly 2% every 10 loads, though it does fluctuate. Looking at my code, I cannot see how I could possibly be leaking resources, because I have a texture wrapper class that creates the DirectX texture in its constructor and releases it in its destructor. The constructor is private, so instances can only be created by a static create function that returns a shared pointer to the texture wrapper class. So, on exiting the game, when the shared pointer's reference count reaches zero, the texture it refers to is released. I have confirmed this works correctly.
All of this is part of a gaming library that I wrote. I have made similar games using the same library, and they do not seem to affect the amount of free memory when I repeatedly load and exit them.
The reason this is a problem is that if I load and exit for long enough, the game eventually starts running very slowly due to the lack of available VRAM, which causes thrashing.
I am using Windows XP.
I am really out of ideas, because I was convinced that the operating system should release the VRAM resources when the process has exited (I am certain the process has exited cleanly). However, none of my other games suffer from this problem, even though they are based on the same libraries.
Just out of curiosity, could this be a contributing factor?
There is another application running, which I also wrote, that uses DirectX. Every time I start the game, this other application releases its DirectX device and waits until the game has exited, at which point it exits itself and is then loaded again by another process. I know this seems an odd thing to do, but there is a good reason: its purpose is to display attract screens for the games I have written, and when a game is played it releases the device.
The leak does disappear when I am not running this other process.
However, when running this process alongside my other games, there still seems to be no leak. It only occurs when running this process together with this one specific game.
Also, every time I quit the game, this other process quits as well and is restarted, so Windows should be freeing all of its resources too, and it should not be leaking.
Everything you said is correct apart from number two. The advertising process releases its device when it detects that a game is running, not when the game has finished running.
I'm not 100% convinced that the advertising process does shut down cleanly, but I don't understand why that should make a difference: when it exits, shouldn't Windows clean up after it whether it shut down cleanly or not? Don't get me wrong, I do want everything to close down cleanly; it's just that if it isn't closing down cleanly, I don't believe that should make a difference. Please correct me if I'm wrong.
Someone mentioned earlier that the graphics driver may keep memory allocated even when it is not in use, to meet the anticipated demands of applications. Could someone please explain this further?