

Memory leaks when Windows shuts down the display

5 replies to this topic

#1 Tispe   Members   -  Reputation: 1187


Posted Today, 11:25 AM

I posted this in another thread but I thought it needed a thread on its own.

 

When the Windows power saving settings turn off the display, my application stops releasing resources the way it normally does. This means that if the user goes AFK and the default 20 minutes pass, all my GPU memory allocations start to pile up. After a while, ID3D11Device::CreateBuffer returns E_OUTOFMEMORY.

 

I'm not sure whether it is the COM smart pointers (CComPtr<ID3D11Buffer>) that stop releasing, or whether it happens further down in WDDM or the GPU drivers.
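For context on the smart-pointer side: CComPtr calls Release() from its destructor, so a per-frame buffer should be freed as soon as the wrapper goes out of scope, regardless of whether the display is on. A minimal sketch of that mechanism (FakeComObject and ScopedComPtr are stand-ins for illustration, not types from the thread; real code would use ID3D11Buffer and ATL's CComPtr):

```cpp
#include <cassert>

// Stand-in for a COM object with the usual manual ref count.
struct FakeComObject {
    int refs = 1;
    void AddRef()  { ++refs; }
    int  Release() { return --refs; }
};

// Sketch of what CComPtr does: Release() in the destructor, so the
// resource is dropped when the smart pointer leaves scope.
template <typename T>
class ScopedComPtr {
    T* p_ = nullptr;
public:
    explicit ScopedComPtr(T* p) : p_(p) {}
    ~ScopedComPtr() { if (p_) p_->Release(); }
    ScopedComPtr(const ScopedComPtr&) = delete;
    ScopedComPtr& operator=(const ScopedComPtr&) = delete;
    T* get() const { return p_; }
};
```

If allocations pile up only while the display is off, the destructors are presumably still running; the question is whether Release() is actually returning memory to the driver at that point.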

 

As you can see in the screenshot below, the program runs stably; then, when Windows turns off the display, GPU memory usage spikes to its maximum and, on the CPU side, the Commit Charge builds up. If it hits 100%, ID3D11Device::CreateBuffer returns E_OUTOFMEMORY.

 

Any help on how to alleviate or circumvent this is much appreciated.

 

[Attached screenshot: memleak]

 




#2 braindigitalis   Crossbones+   -  Reputation: 2864


Posted Today, 11:37 AM

Are you checking return codes of every single DirectX call e.g. with if(FAILED())?

 

If not, add the checks and try again, just to make sure there is nothing you have forgotten on your end that leaves you calling a failing function over and over.
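A minimal sketch of that kind of check (the ThrowIfFailed helper name is my own; the HRESULT stand-ins below exist only so the sketch compiles outside Windows, where real code gets them from <windows.h>):

```cpp
#include <cstdint>
#include <stdexcept>
#include <string>

// Stand-ins for the Win32 definitions, for illustration only.
typedef std::int32_t HRESULT;
#define FAILED(hr) ((HRESULT)(hr) < 0)
constexpr HRESULT S_OK          = 0;
constexpr HRESULT E_OUTOFMEMORY = static_cast<HRESULT>(0x8007000E);

// Throw on any failing HRESULT so a bad call can't be silently
// retried every frame.
inline void ThrowIfFailed(HRESULT hr, const char* what)
{
    if (FAILED(hr))
        throw std::runtime_error(std::string(what) +
                                 " failed, hr=" + std::to_string(hr));
}
```

Usage would then look something like `ThrowIfFailed(device->CreateBuffer(&desc, &initData, &buffer), "CreateBuffer");`, so the first failure surfaces immediately instead of leaking per frame.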

 

You can also intercept WM_SYSCOMMAND in your message loop to watch for the screensaver coming on or power saving being enabled, and pause your game loop at that point, which will also help. Personally, in my own game I return 0 here, so that power save and the screensaver are disabled while my game is active; that would work around your problem.
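A sketch of the interception logic (the ShouldBlockSysCommand helper name is my own; the SC_ values are the documented Win32 constants, defined here only so the sketch is self-contained — real code takes them from <windows.h>). Note that per the WM_SYSCOMMAND documentation, the system uses the low four bits of wParam internally, so they must be masked off before comparing:

```cpp
#include <cstdint>

// Stand-ins for the Win32 definitions, for illustration only.
typedef std::uintptr_t WPARAM;
constexpr WPARAM SC_SCREENSAVE   = 0xF140;
constexpr WPARAM SC_MONITORPOWER = 0xF170;

// Decide whether to swallow a WM_SYSCOMMAND. The low four bits of
// wParam are used internally by the system, so mask them off first.
inline bool ShouldBlockSysCommand(WPARAM wParam)
{
    switch (wParam & 0xFFF0) {
    case SC_SCREENSAVE:     // screensaver is about to start
    case SC_MONITORPOWER:   // display is about to power down
        return true;        // return 0 from WndProc, skip DefWindowProc
    default:
        return false;       // let DefWindowProc handle it normally
    }
}
```

In the window procedure this would sit under `case WM_SYSCOMMAND:` — return 0 when the helper says to block, otherwise fall through to `DefWindowProc(hwnd, msg, wParam, lParam)`.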


Edited by braindigitalis, Today, 11:38 AM.


#3 Tispe   Members   -  Reputation: 1187


Posted Today, 11:48 AM


Are you checking return codes of every single DirectX call e.g. with if(FAILED())?

 

Yes, everything gets checked and I throw if failed.

 


You can also intercept WM_SYSCOMMAND in your message loop to watch for the screensaver coming on or power saving being enabled, and pause your game loop at that point, which will also help. Personally, in my own game I return 0 here, so that power save and the screensaver are disabled while my game is active; that would work around your problem.

 

Is it the SC_MONITORPOWER wParam you return 0 for? So returning 0 for this message intercepts it and stops the monitor from shutting down, while passing it on to DefWindowProc hands it back to Windows so the display shuts down as normal?



#4 braindigitalis   Crossbones+   -  Reputation: 2864


Posted Today, 01:34 PM


Is it the SC_MONITORPOWER wParam you return 0 for? So returning 0 for this message intercepts it and stops the monitor from shutting down, while passing it on to DefWindowProc hands it back to Windows so the display shuts down as normal?

 

Yeah, that's the one. In the event that this is a driver bug, it would be an effective workaround, albeit slightly impolite to the user, since it overrides their power saving settings :D



#5 Tispe   Members   -  Reputation: 1187


Posted Today, 04:01 PM

The intercept works in a Release build but not in a Debug build -.- I guess this can work, but I'd rather solve the underlying issue, which is the leak itself.



#6 Buckeye   GDNet+   -  Reputation: 7754


Posted Today, 07:43 PM


all my GPU memory allocations starts to pile up.

 

When the app is running "normally," what are you allocating (and apparently later deallocating), and what specific conditions trigger the deallocation?


Please don't PM me with questions. Post them in the forums for everyone's benefit, and I can embarrass myself publicly.





