angelmu88

Monitoring GPU activity for a DirectX app


Hi everybody. I'm currently working on a DirectX project and I'm using shaders extensively. I don't know why, but every time I run my program, after a while the computer fan starts working hard, like when playing a videogame with hundreds of enemies on screen. The problem is that if I run my app on my laptop, it shuts down after a short time, and I think that's because of the high temperatures the GPU or the CPU reaches while my DirectX game runs (if I don't launch my program, the laptop never shuts down, even when I'm using a graphics program like 3D Studio).
I would like to know some good tools for finding the problem, maybe a GPU monitor or a CPU monitor. My graphics card is an Nvidia, my OS is Windows 7, and I'm using Visual Studio and C++.


[quote name='angelmu88' timestamp='1330862080' post='4919147']
Sorry, but I don't know much about v-sync. How could v-sync help me?
[/quote]

It prevents you from drawing more frames than your display can show. In some cases this may reduce the workload of your GPU/CPU.

Cheers!

[Edit] Or, more precisely, it'll synchronize your "present" with the update frequency of the screen.
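If you're on Direct3D 9 (the thread doesn't say which version you use, so this is an assumption), vsync comes down to the presentation interval you request when creating the device. A minimal sketch; d3d, hWnd and device stand in for your existing Direct3D object, window handle and device pointer:

[code]
// Minimal sketch, assuming Direct3D 9. d3d, hWnd and device are
// placeholders for the objects your app already has.
#include <d3d9.h>

D3DPRESENT_PARAMETERS d3dpp = {0};
d3dpp.Windowed             = TRUE;
d3dpp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
d3dpp.BackBufferFormat     = D3DFMT_UNKNOWN;
d3dpp.hDeviceWindow        = hWnd;
d3dpp.PresentationInterval = D3DPRESENT_INTERVAL_ONE;          // vsync on
// d3dpp.PresentationInterval = D3DPRESENT_INTERVAL_IMMEDIATE; // vsync off

d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                  D3DCREATE_HARDWARE_VERTEXPROCESSING,
                  &d3dpp, &device);
[/code]

(On D3D10/11 it's the SyncInterval argument of IDXGISwapChain::Present instead; see below.)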


[quote name='angelmu88' timestamp='1330862080' post='4919147']
Sorry, but I don't know much about v-sync. How could v-sync help me?

It prevents you from drawing more frames than your display can show. In some cases this may reduce the workload of your GPU/CPU.

Cheers!

[Edit] Or, more precisely, it'll synchronize your "present" with the update frequency of the screen.
[/quote]

OK, thank you. I'm going to give it a try and see what happens ;)

NVIDIA Inspector will let you monitor GPU load, VRAM usage, clock speed and temperature.

If what you are drawing so far is relatively simple, your app will push the GPU as fast as it can to draw more, and you will get thousands or even tens of thousands of fps, with your GPU at 100% load.

Presenting with vsync will make your Present call block until a vsync occurs (every 1/60th of a second, for example), so if you had 10,000 fps before, your GPU load might drop as low as 1% with vsync.

It's as easy as calling Present(1, 0) instead of Present(0, 0).
IDXGISwapChain::Present
You could also wait for more vsyncs to save power if you like, for example when your app loses focus.
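In a render loop that might look like this (a sketch; swapChain and RenderFrame stand in for your own swap chain pointer and drawing code):

[code]
// Sketch of a render loop presenting through a DXGI swap chain.
// swapChain and RenderFrame are placeholders for your own code.
while (running)
{
    RenderFrame();

    swapChain->Present(1, 0);    // block until the next vsync: capped at
                                 // the refresh rate (60 fps on a 60 Hz display)
    // swapChain->Present(0, 0); // no vsync: the GPU runs flat out
    // swapChain->Present(2, 0); // every 2nd vsync (~30 fps), e.g. to save
                                 // power while the app doesn't have focus
}
[/code]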


[quote name='melak47' timestamp='1330903131' post='4919289']
If what you are drawing so far is relatively simple, your app will push the GPU as fast as it can to draw more, and you will get thousands or even tens of thousands of fps, with your GPU at 100% load.
[/quote]

Yes, you are right. I just checked it with a temperature monitor and the temperature rises to 100 °C even with the simplest version of my app, running at a few thousand fps.
Could limiting the frame rate to 60 fps be another solution?


[quote name='melak47' timestamp='1330903131' post='4919289']
If what you are drawing so far is relatively simple, your app will push the GPU as fast as it can to draw more, and you will get thousands or even tens of thousands of fps, with your GPU at 100% load.

Yes, you are right. I just checked it with a temperature monitor and the temperature rises to 100 °C even with the simplest version of my app, running at a few thousand fps.
Could limiting the frame rate to 60 fps be another solution?
[/quote]

OK, forget about my previous question; I've just realized that vsync locks the fps to 60 ;)
You were right: with vsync the fps are locked to 60 and the graphics card fans don't go crazy.
But now I have a question. When programming, I like to see the impact on performance of each new thing I add, and with the fps locked I can't see that impact. So I think that, at least while developing, I'll turn vsync off, but now I'm scared of damaging the graphics card with the high temperatures. Does anyone know what temperature a graphics card can reach without being damaged? As I said, with vsync off mine reaches 100 °C or so.
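For what it's worth, one way to keep measuring with vsync on is to track milliseconds per frame on the CPU side instead of fps. A minimal sketch using QueryPerformanceCounter; UpdateAndRender and swapChain are placeholders for your own code:

[code]
// Minimal sketch: time each frame's CPU work with QueryPerformanceCounter,
// sampled before Present so the vsync wait doesn't hide the cost.
#include <windows.h>

LARGE_INTEGER freq, t0, t1;
QueryPerformanceFrequency(&freq);

while (running)
{
    QueryPerformanceCounter(&t0);

    UpdateAndRender();           // game logic + draw-call submission

    QueryPerformanceCounter(&t1);
    double frameMs = 1000.0 * (t1.QuadPart - t0.QuadPart) / freq.QuadPart;
    // Display frameMs somewhere: it keeps growing as you add work, even
    // while the on-screen fps stays pinned at 60.

    swapChain->Present(1, 0);    // vsync on
}
[/code]

Note this only captures CPU-side cost; draw calls are queued, so heavy GPU work can still hide inside Present. For GPU-side numbers, a profiler such as PIX (it ships with the DirectX SDK) is the usual tool.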
