McZonk

Reading the GPU temperature

This topic is 5045 days old, which is more than the 365-day threshold we allow for new replies.


Is it possible to read out the GPU temperature with an API? Something like int getGPUTemp(); Any ideas? It would be a useful feature in a game. I think there will be different ways for NVIDIA, Matrox and ATI cards, but is it possible?

There is no public API for that.
Though some vendors bundle monitoring software with their cards, you don't have general access to such a feature (some [most?] cards don't even have a temperature sensor on the GPU).

Heh, it would be fun to have on benchmarks though. Not just seeing how many FPS it can pull during tests, but how hot it's getting :P

Quote:
Original post by Rocket05
I'm sure it's possible through some ugly hacks, but what use would you have for it?

It's not possible without vendor-specific libraries, which do not exist for every card and model (as I already mentioned, a lot of cards simply don't have thermal sensors).

McZonk: You could, however, contact your card manufacturer (NVidia or ATI) for information about that. Maybe they are kind enough to provide you with some pointers.


No, I'm sure it's possible somehow through reverse engineering.

One way I could think of doing it is to run one of the video card diagnostic programs that come with most high-end cards (the ones that can read things like GPU temp, clock speed, bandwidth, etc.) and find the location in memory where that diagnostic program stores the GPU temp. Ugly solution, but it would work.

If you were a real genius, you could reverse engineer the diagnostic program and nab the code it uses to read the GPU temp, but I really don't think you want to go through that much trouble?

Quote:
Original post by Rocket05
No, I'm sure it's possible somehow through reverse engineering.


Not if:
1) The GPU doesn't have a heat sensor on it
2) The card never transmits the heat levels, but only uses a sensor to adjust the on-board fan speed, or to shut off in the case of overheating
3) You can't or don't want to create and install a driver (perhaps because you don't want to void your warranty), assuming the card driver doesn't expose an API for reading it.

But that's all card-specific.

OP: It will be possible for some cards, but not for others. Check out any vendor specs you can find on their websites, and check Google.

As an idea... you might want to check out the DirectX Driver Development Kit (plus its forums/newsgroups). Not sure if it's restricted access, but if anyone knows if/how this is possible, the driver writers will.

hth
Jack

Quote:
Original post by Rocket05
If you were a real genius, you could reverse engineer the diagnostic program and nab the code it uses to read the GPU temp, but I really don't think you want to go through that much trouble?


ExtEscape() is generally used for passing stuff like this between the driver and user-mode apps; the tricky part is figuring out exactly what values and data structures you need to send. There are only a few 'standard' driver escapes. Anything else, like an escape for reading GPU temperatures, is completely up to the vendors to define any way they like. There is also no guarantee that the escape will be the same between different cards from the same company; it may well get expanded or changed to add new functionality in later driver releases.

If you were really interested in getting hold of this, you could create a proxy version of gdi32.dll that logs ExtEscape() calls and use that data to figure out the parameters being used and what they mean.

All of this does of course assume that the graphics card you're using - and the driver that is running it - even supports temperature monitoring in the first place.

Using WMI is an alternative way to get at stuff exposed by the driver. As mentioned, exactly which stuff is exposed depends entirely on the driver in question.
