Reading the GPU temperature

Started by
9 comments, last by Rocket05 19 years, 4 months ago
Is it possible to read out the GPU temperature with an API? Something like int getGPUTemp(); Any ideas? It would be a useful feature in a game. I think there will be different ways for NVIDIA, Matrox and ATI cards. But is it possible?
There is no public API for that.
Though some vendors bundle monitoring software with their cards, you don't have general access to such a feature (some [most?] cards don't even have a temp. sensor on the GPU).
Heh, it would be fun to have on benchmarks though. Not just seeing how many FPS it can pull during tests, but how hot it's getting :P
[ ThumbView: Adds thumbnail support for DDS, PCX, TGA and 16 other imagetypes for Windows XP Explorer. ] [ Chocolate peanuts: Brazilian recipe for home made chocolate covered peanuts. Pure coding pleasure. ]
I'm sure it's possible through some ugly hacks, but what use would you have for it?
"I never let schooling interfere with my education" - Mark Twain
Quote:Original post by Rocket05
I'm sure it's possible through some ugly hacks, but what use would you have for it?

It's not possible without vendor-specific libraries, which do not exist for every card and model (as I already mentioned, a lot of cards simply don't have thermal sensors).

McZonk: You could, however, contact your card manufacturer (NVidia or ATI) for information about that. Maybe they are kind enough to provide you with some pointers.


No, I'm sure it's possible somehow through reverse engineering.

One way I could think of doing it is to run one of the video card diagnostic programs that come with most high-end cards, the kind that can read GPU temp, clock speed, bandwidth, etc., and find the location in memory where that diagnostic program stores the GPU temp. Ugly solution, but it would work.
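Just to illustrate the idea, here is a rough sketch of that memory-peeking hack. The process ID and the address of the temperature value are made-up placeholders: you'd have to find both yourself (e.g. with a memory scanner), and they could change with every version of the vendor's tool.

#include <windows.h>
#include <cstdio>

int main()
{
    // Placeholders: the PID of the vendor's diagnostic tool and the address
    // where it keeps the temperature would have to be found by hand.
    DWORD pid = 1234;
    LPCVOID tempAddress = (LPCVOID)0x00A3F2C0;

    HANDLE process = OpenProcess(PROCESS_VM_READ, FALSE, pid);
    if (!process)
    {
        printf("Couldn't open the diagnostic tool's process.\n");
        return 1;
    }

    int temperature = 0;
    SIZE_T bytesRead = 0;
    if (ReadProcessMemory(process, tempAddress, &temperature,
                          sizeof(temperature), &bytesRead))
    {
        printf("GPU temperature (allegedly): %d C\n", temperature);
    }

    CloseHandle(process);
    return 0;
}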

If you were a real genius, you could reverse engineer the diagnostic program and nab the code it uses to read the GPU temp, but I really don't think you want to go through that much trouble.
"I never let schooling interfere with my education" - Mark Twain
Quote:Original post by Rocket05
No, I'm sure it's possible somehow through reverse engineering.


Not if:
1) The GPU doesn't have a heat sensor on it
2) The card never transmits the heat levels, but only uses a sensor to adjust the fan speed onboard, or to shut off in the case of overheating
3) You can't/don't want to create and install a driver (perhaps because you don't want to void your warranty), assuming the card driver doesn't expose an API for reading it.

But that's all card specific.

OP: It will be possible for some cards, but not for others - check out any vendor specs you can find on their websites, and check Google.
As an idea... you might want to check out the DirectX Driver Development Kit (+forums/newsgroups). Not sure if it's restricted access - but if anyone knows if/how this is possible, the driver writers will.

hth
Jack

Jack Hoxley [ Forum FAQ | Revised FAQ | MVP Profile | Developer Journal ]

Quote:Original post by Rocket05
If you were a real genius, you could reverse engineer the diagnostic program and nab the code it uses to read the GPU temp, but I really don't think you want to go through that much trouble.


ExtEscape() is generally used for passing stuff like this between driver and user-mode apps - the tricky part is figuring out exactly what values and data structures you need to send. There are only a few 'standard' driver escapes - anything else, like those for figuring out GPU temps, is completely up to the vendors to define any way they like. There is also no guarantee that the escape will be the same between different cards from the same company - it may well get expanded or changed to add new functionality in later driver releases.
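A call would look roughly like the sketch below - but note that the escape code (ESCAPE_QUERY_GPU_TEMP) and the reply structure are invented placeholders, since the real ones are vendor-specific and undocumented. Only QUERYESCSUPPORT is a standard escape.

#include <windows.h>
#include <cstdio>

// Hypothetical vendor-defined escape code and reply layout (placeholders).
#define ESCAPE_QUERY_GPU_TEMP 0x7001

struct GpuTempReply
{
    DWORD temperatureCelsius;
};

int main()
{
    HDC hdc = GetDC(NULL);  // DC for the primary display

    // First ask the driver whether it understands this escape at all.
    DWORD query = ESCAPE_QUERY_GPU_TEMP;
    if (ExtEscape(hdc, QUERYESCSUPPORT, sizeof(query),
                  (LPCSTR)&query, 0, NULL) <= 0)
    {
        printf("Driver doesn't support this escape.\n");
        ReleaseDC(NULL, hdc);
        return 1;
    }

    GpuTempReply reply = { 0 };
    if (ExtEscape(hdc, ESCAPE_QUERY_GPU_TEMP, 0, NULL,
                  sizeof(reply), (LPSTR)&reply) > 0)
    {
        printf("GPU temperature: %lu C\n", reply.temperatureCelsius);
    }

    ReleaseDC(NULL, hdc);
    return 0;
}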

If you were really interested in getting hold of this, you could create a proxy version of gdi32.dll that logs ExtEscape() calls and use that data to figure out the parameters being used and what they mean.
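A sketch of the logging half of such a proxy might look like this, assuming the function is exported under the name "ExtEscape" via a .def file; forwarding the hundreds of other gdi32 exports is left out entirely.

#include <windows.h>
#include <cstdio>

typedef int (WINAPI *ExtEscapeFn)(HDC, int, int, LPCSTR, int, LPSTR);

// Exported as "ExtEscape" through the .def file, so the application ends up
// calling this instead of the genuine gdi32 function.
extern "C"
int WINAPI Proxy_ExtEscape(HDC hdc, int nEscape, int cbInput, LPCSTR lpInData,
                           int cbOutput, LPSTR lpOutData)
{
    static ExtEscapeFn realExtEscape = NULL;
    if (!realExtEscape)
    {
        // Load the real gdi32.dll from the system directory, not our proxy.
        char path[MAX_PATH];
        GetSystemDirectoryA(path, MAX_PATH);
        lstrcatA(path, "\\gdi32.dll");
        realExtEscape = (ExtEscapeFn)GetProcAddress(LoadLibraryA(path),
                                                    "ExtEscape");
    }

    // Log the escape code and buffer sizes so the vendor-specific values
    // can be worked out afterwards.
    FILE* log = fopen("extescape.log", "a");
    if (log)
    {
        fprintf(log, "ExtEscape: code=%d cbInput=%d cbOutput=%d\n",
                nEscape, cbInput, cbOutput);
        fclose(log);
    }

    // Hand the call on to the real implementation.
    return realExtEscape(hdc, nEscape, cbInput, lpInData, cbOutput, lpOutData);
}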

All of this does of course assume that the graphics card you're using - and the driver that is running it - even supports temperature monitoring in the first place.
Using WMI is an alternative way to get at stuff exposed by the driver. As mentioned, exactly which stuff is exposed depends entirely on the driver in question.
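For instance, a minimal sketch that reads the generic ACPI thermal zones through WMI (MSAcpi_ThermalZoneTemperature in root\WMI). Note this is not GPU-specific - whether a GPU temperature shows up anywhere in WMI at all depends on the driver, and this class may not be populated on every machine.

#define _WIN32_DCOM
#include <wbemidl.h>
#include <comdef.h>
#include <cstdio>
#pragma comment(lib, "wbemuuid.lib")

int main()
{
    CoInitializeEx(NULL, COINIT_MULTITHREADED);
    CoInitializeSecurity(NULL, -1, NULL, NULL, RPC_C_AUTHN_LEVEL_DEFAULT,
                         RPC_C_IMP_LEVEL_IMPERSONATE, NULL, EOAC_NONE, NULL);

    // Connect to the root\WMI namespace.
    IWbemLocator* locator = NULL;
    CoCreateInstance(CLSID_WbemLocator, NULL, CLSCTX_INPROC_SERVER,
                     IID_IWbemLocator, (void**)&locator);

    IWbemServices* services = NULL;
    locator->ConnectServer(_bstr_t(L"ROOT\\WMI"), NULL, NULL, NULL, 0,
                           NULL, NULL, &services);

    // Enumerate the ACPI thermal zones (not GPU-specific).
    IEnumWbemClassObject* results = NULL;
    services->ExecQuery(_bstr_t(L"WQL"),
                        _bstr_t(L"SELECT * FROM MSAcpi_ThermalZoneTemperature"),
                        WBEM_FLAG_FORWARD_ONLY, NULL, &results);

    IWbemClassObject* obj = NULL;
    ULONG returned = 0;
    while (results && results->Next(WBEM_INFINITE, 1, &obj, &returned) == S_OK)
    {
        VARIANT value;
        obj->Get(L"CurrentTemperature", 0, &value, NULL, NULL);
        // The value is reported in tenths of a Kelvin.
        printf("Thermal zone: %.1f C\n", value.uintVal / 10.0 - 273.15);
        VariantClear(&value);
        obj->Release();
    }

    if (results)  results->Release();
    if (services) services->Release();
    if (locator)  locator->Release();
    CoUninitialize();
    return 0;
}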

Simon O'Connor | Technical Director (Newcastle) Lockwood Publishing | LinkedIn | Personal site

This topic is closed to new replies.
