coderchris

OpenGL Accurately timing GPU render time?


Hello, I see that the DirectX docs list some queries (D3DQUERYTYPE_TIMESTAMP, D3DQUERYTYPE_PIPELINETIMINGS, etc.), though there isn't much of a description to go along with them. I assume that there are equivalent queries available for OpenGL. Can any of these be used to accurately time how long it takes the GPU to render a single frame? If not those, is there another way to measure this? Obviously a standard CPU timer can't be used because rendering is asynchronous... Any tips?
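For reference, this is roughly what I imagine the D3D9 side looking like, just going by the query types listed in the docs. Untested sketch; 'device' stands in for whatever IDirect3DDevice9* you render with:

#include <d3d9.h>

// Created once and reused every frame.
IDirect3DQuery9 *tsDisjoint, *tsBegin, *tsEnd, *tsFreq;
device->CreateQuery(D3DQUERYTYPE_TIMESTAMPDISJOINT, &tsDisjoint);
device->CreateQuery(D3DQUERYTYPE_TIMESTAMP, &tsBegin);
device->CreateQuery(D3DQUERYTYPE_TIMESTAMP, &tsEnd);
device->CreateQuery(D3DQUERYTYPE_TIMESTAMPFREQ, &tsFreq);

// Per frame: bracket the draw calls with timestamps.
tsDisjoint->Issue(D3DISSUE_BEGIN);
tsBegin->Issue(D3DISSUE_END);          // timestamp queries only take D3DISSUE_END
// ... render the frame ...
tsEnd->Issue(D3DISSUE_END);
tsFreq->Issue(D3DISSUE_END);
tsDisjoint->Issue(D3DISSUE_END);

// A frame or two later, poll for the results so the CPU doesn't stall on the GPU.
UINT64 begin = 0, end = 0, freq = 0;
BOOL   disjoint = TRUE;
if (tsDisjoint->GetData(&disjoint, sizeof(disjoint), 0) == S_OK &&
    tsBegin->GetData(&begin, sizeof(begin), 0) == S_OK &&
    tsEnd->GetData(&end, sizeof(end), 0) == S_OK &&
    tsFreq->GetData(&freq, sizeof(freq), 0) == S_OK &&
    !disjoint && freq != 0)
{
    double gpuMilliseconds = 1000.0 * double(end - begin) / double(freq);
}

The TIMESTAMPDISJOINT check is there because the counter frequency can change mid-frame (e.g. clock throttling), in which case the pair of timestamps isn't meaningful.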

Maybe. I haven't looked at the queries in depth. However, I suggest using PIX or NvPerfHUD for that type of statistic (and much more) instead.

Quote:
I assume that there are equivalent queries available for OpenGL.
http://developer.download.nvidia.com/opengl/specs/GL_EXT_timer_query.txt
As far as I know, it's only supported on NVIDIA cards at this time.
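To make that concrete, here's a minimal, untested sketch of timing a frame with GL_TIME_ELAPSED_EXT from that extension, assuming the tokens and entry points are pulled in through GLEW (or glext.h plus manual function loading):

#include <GL/glew.h>

GLuint timerQuery;
glGenQueries(1, &timerQuery);

// Bracket the frame's draw calls.
glBeginQuery(GL_TIME_ELAPSED_EXT, timerQuery);
// ... render the frame ...
glEndQuery(GL_TIME_ELAPSED_EXT);

// Poll until the result is ready; do this a frame or two later to avoid
// stalling the pipeline (rendering is asynchronous, as the OP points out).
GLint available = 0;
while (!available)
    glGetQueryObjectiv(timerQuery, GL_QUERY_RESULT_AVAILABLE, &available);

GLuint64EXT elapsedNs = 0;                      // result is in nanoseconds
glGetQueryObjectui64vEXT(timerQuery, GL_QUERY_RESULT, &elapsedNs);
double gpuMilliseconds = elapsedNs / 1.0e6;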
