donguow

Processing time at vertex shader


Hi guys,

At the moment, I aim to estimate the time it takes to perform vertex processing. Say, I have a graphics rendering pipeline as follows:

Vertices ---> [ Vertex Shader ] ---> [ Geometry Shader ] ---> ... ---> [ Fragment Shader ] ---> Frame Buffer ---> Display

Now, I just want to estimate the average processing time in the vertex shader. Is there any possible way to do that?

Thanks in advance,
-D

The vertex shader is a program. The time it needs depends entirely on that program, so it is not possible to give a general answer. There is also the complication that the time the main application spends waiting on the vertex shader may not be the same as the time the vertex shader actually needs, since some operations run in parallel. Not only that, but the vertex shader itself can execute in parallel, processing more than one vertex at a time. This depends on the hardware.

Another complication is that a "real" application usually has many vertex shaders, used for different things. And maybe it uses deferred shading, in which case you run two different vertex shaders one after the other (where the second pass is quick).

You mention geometry shading; are you sure you are going to use that?

There are ways to actually measure the time the GPU needs, and that is by using timer queries.
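For example, with timer queries (ARB_timer_query, core since OpenGL 3.3) you can bracket a draw call with a GL_TIME_ELAPSED query and read back the GPU time in nanoseconds. A minimal sketch, assuming a 3.3+ context is current, the 3.3 entry points are loaded, and drawScene() stands in for your own draw call:

```c
#include <GL/gl.h>

/* Hypothetical placeholder for your own draw call. */
extern void drawScene(void);

/* Returns the GPU time spent on one drawScene() call, in nanoseconds. */
GLuint64 timeDrawCall(void)
{
    GLuint query;
    GLuint64 elapsed_ns = 0;

    glGenQueries(1, &query);

    glBeginQuery(GL_TIME_ELAPSED, query);
    drawScene();
    glEndQuery(GL_TIME_ELAPSED);

    /* This blocks until the GPU has finished the bracketed work;
       poll GL_QUERY_RESULT_AVAILABLE instead if you want to avoid
       stalling the pipeline. */
    glGetQueryObjectui64v(query, GL_QUERY_RESULT, &elapsed_ns);

    glDeleteQueries(1, &query);
    return elapsed_ns;
}
```

Note this measures the whole draw call (all active stages), not the vertex shader in isolation; isolating one stage takes the tricks discussed below.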

Please say what you need the information for; there may be ways to help you optimize.

Thanks for your response, Larspensjo. Here is my case:

Given a vertex shader program and, say, 1000 vertices, I want to know how long it takes the vertex shader to process those 1000 vertices. Is there any way to put a timer in the middle of the pipeline to measure the processing time?

Thanks,
-D

In my case, I am only taking the vertex shader into account at first; I mentioned the geometry shader just because it is part of the rendering pipeline. There is nothing to do with it at the moment. I know there are ways to measure the processing time of the GPU, or in other words, of the entire rendering pipeline. Now I just want to measure the processing time consumed by one part of it.

For a short summary on how to measure performance, see Measuring graphics performance.

If you want to measure only the vertex shader, there is a trick to disable the fragment shader: enabling face culling and calling glCullFace(GL_FRONT_AND_BACK) will cull all triangles (but not primitives like points and lines). This would of course be something you only do for testing.
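In code, the trick looks roughly like this; a sketch, where drawScene() again stands in for your own draw call, and the original culling state should really be saved and restored rather than assumed:

```c
#include <GL/gl.h>

/* Hypothetical placeholder for your own draw call. */
extern void drawScene(void);

/* Test-only: cull every triangle so no fragments are generated
   and the fragment shader never runs, leaving (roughly) the
   vertex-processing cost. */
void drawWithoutFragmentWork(void)
{
    glEnable(GL_CULL_FACE);
    glCullFace(GL_FRONT_AND_BACK);

    drawScene();

    /* Restore the usual state (assuming back-face culling
       was off before). */
    glCullFace(GL_BACK);
    glDisable(GL_CULL_FACE);
}
```

Combined with a GL_TIME_ELAPSED query around drawScene(), this gives an estimate of the pre-rasterization cost, with the caveats about driver optimizations raised later in the thread.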

You will never be able to track that. But also, there is no dedicated vertex shader anymore:
Read about stream processors.
http://www.alteredgamer.com/pc-gaming-tech/984-painting-the-scene-graphics-cards-explained/

Well, that is slightly misleading; there are still vertex shaders, there just hasn't been any dedicated hardware for them for some time now.

If you really want to get performance data then you'll need to break out specific tools to do the timing, be it NVIDIA's, AMD's, or Intel's graphics debugging/analysis programs, or any extensions which let you profile the pipeline.

Simply turning on culling and trying to time things that way is going to give you some results; I just wouldn't trust them too much, personally.

Thank you guys. My current approach is to disable the rasterization stage (glEnable(GL_RASTERIZER_DISCARD_NV)) using transform feedback mode. But I think some inaccuracy occurs, as the time it takes to copy data to the transform feedback buffer is significant.
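For reference, the core-profile version of that approach (GL_RASTERIZER_DISCARD, core since OpenGL 3.0) looks roughly like this. A sketch, assuming the transform feedback varyings and a GL_TRANSFORM_FEEDBACK_BUFFER binding were already set up, with drawScene() again a placeholder issuing a point-primitive draw:

```c
#include <GL/gl.h>

/* Hypothetical placeholder for a draw call that issues GL_POINTS. */
extern void drawScene(void);

/* Run the vertex stage only: rasterization and everything after it
   are skipped, while transform feedback captures the vertex-shader
   outputs. */
void drawVertexStageOnly(void)
{
    glEnable(GL_RASTERIZER_DISCARD);

    /* The primitive mode here must match what drawScene() draws. */
    glBeginTransformFeedback(GL_POINTS);
    drawScene();
    glEndTransformFeedback();

    glDisable(GL_RASTERIZER_DISCARD);
}
```

The buffer writes that transform feedback performs are exactly the overhead mentioned above, so a timer query around this function measures vertex shading plus the feedback copy, not vertex shading alone.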

-D
