[Solved] C++ timer library and Compute Shaders


3 replies to this topic

#1 Martin Perry   Members   -  Reputation: 1054


Posted 21 March 2012 - 03:26 AM

Hi, I have a problem measuring the execution time of compute shaders.

I am using QueryPerformanceCounter (and clock() as a cross-check). Both give roughly the same result in milliseconds, but if I time it with an actual stopwatch I get a totally different number.
The C++ timer says 10 ms, but that's rubbish: the application (in fact the whole computer) freezes for about 30 seconds while the shader executes.

How can I measure this ?

Thanks


#2 Hodgman   Moderators   -  Reputation: 27690


Posted 21 March 2012 - 04:41 AM

I'm guessing that you're measuring the amount of time it takes for the CPU to submit your workload, not the amount of time it takes the GPU to actually execute the work?
Perhaps you should post some code or describe your testing methodology.

#3 Martin Perry   Members   -  Reputation: 1054


Posted 21 March 2012 - 06:54 AM

Version with clock:

#include <ctime>

clock_t initial = clock();

// Note: Dispatch() only submits the work; it returns before the GPU runs it.
this->deviceContext->Dispatch(this->group.x, this->group.y, this->group.z);

clock_t current = clock();

// clock() counts ticks, not milliseconds; convert via CLOCKS_PER_SEC.
float diffMs = 1000.0f * (current - initial) / CLOCKS_PER_SEC;
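
(For illustration: one way to make a CPU-side timer like this meaningful is to block until the GPU has actually finished the dispatched work, e.g. with a D3D11 event query. A minimal sketch follows; the `device`, `context`, and `gx`/`gy`/`gz` names are assumptions standing in for the application's own objects, not code from this thread.)

#include <windows.h>
#include <d3d11.h>

// Assumed: `device`/`context` are the app's ID3D11Device / ID3D11DeviceContext,
// gx/gy/gz are the thread-group counts.
D3D11_QUERY_DESC desc = {};
desc.Query = D3D11_QUERY_EVENT;
ID3D11Query* done = nullptr;
device->CreateQuery(&desc, &done);

LARGE_INTEGER freq, t0, t1;
QueryPerformanceFrequency(&freq);
QueryPerformanceCounter(&t0);

context->Dispatch(gx, gy, gz);
context->End(done);   // mark a point in the command stream after the dispatch
context->Flush();     // push the work to the GPU now

// Busy-wait until the GPU reports that all commands before End() have finished.
BOOL finished = FALSE;
while (context->GetData(done, &finished, sizeof(finished), 0) != S_OK || !finished) {}

QueryPerformanceCounter(&t1);
double ms = 1000.0 * double(t1.QuadPart - t0.QuadPart) / double(freq.QuadPart);
done->Release();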


#4 Martin Perry   Members   -  Reputation: 1054


Posted 21 March 2012 - 09:22 AM

OK... I solved it. I used DX query profiling.
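
(For reference: "DX query profiling" presumably means D3D11 timestamp queries, the standard way to measure GPU execution time. A minimal sketch of that approach follows; the `device`/`context`/group-count parameters are illustrative assumptions, not the poster's actual code.)

#include <d3d11.h>

// Sketch: time a single Dispatch on the GPU with timestamp queries.
double MeasureDispatchMs(ID3D11Device* device, ID3D11DeviceContext* context,
                         UINT gx, UINT gy, UINT gz)
{
    D3D11_QUERY_DESC disjointDesc = {};
    disjointDesc.Query = D3D11_QUERY_TIMESTAMP_DISJOINT;
    ID3D11Query* disjoint = nullptr;
    device->CreateQuery(&disjointDesc, &disjoint);

    D3D11_QUERY_DESC tsDesc = {};
    tsDesc.Query = D3D11_QUERY_TIMESTAMP;
    ID3D11Query* tsBegin = nullptr;
    ID3D11Query* tsEnd = nullptr;
    device->CreateQuery(&tsDesc, &tsBegin);
    device->CreateQuery(&tsDesc, &tsEnd);

    context->Begin(disjoint);
    context->End(tsBegin);                 // GPU timestamp before the work
    context->Dispatch(gx, gy, gz);
    context->End(tsEnd);                   // GPU timestamp after the work
    context->End(disjoint);

    // Spin until the results are available (fine for offline profiling).
    D3D11_QUERY_DATA_TIMESTAMP_DISJOINT dj;
    while (context->GetData(disjoint, &dj, sizeof(dj), 0) != S_OK) {}

    UINT64 t0 = 0, t1 = 0;
    context->GetData(tsBegin, &t0, sizeof(t0), 0);
    context->GetData(tsEnd, &t1, sizeof(t1), 0);

    disjoint->Release();
    tsBegin->Release();
    tsEnd->Release();

    // If the counter was disjoint (e.g. the GPU clock changed mid-measurement),
    // the interval is unreliable.
    if (dj.Disjoint) return -1.0;
    return double(t1 - t0) / double(dj.Frequency) * 1000.0;
}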






