[Solved] C++ timer library and Compute Shaders

2 comments, last by Martin Perry 12 years, 1 month ago
Hi, I have a problem measuring the execution time of compute shaders.

I am using QueryPerformanceCounter (and clock() as a cross-check). Both give roughly the same result in milliseconds, but if I time it with an actual stopwatch I get a completely different number.
The C++ timers report about 10 ms, which is rubbish: the application (in fact the whole computer) freezes for about 30 seconds while the shader executes.

How can I measure this correctly?

Thanks
I'm guessing that you're measuring the amount of time it takes for the CPU to submit your workload, not the amount of time it takes the GPU to actually execute the work?
Perhaps you should post some code or describe your testing methodology.
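For what it's worth: Dispatch() only records the work into the command buffer and the GPU executes it later, so a CPU timer wrapped around the call mostly measures submission cost. One way to make a CPU-side timer cover the GPU work as well is to block on a D3D11_QUERY_EVENT before stopping the clock. This is a rough sketch, assuming an existing D3D11 device and immediate context; the function name and parameters are placeholders, not code from this thread:

#include <d3d11.h>
#include <windows.h>

// Rough sketch: time Dispatch from the CPU, but only stop the timer once the
// GPU has actually finished the queued work (signalled via an event query).
double MeasureDispatchMs(ID3D11Device* device, ID3D11DeviceContext* ctx,
                         UINT gx, UINT gy, UINT gz)
{
    D3D11_QUERY_DESC qd = {};
    qd.Query = D3D11_QUERY_EVENT;
    ID3D11Query* done = nullptr;
    device->CreateQuery(&qd, &done);

    LARGE_INTEGER freq, t0, t1;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&t0);

    ctx->Dispatch(gx, gy, gz);
    ctx->End(done);   // event is signalled when the GPU reaches this point
    ctx->Flush();     // submit the queued commands to the GPU now

    // GetData returns S_FALSE until the GPU has passed the event.
    while (ctx->GetData(done, nullptr, 0, 0) == S_FALSE) { /* busy-wait */ }

    QueryPerformanceCounter(&t1);
    done->Release();
    return 1000.0 * double(t1.QuadPart - t0.QuadPart) / double(freq.QuadPart);
}

Note this measures submission plus GPU execution plus synchronization overhead, so it is an upper bound rather than the pure shader time.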
Here is the version with clock():

#include <ctime>

clock_t initial = clock();
this->deviceContext->Dispatch(this->group.x, this->group.y, this->group.z);

clock_t current = clock();
// clock() returns ticks, so convert to milliseconds explicitly
float diff = 1000.0f * (current - initial) / CLOCKS_PER_SEC;
OK, I solved it: I used D3D query-based GPU profiling (timestamp queries).
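For anyone who finds this later: the query approach brackets the dispatch with D3D11_QUERY_TIMESTAMP queries and wraps them in a D3D11_QUERY_TIMESTAMP_DISJOINT query, which supplies the timer frequency and tells you whether the readings are usable. A rough sketch of how that can look (assuming an existing device and immediate context; the function name and parameters are placeholders, not my exact code):

#include <d3d11.h>

// Rough sketch: measure GPU execution time of a single Dispatch with
// D3D11 timestamp queries.
double MeasureDispatchGpuMs(ID3D11Device* device, ID3D11DeviceContext* ctx,
                            UINT gx, UINT gy, UINT gz)
{
    D3D11_QUERY_DESC dd = {}; dd.Query = D3D11_QUERY_TIMESTAMP_DISJOINT;
    D3D11_QUERY_DESC td = {}; td.Query = D3D11_QUERY_TIMESTAMP;

    ID3D11Query *disjoint = nullptr, *start = nullptr, *end = nullptr;
    device->CreateQuery(&dd, &disjoint);
    device->CreateQuery(&td, &start);
    device->CreateQuery(&td, &end);

    ctx->Begin(disjoint);
    ctx->End(start);                 // GPU timestamp before the dispatch
    ctx->Dispatch(gx, gy, gz);
    ctx->End(end);                   // GPU timestamp after the dispatch
    ctx->End(disjoint);

    // GetData (with flags 0) flushes and returns S_FALSE until results are ready.
    D3D11_QUERY_DATA_TIMESTAMP_DISJOINT dj = {};
    while (ctx->GetData(disjoint, &dj, sizeof(dj), 0) == S_FALSE) {}

    UINT64 t0 = 0, t1 = 0;
    while (ctx->GetData(start, &t0, sizeof(t0), 0) == S_FALSE) {}
    while (ctx->GetData(end, &t1, sizeof(t1), 0) == S_FALSE) {}

    disjoint->Release(); start->Release(); end->Release();

    if (dj.Disjoint) return -1.0;    // counters were unreliable; discard this sample
    return 1000.0 * double(t1 - t0) / double(dj.Frequency);
}

Unlike the CPU-side timers, this reports only the time the GPU spent executing the dispatch.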

This topic is closed to new replies.
