I'm writing a tool to measure latency in 3D applications (mostly games).
My definition of latency is the time between:
1. The first D3D command of frame N (Direct3D 9, 10, or 11)
2. The actual presentation time on the GPU (imagine fullscreen mode: the time of the actual flip)
I'd prefer to stay out of the kernel and use only user-mode code.
Using queries I can get (see the sketch after this list):
a. The CPU time of the first D3D command
b. The GPU time of the present event
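For reference, this is roughly how I'm collecting (a) and (b) with D3D11 timestamp queries. `g_disjointQuery`, `g_presentQuery`, and the `BeginFrame`/`EndFrame` hooks are just placeholder names from my test harness:

```cpp
#include <d3d11.h>
#include <windows.h>

ID3D11Query* g_disjointQuery = nullptr; // validates the GPU clock for the frame
ID3D11Query* g_presentQuery  = nullptr; // GPU timestamp issued next to Present
LARGE_INTEGER g_frameStartCpu;          // (a) CPU time of the first D3D command

void CreateQueries(ID3D11Device* device)
{
    D3D11_QUERY_DESC disjointDesc = { D3D11_QUERY_TIMESTAMP_DISJOINT, 0 };
    D3D11_QUERY_DESC stampDesc    = { D3D11_QUERY_TIMESTAMP, 0 };
    device->CreateQuery(&disjointDesc, &g_disjointQuery);
    device->CreateQuery(&stampDesc,    &g_presentQuery);
}

void BeginFrame(ID3D11DeviceContext* ctx)
{
    QueryPerformanceCounter(&g_frameStartCpu); // just before the first draw/clear
    ctx->Begin(g_disjointQuery);               // timestamps are only valid inside this
}

void EndFrame(ID3D11DeviceContext* ctx, IDXGISwapChain* swapChain)
{
    swapChain->Present(1, 0);
    ctx->End(g_presentQuery);  // (b) GPU stamp taken right after Present in the stream
    ctx->End(g_disjointQuery);
}

// Poll a frame or two later so the readback doesn't stall the pipeline.
bool ReadGpuPresentTicks(ID3D11DeviceContext* ctx, UINT64* ticks, UINT64* freq)
{
    D3D11_QUERY_DATA_TIMESTAMP_DISJOINT dj;
    if (ctx->GetData(g_disjointQuery, &dj, sizeof(dj), 0) != S_OK || dj.Disjoint)
        return false; // GPU clock changed mid-frame, discard the sample
    if (ctx->GetData(g_presentQuery, ticks, sizeof(UINT64), 0) != S_OK)
        return false;
    *freq = dj.Frequency; // GPU timestamp frequency in Hz
    return true;
}
```

(I'm aware the stamp after `Present` marks when the GPU reaches the present in the command stream, which is only an approximation of the flip itself.)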
My question is: how do I calculate the delta between those two timestamps? They obviously come from two different processors, each with its own clock.
Note: assume the clock frequencies don't change in the middle of the test.
All I can think of is some initial synchronization phase, but it sounds a bit messy.
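To make the idea concrete, here is a rough sketch of that synchronization phase, again assuming D3D11 (`ClockSync` and `Calibrate` are just names I made up): with an idle GPU, issue a timestamp query bracketed by two `QueryPerformanceCounter` reads, spin until the result is ready, and keep the sample with the tightest CPU-side bracket. The GPU stamp has to land somewhere inside that bracket, so taking the midpoint bounds the error by half the bracket width:

```cpp
#include <d3d11.h>
#include <windows.h>

struct ClockSync
{
    double cpuAtGpuZero; // CPU time (seconds) corresponding to GPU tick 0
    double gpuPeriod;    // seconds per GPU tick
};

bool Calibrate(ID3D11Device* dev, ID3D11DeviceContext* ctx, ClockSync* out)
{
    D3D11_QUERY_DESC disjointDesc = { D3D11_QUERY_TIMESTAMP_DISJOINT, 0 };
    D3D11_QUERY_DESC stampDesc    = { D3D11_QUERY_TIMESTAMP, 0 };
    ID3D11Query *dq = nullptr, *tq = nullptr;
    dev->CreateQuery(&disjointDesc, &dq);
    dev->CreateQuery(&stampDesc,    &tq);

    LARGE_INTEGER qpf;
    QueryPerformanceFrequency(&qpf);
    double bestWindow = 1e9; // tightest CPU bracket seen so far, in seconds

    for (int i = 0; i < 100; ++i) // many trials, keep only the best one
    {
        ctx->Begin(dq);
        LARGE_INTEGER before;
        QueryPerformanceCounter(&before);
        ctx->End(tq); // GPU executes this sometime after submission...
        ctx->End(dq);
        ctx->Flush();

        UINT64 gpuTicks;
        while (ctx->GetData(tq, &gpuTicks, sizeof(gpuTicks), 0) != S_OK) {}
        LARGE_INTEGER after;
        QueryPerformanceCounter(&after); // ...and before this read returned

        D3D11_QUERY_DATA_TIMESTAMP_DISJOINT dj;
        while (ctx->GetData(dq, &dj, sizeof(dj), 0) != S_OK) {}
        if (dj.Disjoint)
            continue; // clock was unreliable during this trial

        double cpuBefore = before.QuadPart / (double)qpf.QuadPart;
        double cpuAfter  = after.QuadPart  / (double)qpf.QuadPart;
        double window    = cpuAfter - cpuBefore;
        if (window < bestWindow)
        {
            bestWindow        = window;
            out->gpuPeriod    = 1.0 / (double)dj.Frequency;
            // Assume the GPU stamp landed mid-bracket; error <= window / 2.
            out->cpuAtGpuZero = 0.5 * (cpuBefore + cpuAfter)
                              - gpuTicks * out->gpuPeriod;
        }
    }
    dq->Release();
    tq->Release();
    return bestWindow < 1e9; // false if every trial came back disjoint
}

// Then for any GPU stamp:
//   cpuTimeOfPresent = sync.cpuAtGpuZero + presentTicks * sync.gpuPeriod;
//   latency          = cpuTimeOfPresent - frameStartCpuSeconds;
```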
Any ideas will be highly appreciated =)