Execution speed of timers

Started by ludvig, 1 comment, last by ludvig 21 years, 6 months ago
Hi, I have performed some tests to see how fast various timers execute (QueryPerformanceCounter, timeGetTime, GetTickCount). In all tests, and on various systems, QueryPerformanceCounter has the worst performance. I'm getting the following results:

QueryPerformanceCounter: roughly 3 microseconds
timeGetTime: roughly 2 microseconds
GetTickCount: roughly 2 microseconds

I have read some material on the net, and I have only found tests saying that timeGetTime should have the worst performance, but no code to back it up. What I'm doing is using QueryPerformanceCounter to time all three:

***
QueryPerformanceCounter(&start);
QueryPerformanceCounter(&dummy);
QueryPerformanceCounter(&stop);

QueryPerformanceCounter(&start);
dummyMs = timeGetTime();
QueryPerformanceCounter(&stop);

QueryPerformanceCounter(&start);
dummyMs = GetTickCount();
QueryPerformanceCounter(&stop);
***

I then calculate the microseconds with the formula:

((stop - start) / ticks_per_second) * 1000 * 1000

Is this the right way? Furthermore, if anyone has an idea of how to measure the accuracy of each timer, I will be even more happy ;-)

TIA
Ludvig
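For reference, here is a minimal self-contained sketch of the measurement described above. The variable names and the printing are assumptions, not the poster's exact code; timeGetTime() requires linking against winmm.lib:

***
#include <windows.h>
#include <mmsystem.h> // timeGetTime(); link with winmm.lib
#include <cstdio>

int main()
{
    LARGE_INTEGER freq, start, stop, dummyTicks;
    QueryPerformanceFrequency(&freq); // ticks per second

    // Cost of one QueryPerformanceCounter call.
    QueryPerformanceCounter(&start);
    QueryPerformanceCounter(&dummyTicks);
    QueryPerformanceCounter(&stop);
    double qpcMicros = (double)(stop.QuadPart - start.QuadPart)
                       / (double)freq.QuadPart * 1000.0 * 1000.0;

    // Cost of one timeGetTime call; volatile keeps the result from
    // being optimized away.
    QueryPerformanceCounter(&start);
    volatile DWORD ms = timeGetTime();
    QueryPerformanceCounter(&stop);
    double tgtMicros = (double)(stop.QuadPart - start.QuadPart)
                       / (double)freq.QuadPart * 1000.0 * 1000.0;

    // Cost of one GetTickCount call.
    QueryPerformanceCounter(&start);
    volatile DWORD ticks = GetTickCount();
    QueryPerformanceCounter(&stop);
    double gtcMicros = (double)(stop.QuadPart - start.QuadPart)
                       / (double)freq.QuadPart * 1000.0 * 1000.0;

    printf("QueryPerformanceCounter: %.3f us\n", qpcMicros);
    printf("timeGetTime:             %.3f us\n", tgtMicros);
    printf("GetTickCount:            %.3f us\n", gtcMicros);
    return 0;
}
***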
Try QueryPerformanceFrequency for the performance timer; it should be quite reliable. All the docs on the net about it can't be very wrong either, so read them through carefully. Also, try to leave a long delay between the individual timings; that will give much better results.
Hi,

I'm actually using QueryPerformanceFrequency() to calculate the execution speed, and each test is run 100,000 times in a loop, with the average used as the result.
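A minimal sketch of that averaging loop, timing QueryPerformanceCounter itself (the iteration count and variable names are assumptions, not the poster's exact code):

***
#include <windows.h>
#include <cstdio>

int main()
{
    const int kRuns = 100000;
    LARGE_INTEGER freq, start, stop;
    QueryPerformanceFrequency(&freq); // ticks per second

    long long totalTicks = 0;
    for (int i = 0; i < kRuns; ++i)
    {
        LARGE_INTEGER dummy;
        QueryPerformanceCounter(&start);
        QueryPerformanceCounter(&dummy); // the call being timed
        QueryPerformanceCounter(&stop);
        // Note: the interval also includes part of the overhead of
        // the surrounding start/stop calls themselves.
        totalTicks += stop.QuadPart - start.QuadPart;
    }

    // Average cost per call, converted from ticks to microseconds.
    double avgMicros = (double)totalTicks / kRuns
                       / (double)freq.QuadPart * 1000.0 * 1000.0;
    printf("QueryPerformanceCounter: ~%.3f microseconds per call\n", avgMicros);
    return 0;
}
***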

This topic is closed to new replies.
