How many clock ticks does it take?

0 comments, last by hpolloni 20 years, 6 months ago
I'm making a little test of the different methods of calculating the sqrt of a float, and I want to know how many clock ticks it takes to execute a given algorithm. I was using the time command on Linux, but the differences in time between the algorithms are really small. I hope you can help me; I don't know if this is the right forum, though.
A simple method is to repeat the operation and find the average time. Instead of measuring how long it takes to do one, you measure how long it takes to do a thousand or a million. The time will be basically y = mx + b, where x is how many iterations you performed, m is the cost per iteration, and b is the overhead for looping. That isn't true for sorting 10 records versus 100, because sorting isn't a linear function, but it should be roughly true for sorting 10 records once versus sorting 10 records 100 times, assuming the same records in the same starting order each time.

Generally I try to get whatever I'm measuring up to at least a few seconds, then repeat the tests. The results will vary, and how they vary gives you some idea of how reliable your test results are. Also check your performance meter before you start. Try to get the CPU utilization down as low as possible, and be aware of any periodic spikes that you are unable to eliminate. Basically, know your test environment.
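As a concrete illustration of this repeat-and-average approach, here is a minimal C sketch (my own, not from the original post; the iteration count, sqrtf, and the use of clock() are just illustrative choices) that times many sqrtf calls and reports the average cost per call:

#include <math.h>
#include <stdio.h>
#include <time.h>

int main(void)
{
    const long iterations = 10000000L;  /* repeat enough times to last a few seconds */
    volatile float sink = 0.0f;         /* volatile so the compiler can't drop the loop */

    clock_t start = clock();
    for (long i = 0; i < iterations; ++i)
        sink += sqrtf((float)i);        /* the operation being measured */
    clock_t end = clock();

    double seconds = (double)(end - start) / CLOCKS_PER_SEC;
    printf("%ld calls: %ld clock ticks, %.3f s, ~%g ns per call (sink=%f)\n",
           iterations, (long)(end - start), seconds,
           seconds / iterations * 1e9, sink);
    return 0;
}

Compile with something like gcc -O2 bench.c -lm and run it on a machine that is otherwise idle. If you really want raw CPU cycles rather than clock() ticks, you could read the x86 time-stamp counter instead, but averaging over millions of iterations is usually accurate enough to compare algorithms.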
Keys to success: Ability, ambition and opportunity.

