

BiGF00T

runtime of functions



How can I measure the time needed to run a certain function? Should I just read the starting time, loop the function 100,000,000 times, and then look at the time difference, or is there a better method? Sometimes I wonder whether a function would be faster if I changed it. Is there another way to determine whether functionA() is faster than functionB()?

Thanks, BiGF00T

Are these functions in a program you are writing? Or are they just algorithms you want to time?

If it's the former, then you'll want to use a tool called a profiler; it can give you data such as how many times a function is called, the average time it takes to run, etc. If it's the latter, then you need a function that can return the current time (this may be the actual time, the time since the system started, or the time since the program started; it doesn't matter which). You call this time function at the beginning and end of the function you are timing, save the results somewhere, and the difference between the two is how long the function took.
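For example, here's a minimal sketch of that start/end approach in standard C, using clock() as the time function (the post doesn't name a specific one, so this is just one portable option):

#include <stdio.h>
#include <time.h>

int main(void)
{
    //read the clock at the beginning
    clock_t start = clock();

    //... run the function you want to time here ...

    //read the clock at the end; the difference is the runtime
    clock_t end = clock();
    printf("took %f seconds\n", (double)(end - start) / CLOCKS_PER_SEC);
    return 0;
}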

Thanks for the reply.
I'm not thinking about anything specific at the moment; I was just curious. Sometimes I've wondered whether I should rewrite a function or an algorithm because it seemed too slow. Next time I'll check which one is faster and go with that one.
Any suggestions for a profiler?

It depends on what compiler you're using. I believe Visual Studio 6 comes with one (as long as you don't have the Learning Edition), while with VS.NET you have to use an external one. For GCC (which is MinGW on Windows; this is the compiler used by Dev-C++ and MinGW Studio, among others) you want to use gprof, which should be in your compiler's bin directory: compile and link with -pg, run your program to produce gmon.out, then run gprof on the executable. AMD CodeAnalyst (which does work with non-AMD CPUs) is one you could use with VS.NET, but IMO it isn't really that great. I can't think of any other free profilers for VS.NET at the moment, but a couple of Google searches will probably reveal more.

You can also do this for free if you use Windows :-)

GetTickCount() is a built-in Win32 function that returns the number of milliseconds since Windows started. Using that information, you can do this...


//include the windows header for GetTickCount()
#include <windows.h>
//include stdio.h for printf()
#include <stdio.h>

//this is basically what we're going to do:
//end_time - start_time = runtime

int main()
{
    //start the timer (GetTickCount() returns a DWORD, not an int)
    DWORD start_time = GetTickCount();

    //insert code you want to time here

    //stop the timer and save the result
    DWORD runtime = GetTickCount() - start_time;

    //print the runtime in milliseconds
    printf("runtime: %lu ms\n", (unsigned long)runtime);
    return 0;
}

[edited by - mrtuti_17 on June 5, 2004 10:15:31 AM]

quote:
Original post by mrtuti_17
You can also do this for free if you use Windows :-)

GetTickCount() is a built-in Win32 function that returns the number of milliseconds since Windows started. Using that information, you can do this...
[edited by - mrtuti_17 on June 5, 2004 10:15:31 AM]


Yeah, I was thinking about something like that. That will help me find the faster function or algorithm. Since I got VS.NET from my university for free, I won't switch to VC++ 6 just for the profiler; the GetTickCount() approach will have to do.

Thanks Monder, I think I'll go with the tick-count approach because I don't want to spend money on a good profiler. I don't really need one; it's just for fun, and expensive things aren't fun.

Probably one of the most accurate ways to time something:

Loop through x times:

1. Serialize instruction execution with CPUID.
2. Read the time stamp counter with RDTSC and store it somewhere.
3. Execute the function or code segment once.
4. Serialize instruction execution with CPUID.
5. Read the time stamp counter again and compute the difference from the start value.

Then, find the LOWEST delta time and that is your approximate running time. This way you won't be counting random interrupts. Note that this is not in milliseconds, so just use it for comparing times on the same machine. It is also affected by multitasking, so leave as few processes running as possible. Instead of RDTSC, you can also use QueryPerformanceCounter/QueryPerformanceFrequency, which has a much higher resolution than GetTickCount.
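A rough sketch of that loop, assuming a compiler that provides the MSVC-style __cpuid and __rdtsc intrinsics from <intrin.h> (functionA() and min_cycles() are hypothetical names, not from the original post):

#include <intrin.h>
#include <stdio.h>

//hypothetical stand-in for the code you want to time
void functionA(void);

unsigned __int64 min_cycles(void (*fn)(void), int trials)
{
    int cpu_info[4];
    unsigned __int64 best = (unsigned __int64)-1;
    for (int i = 0; i < trials; ++i)
    {
        //steps 1-2: serialize with CPUID, then read the time stamp counter
        __cpuid(cpu_info, 0);
        unsigned __int64 start = __rdtsc();

        //step 3: run the code once
        fn();

        //steps 4-5: serialize again, read the counter again, take the difference
        __cpuid(cpu_info, 0);
        unsigned __int64 delta = __rdtsc() - start;

        //keep the LOWEST delta so random interrupts don't get counted
        if (delta < best)
            best = delta;
    }
    return best;
}

Remember the result is in CPU cycles, not milliseconds, so only compare numbers taken on the same machine.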

There is one thing I forgot to mention about using GetTickCount().

Since Windows is not a real-time operating system, you want to run the code more than once. For example, if I wanted to test two sorting algorithms, bubble sort and quick sort, I would have each of them sort 100,000 random integers in an array.

When performing the test, I wouldn't do anything that would make the program run slower (opening programs, Photoshopping, etc.). I would test the sorts multiple times and average my results in a spreadsheet before drawing any conclusions.

Maybe using a profiler would be quicker, but I don't know anything about those, so I won't say anything more.
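For instance, here's a minimal sketch of the test-multiple-times-and-average idea, using std::sort as a stand-in for whichever sorting algorithm you want to measure:

#include <windows.h>
#include <algorithm>
#include <stdio.h>
#include <stdlib.h>

int main()
{
    const int N = 100000;  //number of random integers to sort
    const int TRIALS = 10; //number of timed runs to average

    int* data = new int[N];
    DWORD total = 0;

    for (int t = 0; t < TRIALS; ++t)
    {
        //refill the array with fresh random integers before each run
        for (int i = 0; i < N; ++i)
            data[i] = rand();

        DWORD start = GetTickCount();
        std::sort(data, data + N); //the algorithm being timed
        total += GetTickCount() - start;
    }

    printf("average: %lu ms over %d runs\n", (unsigned long)(total / TRIALS), TRIALS);
    delete[] data;
    return 0;
}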

