SDL_GetTicks question

Say I call SDL_GetTicks twice in my int main(), once at the beginning and once at the end, with a fair bit of code in between. Will the value returned by SDL_GetTicks be different at the end than at the beginning?
It depends on how much time it takes the CPU to get through that "fair bit of code".

SDL_GetTicks() returns the time, in milliseconds (1/1000 of a second), since SDL was initialized (for practical purposes, since your program started). If your computer can get through your code in less than one millisecond, SDL_GetTicks() will spit out the same number both times. And yes, your CPU can run that fast.

And remember, the time it takes for the CPU to run your code is not measured by how many lines of code you have, but by the type of operations it is doing. For example, finding the square root of a number is dramatically more time consuming (but still so fast that you wouldn't notice unless you were doing it hundreds of times) than squaring a number.
So what you consider a "fair bit of code", because it takes up pages and pages of screen space, or hundreds of lines, the CPU might consider trivial, because it is all simple operations it can run through with ease.
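To make that concrete, here is a minimal sketch. The loop is just arbitrary busy-work standing in for your "fair bit of code"; it isn't anything from your actual program.

#include <SDL.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    if (SDL_Init(SDL_INIT_TIMER) != 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    Uint32 start = SDL_GetTicks();

    /* Stand-in for a "fair bit of code": busy-work that the compiler
       is unlikely to optimize away because sum is volatile. */
    volatile double sum = 0.0;
    for (int i = 0; i < 1000000; ++i) {
        sum += (double)i * 0.5;
    }

    Uint32 end = SDL_GetTicks();

    /* If the loop finishes in under 1 ms, start and end can be identical. */
    printf("elapsed: %u ms (sum = %f)\n", end - start, (double)sum);

    SDL_Quit();
    return 0;
}

On a modern machine that loop may well finish in under a millisecond, in which case "elapsed" prints 0.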
Thank you very much. I completely understand what you mean by the CPU being able to compute and operate very quickly. By "fair bit of code" I mean something that would take about, say, 10 milliseconds. This is all hypothetical; I just needed to know if the ticks would be different.
Yes, since SDL_GetTicks returns time measured in milliseconds, if you spend 10 milliseconds doing something between two calls to SDL_GetTicks, then the second call should return a value about 10 milliseconds greater.
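As a rough sketch of that hypothetical case, you could use SDL_Delay as a stand-in for the 10 milliseconds of work (the exact number printed will wobble a little, since SDL_Delay only promises to wait at least that long and the OS scheduler adds jitter):

#include <SDL.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    SDL_Init(SDL_INIT_TIMER);

    Uint32 before = SDL_GetTicks();
    SDL_Delay(10);                  /* stand-in for ~10 ms of real work */
    Uint32 after = SDL_GetTicks();

    /* Usually prints 10 or 11, depending on scheduler granularity. */
    printf("difference: %u ms\n", after - before);

    SDL_Quit();
    return 0;
}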

However, if something in your code absolutely depends upon that being true, you are probably making a mistake that may come back to bite you later. For example, what if someone has a CPU that is 20 times faster than yours, so what takes 10 milliseconds for you only takes 0.5 milliseconds for them? Will your program crash, or mess up somehow? Or, if your (theoretical) 10 milliseconds is spent loading files, what if one of your users (or even yourself, 2-3 years down the road) has a solid-state drive, and file loading is lightning fast? Will it cause problems for your program? If so, you aren't programming in a safe manner.

Programmers should avoid assuming something will always be true, or will be true on other PCs, as it leads to hard-to-find "once in a blue moon" bugs that only affect some of your users, seemingly at random, and make it hard to identify why only those users are affected. As programmers, we can't even assume there are 60 seconds in a minute.

This doesn't mean you have to be uber-paranoid, but it does mean that if you find yourself asking, "Will this always be true?", you should take the extra precaution of adding code to handle the situation where it's not true... even if you think it won't ever happen.
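For instance, a common precaution in a game loop is to measure the elapsed time every frame and clamp it, instead of assuming it will always be some fixed value. This is only a sketch; frame_delta_ms and the 100 ms cap are made-up names and values for illustration:

#include <SDL.h>

/* Arbitrary example cap, not an SDL value: treat anything longer than
   100 ms as 100 ms so a debugger pause or a slow disk doesn't feed a
   giant time step into the simulation. */
#define MAX_DELTA_MS 100

static Uint32 last_ticks = 0;

Uint32 frame_delta_ms(void)
{
    Uint32 now = SDL_GetTicks();
    Uint32 delta = now - last_ticks;   /* unsigned math also survives the ~49-day wrap-around */
    last_ticks = now;

    if (delta > MAX_DELTA_MS) {
        delta = MAX_DELTA_MS;          /* clamp spikes */
    }
    return delta;                      /* may legitimately be 0 on a very fast machine */
}

Clamping like this means a machine that is "too fast" just reports a delta of 0 or 1, and a machine that stalls (debugger break, slow disk, laptop waking from sleep) doesn't blow up your simulation with one enormous step.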
Thanks, I will add some failsafes into the code.

