Halsafar

Linux Performance Timer


Okay, I'm using:
- SDL
- OpenGL
- Linux (specifically Mandriva), so X11

What are my choices for a timer? I'd prefer a performance timer with nanosecond resolution, but honestly I do not know of any for Linux.

Thanks,
Halsafar

Edit: Oops, I think this maybe should have been in General... but it is for a game.

SDL supplies an SDL_GetTicks() function. It may not be as accurate or precise as other methods, but I'd recommend it if you're using SDL already, as it will certainly work cross platform, and should (hopefully) get decent results on platforms where decent results are possible. I guess I might not trust it for profiling, but for general purpose game time measurements, it should be sufficient.
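
For reference, a minimal sketch of timing a chunk of work with SDL_GetTicks(); the SDL_Delay(16) call is just a stand-in for a frame of work:

#include <SDL.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    if (SDL_Init(SDL_INIT_TIMER) != 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    Uint32 start = SDL_GetTicks();   /* milliseconds since SDL_Init() */
    SDL_Delay(16);                   /* stand-in for one frame of work */
    Uint32 elapsed = SDL_GetTicks() - start;

    printf("frame took %u ms\n", (unsigned)elapsed);

    SDL_Quit();
    return 0;
}

Resolution is milliseconds, so it is coarse next to the nanosecond timer asked for, but it works wherever SDL runs.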

The normal method is to use gettimeofday(), which gives microseconds and is accurate to 1 us on most Linux systems (all?).

If you want better than that, you'll have to use one of the other functions, which are less widely available.

1 us is usually accurate enough for a game. If it's not enough, you may have a design problem.

Mark
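
For future readers, a minimal sketch of timing with gettimeofday(), which is declared in <sys/time.h>; the now_seconds() helper is just an illustrative name:

#include <stdio.h>
#include <sys/time.h>

/* Current time in seconds, built from the tv_sec and tv_usec
 * fields of struct timeval. */
static double now_seconds(void)
{
    struct timeval tv;
    gettimeofday(&tv, NULL);
    return (double)tv.tv_sec + (double)tv.tv_usec / 1e6;
}

int main(void)
{
    double start = now_seconds();
    /* ... work to be measured ... */
    double elapsed = now_seconds() - start;
    printf("elapsed: %.6f s\n", elapsed);
    return 0;
}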

Thanks.
I believe for the project in question I'll just use SDL_GetTicks().
But for future reference, where is gettimeofday declared?

Quote:
The normal method is to use gettimeofday(), which gives microseconds and is accurate to 1 us on most Linux systems (all?).


Actually, in practice it is NOT that accurate. Although it is measured in microseconds, it comes through pretty chunky. I built a time-delta frequency graph for my last project to see how smooth the timer really was and was very surprised to find that the deltas come out at either zero, 7, 16, or 17 milliseconds. In fact, I took a screenshot of the graph it made:

[screenshot of the time-delta frequency graph not preserved]

The pink line represents the raw time deltas over time; you can see how they spike at regular intervals. The yellow line is the frequency graph. Theoretically, it *should* be a nice bell curve, but it separates into the discrete "chunks" mentioned above for some reason.

Lesson: gettimeofday() is not as accurate as you might think, regardless of the units it's measured in. SDL_GetTicks() probably yields the same precision, with the added bonus of being cross-platform.
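
If you want to reproduce that kind of measurement, here is a rough sketch that histograms back-to-back gettimeofday() deltas (a simpler variant of the frame-delta graph described above; the sample and bucket counts are arbitrary):

#include <stdio.h>
#include <sys/time.h>

/* Sample gettimeofday() back-to-back and histogram the observed
 * deltas (one bucket per microsecond) to see how "chunky" the
 * timer really is on a given machine. */
int main(void)
{
    enum { SAMPLES = 100000, BUCKETS = 64 };
    static unsigned hist[BUCKETS];
    struct timeval prev, cur;
    int i;

    gettimeofday(&prev, NULL);
    for (i = 0; i < SAMPLES; i++) {
        long delta_us;
        gettimeofday(&cur, NULL);
        delta_us = (cur.tv_sec - prev.tv_sec) * 1000000L
                 + (cur.tv_usec - prev.tv_usec);
        prev = cur;
        if (delta_us >= 0 && delta_us < BUCKETS)
            hist[delta_us]++;
    }

    for (i = 0; i < BUCKETS; i++)
        if (hist[i])
            printf("%2d us: %u samples\n", i, hist[i]);
    return 0;
}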

The most accurate timer available on MOST CPUs is RDTSC. However, on some CPUs in some configurations (SpeedStep, etc.), the rate of RDTSC will vary as the CPU's clock frequency changes with load.

RDTSC is an assembly instruction that can easily be coded as inline assembly, or built as a linkable .o using an assembler.
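
For example, a minimal sketch of reading the counter with GCC-style inline assembly on x86; note it returns raw CPU cycles, not seconds, and is subject to the frequency-scaling caveat above:

#include <stdint.h>
#include <stdio.h>

/* Read the CPU's time-stamp counter with the RDTSC instruction
 * (x86/x86-64). RDTSC returns the 64-bit counter split across
 * the EDX:EAX registers. */
static inline uint64_t rdtsc(void)
{
    uint32_t lo, hi;
    __asm__ __volatile__ ("rdtsc" : "=a"(lo), "=d"(hi));
    return ((uint64_t)hi << 32) | lo;
}

int main(void)
{
    uint64_t start = rdtsc();
    /* ... work to be measured ... */
    uint64_t cycles = rdtsc() - start;
    printf("elapsed cycles: %llu\n", (unsigned long long)cycles);
    return 0;
}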

There was a good link here on GameDev about smoothing your timer out a bit; it also had some discussion of timers.

Halsafar, I personally would stick with gettimeofday() or timeGetTime(). They seem to work under all computer setups, whereas just about anything with better resolution may not work on all computers.
