Linux Performance Timer

6 comments, last by KulSeran 18 years, 5 months ago
Okay, I'm using:
- SDL
- OpenGL
- Linux (specifically Mandriva), so X11

What are my choices for a timer? I'd prefer a performance timer with nanosecond resolution, but honestly I don't know of any for Linux.

Thanks,
Halsafar

Edit: Oops, I think this maybe should have gone in General... but it is for a game.
SDL supplies an SDL_GetTicks() function. It may not be as accurate or precise as other methods, but I'd recommend it if you're using SDL already, as it will certainly work cross platform, and should (hopefully) get decent results on platforms where decent results are possible. I guess I might not trust it for profiling, but for general purpose game time measurements, it should be sufficient.
-bodisiw
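For reference, here is a minimal sketch of how SDL_GetTicks() is typically used for per-frame delta timing; the loop body is just a placeholder, not part of anyone's actual project.

#include <SDL.h>

int main(int, char**)
{
    SDL_Init(SDL_INIT_TIMER);

    Uint32 last = SDL_GetTicks();          // milliseconds since SDL was initialized
    bool running = true;
    while (running)
    {
        Uint32 now = SDL_GetTicks();
        float dt = (now - last) / 1000.0f; // delta time in seconds
        last = now;

        // ... poll events, update game state with dt, render ...
        running = false;                   // placeholder so the sketch terminates
    }

    SDL_Quit();
    return 0;
}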
The normal method is using gettimeofday, which gives microseconds and is accurate to 1us on most Linux systems (All?)

If you want better than that you'll have to use one of the other functions, which are less widely available.

1us is usually accurate enough for a game. If it's not enough, you may have a design problem.

Mark
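A minimal sketch of timing an interval with gettimeofday(); subtracting the two timeval structs into microseconds is the usual pattern.

#include <sys/time.h>
#include <cstdio>

int main()
{
    timeval start, end;
    gettimeofday(&start, 0);

    // ... work to be timed ...

    gettimeofday(&end, 0);
    long usec = (end.tv_sec - start.tv_sec) * 1000000L
              + (end.tv_usec - start.tv_usec);
    std::printf("elapsed: %ld us\n", usec);
    return 0;
}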
Thanks.
I believe for the project in question I'll just use SDL_GetTicks().
But for future reference, where is gettimeofday declared?
A quick Google search suggests it lives in sys/time.h.
Quote:The normal method is using gettimeofday, which gives microseconds and is accurate to 1us on most Linux systems (All?)


Actually, in practice it is NOT that accurate. Although it is measured in microseconds, it comes through pretty chunky. I built a time-delta frequency graph for my last project to see how smooth the timer really was, and was very surprised to find the deltas come out as either 0, 7, 16, or 17 milliseconds. In fact, I took a screenshot of the graph it made:
[screenshot: raw time deltas over time and their frequency distribution]
The pink line represents the raw time deltas over time. You can see how they spike at regular intervals. The yellow line is a frequency graph. Theoretically, it *should* be a nice bell curve, but it separates into the discrete "chunks" mentioned above for some reason.

Lesson: gettimeofday() is not as accurate as you think, regardless of the units it's measured in. SDL_GetTicks() probably yields the same precision, with the added bonus of being cross-platform.
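For anyone who wants to repeat that experiment, here is a rough sketch of bucketing per-frame gettimeofday() deltas into a millisecond histogram; the frame count and bucket range are arbitrary choices for illustration.

#include <sys/time.h>
#include <cstdio>

int main()
{
    const int kFrames  = 10000;
    const int kBuckets = 64;              // deltas of 0..63 ms
    int histogram[kBuckets] = {0};

    timeval prev, now;
    gettimeofday(&prev, 0);

    for (int i = 0; i < kFrames; ++i)
    {
        // ... render/update one frame here ...

        gettimeofday(&now, 0);
        long usec = (now.tv_sec - prev.tv_sec) * 1000000L
                  + (now.tv_usec - prev.tv_usec);
        prev = now;

        int ms = static_cast<int>(usec / 1000);
        if (ms >= 0 && ms < kBuckets)
            ++histogram[ms];
    }

    for (int ms = 0; ms < kBuckets; ++ms)
        if (histogram[ms] > 0)
            std::printf("%2d ms: %d frames\n", ms, histogram[ms]);
    return 0;
}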
The most accurate timer available on MOST CPUs is RDTSC. However, on some CPUs in some configurations (SpeedStep etc) the frequency of RDTSC will vary as the CPU load varies.

RDTSC is an assembly instruction which can easily be coded as inline assembly, or built as a linkable .o using an assembler.
enum Bool { True, False, FileNotFound };
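A sketch of wrapping RDTSC as GCC-style inline assembly. Note the result is in CPU cycles, so it has to be calibrated against the CPU frequency to get seconds, and it inherits the SpeedStep caveat mentioned above.

#include <cstdio>

// RDTSC returns the 64-bit time-stamp counter in EDX:EAX.
static inline unsigned long long rdtsc()
{
    unsigned int lo, hi;
    __asm__ __volatile__("rdtsc" : "=a"(lo), "=d"(hi));
    return ((unsigned long long)hi << 32) | lo;
}

int main()
{
    unsigned long long start = rdtsc();
    // ... work to be timed ...
    unsigned long long end = rdtsc();
    std::printf("elapsed cycles: %llu\n", end - start);
    return 0;
}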
There was a good link here on GameDev about making your timer smooth out a bit (a rough sketch of the idea is at the end of this post). It also has some discussion of timers.

Halsafar, I personally would stick with gettimeofday() or timeGetTime(). They seem to work under all computer setups, whereas just about anything with better resolution may not work on all computers.
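A rough sketch of the smoothing idea mentioned above: average the last few raw deltas so one "chunky" 16 ms sample doesn't cause a visible hitch. The window size of 8 is an arbitrary choice for illustration.

class SmoothedTimer
{
public:
    SmoothedTimer() : index_(0), count_(0) {}

    // Feed in the raw per-frame delta; get back the average of the
    // last kWindow samples.
    float smooth(float rawDelta)
    {
        history_[index_] = rawDelta;
        index_ = (index_ + 1) % kWindow;
        if (count_ < kWindow) ++count_;

        float sum = 0.0f;
        for (unsigned i = 0; i < count_; ++i)
            sum += history_[i];
        return sum / count_;
    }

private:
    static const unsigned kWindow = 8;   // arbitrary window size
    float history_[kWindow];
    unsigned index_;
    unsigned count_;
};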
