Timing seconds per frame in C++


Hi guys,

I did a google search for "C++ seconds per frame" and was provided a million links to frames per second stuff.

I'm loading .3ds models.

So far I've almost got them on screen, however before I get to any animation stuff I'd like to implement a clock that can tell me how many seconds (a decimal such as 0.0321 will be fine) have elapsed since the last frame.

This figure will be used in curve fitting math, so I'd like it to be accurate.

Is seconds per frame the inverse of FPS?

Can anybody help? I've read a few things about C++ not providing a clock, since timing is dependent on the OS?

Thanks! :)


Is seconds per frame the inverse of FPS?

Yes.

Is seconds per frame the inverse of FPS?

Roughly yes, but going through an FPS figure will be very inaccurate: FPS is usually averaged over many frames (and often rounded to an integer), so 1/FPS only gives you an average frame time, not the length of the last frame. Measure the elapsed time between frames directly instead.

Can anybody help? I've read a few things about c++ not providing a clock as time is dependant on the OS?

C++11 is supposed to provide a high-resolution timer, but I think I heard there are some problems (see http://en.cppreference.com/w/cpp/chrono/high_resolution_clock ).
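For what it's worth, here's a minimal sketch of measuring the per-frame delta with <chrono>, assuming your compiler's implementation is usable (high_resolution_clock is just an alias for whichever clock in the library has the finest tick):

#include <chrono>

std::chrono::high_resolution_clock::time_point previousFrame =
    std::chrono::high_resolution_clock::now();

//Call once per frame; returns the seconds elapsed since the previous call.
double SecondsSinceLastFrame()
{
    std::chrono::high_resolution_clock::time_point now = std::chrono::high_resolution_clock::now();
    std::chrono::duration<double> delta = now - previousFrame; //duration in seconds, as a double
    previousFrame = now;
    return delta.count(); //e.g. 0.0321
}

The value returned by delta.count() is exactly the "seconds since the last frame" figure you'd feed into the curve fitting.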

If you need a timer only for Windows, see http://msdn.microsoft.com/en-us/library/windows/desktop/ms644900(v=vs.85).aspx#high_resolution

Look at the documentation for the platform (SDK) you are working on; it should provide some functions for high-precision timing. For example, on Windows you can use QueryPerformanceFrequency() and QueryPerformanceCounter() to get sub-microsecond precision.

http://msdn.microsoft.com/en-us/library/windows/desktop/ms644904.aspx

http://msdn.microsoft.com/en-us/library/windows/desktop/ms644905.aspx
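A minimal sketch of how those two calls fit together (QueryPerformanceFrequency() gives the ticks per second, QueryPerformanceCounter() the current tick count):

#include <windows.h>

//Seconds elapsed since the previous call, measured with the performance counter.
double QPCSecondsSinceLastFrame()
{
    static LARGE_INTEGER frequency = { 0 };
    static LARGE_INTEGER previous = { 0 };

    if( frequency.QuadPart == 0 )
    {
        QueryPerformanceFrequency( &frequency ); //ticks per second, fixed at boot
        QueryPerformanceCounter( &previous );
    }

    LARGE_INTEGER now;
    QueryPerformanceCounter( &now );

    double seconds = double( now.QuadPart - previous.QuadPart ) / double( frequency.QuadPart );
    previous = now;
    return seconds;
}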

In iOS you can use mach_absolute_time() and mach_timebase_info().
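Roughly like this, if I remember the Mach calls right (mach_timebase_info() gives the numerator/denominator needed to turn the raw ticks into nanoseconds):

#include <mach/mach_time.h>
#include <stdint.h>

//Seconds between two mach_absolute_time() readings.
double MachSecondsBetween( uint64_t startTicks, uint64_t endTicks )
{
    mach_timebase_info_data_t timebase;
    mach_timebase_info( &timebase ); //ticks -> nanoseconds conversion factors

    double nanoseconds = double( endTicks - startTicks ) * timebase.numer / timebase.denom;
    return nanoseconds * 1.0e-9;
}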

Here's a function that I have been using on Linux with GLUT.


int frameCount = 0;     //frames rendered since the last fps update
int currentTime = 0;    //milliseconds since glutInit()
int previousTime = 0;   //time of the last fps update, in milliseconds
float fps = 0;
 
...
 
//Call once per frame; recomputes fps roughly once per second.
void GetFPS( void )
{
    frameCount++;
    currentTime = glutGet( GLUT_ELAPSED_TIME );
    int timeInterval = currentTime - previousTime;
 
    if( timeInterval > 1000 )
    {
        //Average frames per second over the last interval (just over a second).
        fps = frameCount / ( timeInterval / 1000.0f );
        previousTime = currentTime;
        frameCount = 0;
    }
}
 
...
 

I call the GetFPS function after the model gets animated, but _BEFORE_ calling the display function (drawing a frame).
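If you want the seconds since the last frame (what the original question asks for) rather than an averaged FPS, the same GLUT_ELAPSED_TIME value can be read once per frame; a sketch, keeping in mind it only has millisecond granularity:

#include <GL/glut.h>

//Seconds since the previous call; call once per frame.
float GetDeltaSeconds( void )
{
    static int previousMs = 0;
    int currentMs = glutGet( GLUT_ELAPSED_TIME );        //milliseconds since glutInit()
    float delta = ( currentMs - previousMs ) / 1000.0f;  //convert ms to seconds
    previousMs = currentMs;
    return delta;
}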

And some further reading, if you're interested: http://fabiensanglard.net/timer_and_framerate/index.php


You don't measure in whole seconds per frame, because one frame takes less than a second. Usually you measure in milliseconds per frame or microseconds per frame.

Here's the code I use: (it's cross-platform, with millisecond accuracy)


#include <ctime>
 
//Simple aliases so the return types below read nicely.
typedef float Seconds;
typedef int Milliseconds;
 
//The time, in seconds, since the program started. (1.5f = one and a half seconds)
//Note: on POSIX systems clock() measures CPU time used by the process, not wall-clock time.
Seconds RunTimeInSeconds()
{
    //CPU "ticks" since the program started.
    clock_t programTickCount = clock();
 
    //Convert from ticks to seconds.
    float seconds = float(programTickCount) / CLOCKS_PER_SEC;
 
    return seconds;
}
 
//Same as above, but in milliseconds as an integer. (1500 = one and a half seconds)
Milliseconds RunTime()
{
    //CPU "ticks" since the program started.
    clock_t programTickCount = clock();
 
    //Conversion rate between ticks and milliseconds.
    float msMultiplier = 1000.0f / CLOCKS_PER_SEC;
 
    //Convert from ticks to milliseconds.
    int milliseconds = int(programTickCount * msMultiplier);
 
    return milliseconds;
}


Using microseconds gives even more accuracy; however, there are no standard microsecond timers in C++, so you have to use third-party or OS-specific timers, unless you are using the new C++11 standard's <chrono> library, which has a high-precision clock that might be microsecond or better, but falls back to a coarser resolution if microsecond precision isn't available.

Note that Windows normally only increments the standard precision timers (like timeGetTime(), GetTickCount() and clock()) every 15 ms or so. That can be less than a single frame, so it's not going to give you perfectly smooth animation. To fix that, either use timeBeginPeriod(1) and timeEndPeriod(), or use QueryPerformanceCounter(), which has other issues on some hardware but is much more precise than 1 ms.

Note that timeBeginPeriod() adjusts a system wide setting, so you may end up with higher precision without asking because some other program requested it.
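The pairing usually looks something like the sketch below (the same period value has to be passed to both calls, and you link against winmm.lib; RunGame() is just an illustrative wrapper):

#include <windows.h>
#include <mmsystem.h> //timeBeginPeriod / timeEndPeriod

void RunGame()
{
    timeBeginPeriod( 1 );  //request 1 ms timer resolution (system-wide)

    //... main loop using timeGetTime(), Sleep(), etc. ...

    timeEndPeriod( 1 );    //restore the previous resolution on the way out
}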

If you are using OpenGL and GLFW, then there is glfwGetTime(). It returns the time as a double, in seconds.
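A sketch of the per-frame delta with that (the header name assumes GLFW 3; glfwGetTime() counts seconds since glfwInit() unless you reset it with glfwSetTime()):

#include <GLFW/glfw3.h>

double previousTime = 0.0;

//Seconds since the previous call; call once per frame after glfwInit().
double GlfwSecondsSinceLastFrame()
{
    double currentTime = glfwGetTime(); //seconds since glfwInit(), as a double
    double delta = currentTime - previousTime;
    previousTime = currentTime;
    return delta;
}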

I am using a simple measurement utility that collects data and regularly reports the status: WorstTime.

Current project: Ephenation.
Sharing OpenGL experiences: http://ephenationopengl.blogspot.com/

Using microseconds gives even more accuracy; however, there are no standard microsecond timers in C++, so you have to use third-party or OS-specific timers, unless you are using the new C++11 standard's <chrono> library, which has a high-precision clock that might be microsecond or better, but falls back to a coarser resolution if microsecond precision isn't available.

If you are using VS 2012, you should use the boost implementation of the chrono library (unless they updated it in the meantime). Why? I basically picture the conversation like this:

Marketing: "We already put chrono support on the website."

Devs: "But we can't make it in time!"

Marketing: "We don't give a damn."

Devs: "Let's base all clocks on the standard timer, pretend we implemented it and just deal with the bug reports later."

Marketing: "See, was that so hard?"

I'd definitely use some chrono implementation, though. It's the new standard, it's platform independent and you don't have to worry about the potential pitfalls of using QueryPerformanceCounter on multi-core CPUs.
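One way to follow that advice without committing to either implementation is to hide the choice behind a namespace alias, so the frame-timing code compiles against Boost.Chrono or std::chrono unchanged; a sketch, where USE_BOOST_CHRONO is a made-up project-level switch:

#ifdef USE_BOOST_CHRONO //hypothetical define: pick Boost.Chrono on compilers with a weak std::chrono
#include <boost/chrono.hpp>
namespace frameclock = boost::chrono;
#else
#include <chrono>
namespace frameclock = std::chrono;
#endif

//Seconds since the previous call, on whichever chrono implementation was selected above.
double FrameDeltaSeconds()
{
    static frameclock::steady_clock::time_point previous = frameclock::steady_clock::now();
    frameclock::steady_clock::time_point now = frameclock::steady_clock::now();
    frameclock::duration<double> delta = now - previous;
    previous = now;
    return delta.count();
}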

f@dz http://festini.device-zero.de

