Timing seconds per frame in C++


8 replies to this topic

#1 mynameisnafe   Members   -  Reputation: 252


Posted 02 February 2013 - 04:40 AM

Hi guys,

 

I did a Google search for "C++ seconds per frame" and got a million links about frames per second instead.

 

I'm loading .3ds models.

So far I've almost got them on screen; however, before I get to any animation stuff I'd like to implement a clock that can tell me how many seconds (a decimal such as 0.0321 will be fine) have elapsed since the last frame.

 

This figure will be used in curve fitting math, so I'd like it to be accurate.

 

Is seconds per frame the inverse of FPS?

 

Can anybody help? I've read a few things about C++ not providing a clock, because timing is dependent on the OS?

 

Thanks! :)




#2 Adam_42   Crossbones+   -  Reputation: 2507


Posted 02 February 2013 - 04:45 AM

Is seconds per frame the inverse of FPS?

 

Yes.
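
For example, a steady 60 FPS works out to 1 / 60 ≈ 0.0167 seconds per frame. Just keep in mind that an FPS counter is usually an average over many frames, so inverting it only gives you an average frame time, not the duration of the last individual frame.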



#3 Zaoshi Kaba   Crossbones+   -  Reputation: 4353


Posted 02 February 2013 - 08:00 AM

Is seconds per frame the inverse of FPS?

Roughly yes, but that will be very inaccurate.

 

Can anybody help? I've read a few things about C++ not providing a clock, because timing is dependent on the OS?

C++11 is supposed to provide a high-resolution timer, but I think I heard there are some problems (see http://en.cppreference.com/w/cpp/chrono/high_resolution_clock ).

If you only need a timer on Windows, see http://msdn.microsoft.com/en-us/library/windows/desktop/ms644900(v=vs.85).aspx#high_resolution
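
For the <chrono> route, here's a minimal sketch of a per-frame delta in seconds (the function and global names are just illustrative):

#include <chrono>
 
std::chrono::high_resolution_clock::time_point g_lastFrame =
    std::chrono::high_resolution_clock::now();
 
// Seconds elapsed since the previous call, as a float (e.g. ~0.0167f at 60 FPS).
float SecondsSinceLastFrame()
{
    std::chrono::high_resolution_clock::time_point now =
        std::chrono::high_resolution_clock::now();
 
    std::chrono::duration<float> elapsed = now - g_lastFrame;
    g_lastFrame = now;
 
    return elapsed.count();
}

Call it once per frame and feed the result straight into your animation code.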



#4 0r0d   Members   -  Reputation: 819


Posted 02 February 2013 - 08:07 AM

Look at the documentation for the platform (SDK) you are working on; it should provide some functions for high-precision timing.  For example, on Windows you can use QueryPerformanceCounter() and QueryPerformanceFrequency() to get sub-microsecond precision.

 

http://msdn.microsoft.com/en-us/library/windows/desktop/ms644904.aspx

http://msdn.microsoft.com/en-us/library/windows/desktop/ms644905.aspx

 

On iOS you can use mach_absolute_time() and mach_timebase_info().
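
Here's a rough sketch of the Windows version, assuming you keep the previous counter value around between frames (the function name is just an example):

#include <windows.h>
 
// Seconds elapsed since 'previous'; updates 'previous' to the current counter value.
double SecondsSince( LARGE_INTEGER& previous )
{
    LARGE_INTEGER frequency, now;
    QueryPerformanceFrequency( &frequency );   // counts per second, fixed at boot
    QueryPerformanceCounter( &now );           // current tick count
 
    double seconds = double( now.QuadPart - previous.QuadPart ) / double( frequency.QuadPart );
    previous = now;
    return seconds;
}

Initialise 'previous' with one QueryPerformanceCounter() call before entering the render loop.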



#5 noitarenev   Members   -  Reputation: 224


Posted 02 February 2013 - 08:48 AM

Here's a function that I have been using on Linux with GLUT.

 

int frameCount = 0;      // frames rendered since the last FPS update
int currentTime = 0;     // milliseconds since glutInit(), from GLUT_ELAPSED_TIME
int previousTime = 0;    // time of the last FPS update, in milliseconds
float fps = 0;
 
...
 
void GetFPS( void )
{
    frameCount++;
    currentTime = glutGet( GLUT_ELAPSED_TIME );
    int timeInterval = currentTime - previousTime;
 
    // Recompute the average FPS roughly once per second.
    if( timeInterval > 1000 )
    {
        fps = frameCount / ( timeInterval / 1000.0f );
        previousTime = currentTime;
        frameCount = 0;
    }
}
 
...
 

 

I call the GetFPS function after the model gets animated, but _BEFORE_ calling the display function (drawing a frame).

 

And some further reading, if you're interested: http://fabiensanglard.net/timer_and_framerate/index.php
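
Since the question was about seconds per frame rather than FPS, here's a minimal sketch built on the same GLUT_ELAPSED_TIME counter (milliseconds since glutInit); it assumes GLUT is already initialised, and the function name is just an example:

#include <GL/glut.h>
 
// Seconds elapsed since the previous call (e.g. ~0.0167f at 60 FPS).
float SecondsPerFrame()
{
    static int previousMs = glutGet( GLUT_ELAPSED_TIME );
 
    int currentMs = glutGet( GLUT_ELAPSED_TIME );
    float seconds = ( currentMs - previousMs ) / 1000.0f;
    previousMs = currentMs;
 
    return seconds;
}

Keep in mind GLUT_ELAPSED_TIME only has millisecond resolution, so the result is quantised to whole milliseconds.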




#6 Servant of the Lord   Crossbones+   -  Reputation: 19564


Posted 02 February 2013 - 02:26 PM

You don't usually measure in seconds per frame, because one frame takes much less than a second. Usually you measure in milliseconds per frame or microseconds per frame.

 

Here's the code I use (it's cross-platform, with millisecond accuracy):

#include <ctime>
 
typedef float Seconds;      // fractional seconds
typedef int   Milliseconds; // whole milliseconds
 
//The time, in seconds, since the program started. (1.5f = one and a half seconds)
Seconds RunTimeInSeconds()
{
    //CPU "ticks" since the program started.
    clock_t programTickCount = clock();
 
    //Convert from ticks to seconds.
    float seconds = float(programTickCount) / CLOCKS_PER_SEC;
 
    return seconds;
}
 
//Same as above, but in milliseconds as an integer. (1500 = one and a half seconds)
Milliseconds RunTime()
{
    //CPU "ticks" since the program started.
    clock_t programTickCount = clock();
 
    //Conversion rate between ticks and milliseconds.
    float msMultiplier = 1000.0f / CLOCKS_PER_SEC;
 
    //Convert from ticks to milliseconds.
    int milliSeconds = int(programTickCount * msMultiplier);
 
    return milliSeconds;
}

 
Using microseconds gives even more accuracy; however, there are no standard microsecond timers in C++, so you have to use third-party or OS-specific timers, unless you are using the new C++11 standard's <chrono> library, which has a high-precision timer that might be microsecond or better, but falls back to milliseconds if microsecond precision isn't available.
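
If you want to see what resolution your standard library actually advertises for that clock, a quick standalone check (just a sketch) is:

#include <chrono>
#include <cstdio>
 
int main()
{
    typedef std::chrono::high_resolution_clock hr_clock;
 
    // 'period' is the tick length as a compile-time ratio of seconds,
    // e.g. 1 / 1000000000 for a nanosecond-resolution clock.
    std::printf( "tick period: %lld / %lld seconds\n",
                 (long long)hr_clock::period::num, (long long)hr_clock::period::den );
    std::printf( "is_steady: %d\n", (int)hr_clock::is_steady );
 
    return 0;
}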




#7 Adam_42   Crossbones+   -  Reputation: 2507


Posted 02 February 2013 - 03:21 PM

Note that Windows normally only increments the standard precision timers (like timeGetTime(), GetTickCount() and clock()) every 15 ms or so. That's a large fraction of an entire frame, so it's not going to give you perfectly smooth animation. To fix that, either use timeBeginPeriod(1) and timeEndPeriod(1), or use QueryPerformanceCounter(), which has other issues on some hardware but is much more precise than 1 ms.

 

Note that timeBeginPeriod() adjusts a system-wide setting, so you may end up with higher precision without asking for it, because some other program requested it.
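
A sketch of the timeBeginPeriod() route (Windows only, linking against winmm.lib), wrapped in a small RAII helper so the matching timeEndPeriod() call isn't forgotten; the struct name is just an example:

#include <windows.h>
#pragma comment(lib, "winmm.lib")   // timeBeginPeriod / timeEndPeriod live in winmm
 
// Requests 1 ms timer resolution for the lifetime of this object.
struct ScopedTimerResolution
{
    ScopedTimerResolution()  { timeBeginPeriod( 1 ); }
    ~ScopedTimerResolution() { timeEndPeriod( 1 ); }
};

Create one instance around your game loop and the standard timers should tick roughly every millisecond instead of every 15 ms.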



#8 larspensjo   Members   -  Reputation: 1540


Posted 02 February 2013 - 04:02 PM

If you are using OpenGL and GLFW, there is glfwGetTime(), which returns the time as a double.

 

I am using a simple measurement utility that collects data and regularly reports the status: WorstTime.
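
With glfwGetTime() the frame timing itself is very short. A minimal, self-contained sketch (using the GLFW 3 header; older versions use <GL/glfw.h>, and the loop here just stands in for your real render loop):

#include <GLFW/glfw3.h>
#include <cstdio>
 
int main()
{
    if ( !glfwInit() )
        return 1;
 
    double previous = glfwGetTime();              // seconds since glfwInit(), as a double
 
    for ( int frame = 0; frame < 10; ++frame )    // stand-in for your render loop
    {
        double now = glfwGetTime();
        double secondsPerFrame = now - previous;  // e.g. ~0.0167 at 60 FPS
        previous = now;
 
        std::printf( "dt = %f s\n", secondsPerFrame );
    }
 
    glfwTerminate();
    return 0;
}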



#9 Trienco   Crossbones+   -  Reputation: 2173


Posted 03 February 2013 - 02:13 AM

Using microseconds gives even more accuracy; however, there are no standard microsecond timers in C++, so you have to use third-party or OS-specific timers, unless you are using the new C++11 standard's <chrono> library, which has a high-precision timer that might be microsecond or better, but falls back to milliseconds if microsecond precision isn't available.

 

If you are using VS 2012, you should use the Boost implementation of the chrono library (unless they have updated it in the meantime). Why? I basically picture the conversation like this:

 

Marketing: "We already put chrono support on the website."

Devs: "But we can't make it in time!"

Marketing: "We don't give a damn."

Devs: "Let's base all clocks on the standard timer, pretend we implemented it and just deal with the bug reports later."

Marketing: "See, was that so hard?"

 

I'd definitely use some chrono implementation, though. It's the new standard, it's platform independent and you don't have to worry about the potential pitfalls of using QueryPerformanceCounter on multi-core CPUs.





