
# Timing seconds per frame in C++

Old topic!

Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

8 replies to this topic

### #1 mynameisnafe (Members)

Posted 02 February 2013 - 04:40 AM

Hi guys,

I did a Google search for "C++ seconds per frame" and was given a million links to frames-per-second stuff instead.

So far I've almost got things on screen; before I get to any animation I'd like to implement a clock that can tell me how many seconds (a decimal such as 0.0321 will be fine) have elapsed since the last frame.

This figure will be used in curve fitting math, so I'd like it to be accurate.

Is seconds per frame the inverse of FPS?

Can anybody help? I've read a few things about C++ not providing a clock, since timing is dependent on the OS?

Thanks!

Posted 02 February 2013 - 04:45 AM

> Is seconds per frame the inverse of FPS?

Yes.

### #3 Zaoshi Kaba (Members)

Posted 02 February 2013 - 08:00 AM

> Is seconds per frame the inverse of FPS?

Roughly yes, but that will be very inaccurate.

> Can anybody help? I've read a few things about C++ not providing a clock, since timing is dependent on the OS?

C++11 is supposed to provide a high-resolution timer, but I've heard there are some problems (see http://en.cppreference.com/w/cpp/chrono/high_resolution_clock ).

If you only need a timer on Windows, see http://msdn.microsoft.com/en-us/library/windows/desktop/ms644900(v=vs.85).aspx#high_resolution
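For illustration, here is a minimal per-frame delta-time sketch using the C++11 `<chrono>` header (the function name is mine; `steady_clock` is used rather than `high_resolution_clock` because it is guaranteed never to jump backwards):

```cpp
#include <chrono>

// Returns the seconds elapsed since the previous call (0 on the first call).
double SecondsSinceLastFrame()
{
    using Clock = std::chrono::steady_clock;
    static Clock::time_point previous = Clock::now();

    const Clock::time_point current = Clock::now();
    const std::chrono::duration<double> elapsed = current - previous;
    previous = current;
    return elapsed.count(); // e.g. 0.0321 at roughly 31 FPS
}
```

Calling this once at the top of the frame loop yields exactly the decimal-seconds figure the OP asked for.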

### #4 0r0d (Members)

Posted 02 February 2013 - 08:07 AM

Look at the documentation for the platform (SDK) you are working on; it should provide functions for high-precision timing. For example, on Windows you can use QueryPerformanceFrequency() and QueryPerformanceCounter() to get sub-microsecond precision.

http://msdn.microsoft.com/en-us/library/windows/desktop/ms644904.aspx

http://msdn.microsoft.com/en-us/library/windows/desktop/ms644905.aspx

On iOS you can use mach_absolute_time() and mach_timebase_info().

### #5 noitarenev (Members)

Posted 02 February 2013 - 08:48 AM

Here's a function that I have been using on Linux with GLUT.

```cpp
int   frameCount   = 0;
int   currentTime  = 0;
int   previousTime = 0;
float fps          = 0.0f;

// ...

void GetFPS(void)
{
    frameCount++;
    currentTime = glutGet(GLUT_ELAPSED_TIME); // milliseconds since glutInit()

    int timeInterval = currentTime - previousTime;
    if (timeInterval > 1000)
    {
        fps = frameCount / (timeInterval / 1000.0f);
        previousTime = currentTime;
        frameCount = 0;
    }
}
```
I call the GetFPS function after the model gets animated, but _before_ calling the display function (drawing a frame).

And some further reading, if you're interested: http://fabiensanglard.net/timer_and_framerate/index.php

Edited by Indloon, 02 February 2013 - 08:52 AM.


### #6 Servant of the Lord (Members)

Posted 02 February 2013 - 02:26 PM

You don't usually measure in whole seconds per frame, because one frame takes much less than a second. Usually you measure in milliseconds or microseconds per frame.

Here's the code I use (it's cross-platform, with millisecond accuracy):

```cpp
#include <ctime>

typedef float Seconds;
typedef int   Milliseconds;

// The time, in seconds, since the program started. (1.5f = one and a half seconds)
Seconds RunTimeInSeconds()
{
    // CPU "ticks" since the program started.
    clock_t programTickCount = clock();

    // Convert from ticks to seconds.
    float seconds = float(programTickCount) / CLOCKS_PER_SEC;

    return seconds;
}

// Same as above, but in milliseconds as an integer. (1500 = one and a half seconds)
Milliseconds RunTime()
{
    // CPU "ticks" since the program started.
    clock_t programTickCount = clock();

    // Conversion rate between ticks and milliseconds.
    float msMultiplier = 1000.0f / CLOCKS_PER_SEC;

    // Convert from ticks to milliseconds.
    int milliseconds = int(programTickCount * msMultiplier);

    return milliseconds;
}
```

Using microseconds gives even more accuracy, but there are no standard microsecond timers in older C++, so you have to use third-party or OS-specific timers. The exception is the new C++11 standard's <chrono> library, which has a high-precision clock that may be microsecond or better, but falls back to milliseconds if finer precision isn't available.
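As a sketch of that <chrono> approach (the function name is mine, and `steady_clock` stands in for the high-precision clock):

```cpp
#include <chrono>
#include <cstdint>

// Microseconds elapsed since the first call to this function.
std::int64_t RunTimeMicroseconds()
{
    using Clock = std::chrono::steady_clock;
    static const Clock::time_point start = Clock::now();

    // duration_cast truncates the tick count to whole microseconds.
    return std::chrono::duration_cast<std::chrono::microseconds>(
        Clock::now() - start).count();
}
```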


Posted 02 February 2013 - 03:21 PM

Note that Windows normally only increments the standard-precision timers (like timeGetTime(), GetTickCount() and clock()) every 15 ms or so. That granularity is on the order of a whole frame, so it's not going to give you perfectly smooth animation. To fix that, either use timeBeginPeriod(1) and timeEndPeriod(1), or use QueryPerformanceCounter(), which has other issues on some hardware but is much more precise than 1 ms.

Note that timeBeginPeriod() adjusts a system-wide setting, so you may end up with higher precision without asking, because some other program requested it.

### #8 larspensjo (Members)

Posted 02 February 2013 - 04:02 PM

If you are using OpenGL with GLFW, there is glfwGetTime(), which returns the time as a double.

I am using a simple measurement utility, WorstTime, that collects data and regularly reports the status.


### #9 Trienco (Members)

Posted 03 February 2013 - 02:13 AM

> Using microseconds gives even more accuracy, but there are no standard microsecond timers in older C++, so you have to use third-party or OS-specific timers. The exception is the new C++11 standard's <chrono> library, which has a high-precision clock that may be microsecond or better, but falls back to milliseconds if finer precision isn't available.

If you are using VS 2012, you should use the Boost implementation of the chrono library instead (unless Microsoft has updated theirs in the meantime). Why? I basically picture the conversation like this:

Marketing: "We already put chrono support on the website."

Devs: "But we can't make it in time!"

Marketing: "We don't give a damn."

Devs: "Let's base all clocks on the standard timer, pretend we implemented it and just deal with the bug reports later."

Marketing: "See, was that so hard?"

I'd definitely use some chrono implementation, though. It's the new standard, it's platform-independent, and you don't have to worry about the potential pitfalls of using QueryPerformanceCounter() on multi-core CPUs.

