# timer problem


## Recommended Posts

Hi, I have a problem with my timer. I use time.h and clock() so that my program works on both Windows and Linux. My problem is that the clock seems to be updated only every 1/60 of a second (I used the do/while loop below to find this value). Why? How can I solve this? Is it a bug in Windows, the CPU, or Visual 2005?
```cpp
clock_t lastClock = clock();
clock_t currClock = lastClock;

double currTime    = 0.0;
float  elapsedTime = 0.0f;

while (!bWantToExit)
{
    // busy-wait until clock() reports a new value
    do {
        currClock = clock();
    } while (currClock == lastClock);

    currTime    = (double)currClock / CLOCKS_PER_SEC;
    elapsedTime = (float)((double)(currClock - lastClock) / CLOCKS_PER_SEC);

    Render();

    lastClock = currClock;
}
```

Thanks.

##### Share on other sites
Sounds to me like your Render() call causes a v-sync. If you're using D3D, you want to make sure that you're using D3DPRESENT_INTERVAL_IMMEDIATE rather than D3DPRESENT_INTERVAL_DEFAULT. I've no idea what the OpenGL equivalent is.

Failing that, what is the value of CLOCKS_PER_SEC?

##### Share on other sites
I don't think it comes from v-sync.
I use OpenGL.
I get 75 fps with v-sync on (wglSwapIntervalEXT(1)) and up to 800 fps with it off (wglSwapIntervalEXT(0)). I tried different scenes with different fps, and I measured with Fraps in both debug and release mode.
clock() is updated every 1/60 (or 1/64) of a second, but my rendering does not depend on it.

CLOCKS_PER_SEC = 1000

##### Share on other sites
Actually, now I think about it, 60 Hz is a perfectly reasonable speed for clock() to update at. On Windows, it uses GetTickCount() internally, which has a granularity of about 15 ms. 1000/15 ≈ 67 updates per second, which is roughly what you're seeing.

clock() isn't a high resolution timer. If you need one, you'll need to use a platform specific one.
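For reference, here is a minimal sketch of a portable higher-resolution timer. It uses std::chrono::steady_clock, which did not exist yet in Visual 2005 (back then you would have used QueryPerformanceCounter on Windows and gettimeofday/clock_gettime on Linux), so treat it as the modern equivalent rather than what was available at the time:

```cpp
#include <chrono>

// Elapsed wall-clock seconds since `start`, using a monotonic clock whose
// resolution is typically well under a millisecond (versus ~15 ms for
// clock() backed by GetTickCount() on Windows).
double elapsedSeconds(std::chrono::steady_clock::time_point start)
{
    return std::chrono::duration<double>(
        std::chrono::steady_clock::now() - start).count();
}

// Usage inside the render loop (Render() as in the original code):
//   auto frameStart = std::chrono::steady_clock::now();
//   Render();
//   float elapsedTime = (float)elapsedSeconds(frameStart);
```

Because steady_clock is monotonic, the delta can never go negative, and there is no need for the busy-wait loop from the original code.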

##### Share on other sites
I have already used clock() in other old projects with Visual C++ 6, and I did not have this problem.
I have just converted my program to Dev-C++ and I have the same problem :(

##### Share on other sites
I have the same problem whether I compile my program with Dev-C++ or Visual C++.
But I do not have this problem when I try other programs which use clock().

I think it comes from my program, but I don't know why.

My source code can be found here:
http://texel3d.free.fr/bugs/Win32_GLViewer_Alpha.zip
The code of my loop is in Win32_GLWindow.cpp.

I will try to find the source of my problem, but if anyone has ideas...
I don't understand why clock() "bugs" only in my program. Maybe it's a stupid problem I have not seen.

##### Share on other sites
How are you using clock()? Sitting in a while() loop waiting for it to change won't give you a change of anything less than the timer granularity (~15 ms on Windows).
clock() is probably implemented on top of GetTickCount() under Dev-C++ too; it is running on Windows after all, and it needs to get the time from somewhere.
If other apps use it but average the results over several frames, then they'll work fine. What other apps are you trying?

##### Share on other sites
Ok, I understand my mistake now.
The other program I used was also written by me. In that program I compute FPS once per second: the elapsed time of every frame is added to a running total. So the problem was not visible there; a delta of 0 could be added to the total elapsed time, and that is how I could appear to get more than 1000 fps.
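For anyone hitting the same confusion, the scheme described above (accumulate per-frame elapsed times, report once per second) can be sketched like this. The names are illustrative, not taken from the original code:

```cpp
// Per-second FPS counter. Individual clock() deltas may be 0 because of the
// ~15 ms granularity, but the total accumulated over a full second still
// yields a sensible average.
struct FpsCounter {
    double accumulated = 0.0; // seconds since the last report
    int    frames      = 0;

    // Feed one frame's elapsed time.
    // Returns the average FPS once a second has passed, otherwise -1.
    double addFrame(double elapsedSec)
    {
        accumulated += elapsedSec;
        ++frames;
        if (accumulated >= 1.0) {
            double fps = frames / accumulated;
            accumulated = 0.0;
            frames = 0;
            return fps;
        }
        return -1.0;
    }
};
```

Note that this averaging only hides the coarse timer; any single frame's elapsed time is still only accurate to the granularity of clock().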

Sorry for this big big big mistake.
Thanks.
