# SDL Timer Subsystem

## Recommended Posts

chbrules    170
I've been racking my brain over this; I can't find any good information on it anywhere! I'm trying to develop a sprite class with frame timing and all. I even have that Focus On SDL book, I looked up info on the API's site, and I can't find any examples anywhere! Can someone please explain how in the world to implement timers in your code? This callback junk is confusing me, I really don't know how to go about this. Some help please? =/ (C++)

Drew_Benton    1861
I just want it to be known I now HATE FireFox. I just had a nice long post and it *crashed* [headshake]. Anyways another try at this (in IE).

Ok, this is my first time using this part of SDL, so I may not get 'everything', but I think I got it figured out. I used the docs and constructed a little demo that shows the use of the timer system. I added some comments as well so you can follow along.
```cpp
#include <stdio.h>
#include <stdlib.h>
#include <string>
#include <vector>
#include <iostream>
#include <SDL/SDL.h>
#include <SDL/SDL_opengl.h>

#pragma comment(lib, "SDLmain.lib")
#pragma comment(lib, "SDL.lib")
#pragma comment(lib, "OpenGL32.lib")
#pragma comment(lib, "Glu32.lib")

void RenderScene();

// Just make life easier for us!
// We need to track all the IDs of the timers we make
std::vector<SDL_TimerID> IDS;

/*********************************************************/
/* type definition for the "new" timer callback function */
/*********************************************************/
// typedef Uint32 (*SDL_NewTimerCallback)(Uint32 interval, void *param);
//
// Translation:
// [ return type ] [ Regular Name Goes Here ] [ Parameter List ]
// The only thing that can change is the name used!

// This is the custom callback function
Uint32 TimerCB1(Uint32 interval, void *param)
{
    // In this case, I made param contain a string to output.
    // More than likely, you will not use this param.
    std::cout << (char*)param << interval << std::endl;

    // Do whatever else here //

    // The number you return is the interval before the next call.
    // If you return 0, the timer is disabled
    // (or rather, it is called once and that's it).
    return interval;
}

int main(int argc, char *argv[])
{
    if (SDL_Init(SDL_INIT_AUDIO | SDL_INIT_VIDEO | SDL_INIT_TIMER) < 0)
    {
        printf("Unable to init SDL: %s\n", SDL_GetError());
        exit(1);
    }
    atexit(SDL_Quit);

    IDS.push_back(SDL_AddTimer(1000, TimerCB1, (void*)"Timer call: "));

    SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 5);
    SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 5);
    SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 5);
    SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 16);
    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);

    if (NULL == SDL_SetVideoMode(640, 480, 32, SDL_OPENGL))
        exit(1);

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glFrustum(.5, -.5, -.5 * 480.0f / 640.0f, .5 * 480.0f / 640.0f, .1, 2000);
    glMatrixMode(GL_MODELVIEW);

    bool done = false;
    while (!done)
    {
        SDL_Event event;
        while (SDL_PollEvent(&event))
        {
            if (event.type == SDL_QUIT)
                done = true;
            if (event.type == SDL_KEYDOWN)
                if (event.key.keysym.sym == SDLK_ESCAPE)
                    done = true;
        }
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        glLoadIdentity();
        RenderScene();
        SDL_GL_SwapBuffers();
    }

    // Now free all the timers
    for (size_t x = 0; x < IDS.size(); ++x)
        SDL_RemoveTimer(IDS[x]);

    SDL_Quit();
    return 0;
}

void RenderScene()
{
    glPushMatrix();
    glPopMatrix();
}
```

Now that should be mostly self-explanatory. If there is anything about it you still have questions on, feel free to ask! Now, on to what you are doing: you do not need this system. This is more for networking apps or anything else that needs code called at certain intervals.

In order to get what you are looking for, frame time, you will need the regular timer stuff. Now for some theory on this topic, take a look at my post here. Then in this post I have a little demo that shows how to use it in SDL. So give those a quick look some time. If you need any more explanation of that demo or anything else, feel free to ask; it's late, so I'm bound to have missed something [lol]

Oh yea, the main thing is that you want one main 'global' timer that holds the current frame time for use in ALL your code. You can make a class that wraps all of this as well. I have a timer class that I have just remembered. Here it is; it's part of some SDL code I need to clean up and put up on the net [wink]
SDL_Time.h
```cpp
// SDL_Time.h
#ifndef SDL_TIMER_H
#define SDL_TIMER_H

#include "Main.h"

class SDL_Time
{
private:
    float dt;
    float runtime;
    int td;
    int td2;

public:
    void Update();
    void Update_runtime();
    float Get_runtime() const;
    float Get_dt() const;
};

#endif
```

SDL_Time.cpp
```cpp
// SDL_Time.cpp
#include "SDL_Time.h"

void SDL_Time::Update()
{
    td2 = SDL_GetTicks();
    dt = ((float)(td2 - td)) * 0.1f; // elapsed time, in hundredths of a second
    td = td2;
}

void SDL_Time::Update_runtime()
{
    runtime += dt * 10; // accumulate in milliseconds
}

float SDL_Time::Get_runtime() const
{
    return runtime;
}

float SDL_Time::Get_dt() const
{
    return dt;
}
```

Wow, I never realized how crappy that was [lol]. Ok, to use it:
```cpp
SDL_Time game_time;

// ...

// Update loop
game_time.Update();
game_time.Update_runtime();

// ...

game_time.Get_dt();      // difference between frames
game_time.Get_runtime(); // total runtime in ms
```

Ah yea, I remember now, I have another class somewhere that is a lot better, I was starting to get worried from seeing this code. Anyways cheers! Sorry for rambling on this long post [grin]

- Drew

[Edited by - Drew_Benton on June 16, 2005 5:50:38 PM]

chbrules    170
Hmmm, I was going about this all wrong then. So everything is scaled by the frame time because the frame rate isn't constant like time? This seems to make sense, I'll have to look more into the theory, but this is great. Thanks!

Drew_Benton    1861
Quote:
> Original post by chbrules
> Hmmm, I was going about this all wrong then. So everything is scaled by the frame time because the frame rate isn't constant like time? This seems to make sense, I'll have to look more into the theory, but this is great. Thanks!

No problem! Yes, everything should be scaled by the frame time, because while time is constant in our world, how much work a computer gets done per unit of time is not. For example, if you assume a fixed frame rate and the machine bogs down batch-processing a lot of video content, your timing is going to be way off. This is simply because personal computers are thread-based and can't guarantee real-time processing. So what you have to do is measure something that varies between computers but is available on all of them, and that is the time difference between frames. I'll let you do some more reading as you've said you will, but there are a few gotchas with using it. It doesn't *always* come out quite right [wink].

[Edited by - Drew_Benton on June 16, 2005 5:04:29 PM]