Clueless

Enginuity Profiling


Hi. I want to follow the Enginuity series and make some changes. For example, I don't want to use SDL, so I need an alternative for the SDL timer. How precise is the time(NULL) function from the ctime library? I'd love to use the GLFW timer, but it seems to be the only thing that is broken in my version of the library. I want the code to stay as close to portable as possible. Are there any open source libraries for this? Would time(NULL) be good enough? Any other suggestions? [edited by - Clueless on May 31, 2004 3:41:47 PM]
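As an aside, here is a minimal sketch (not part of the original post) of why time(NULL) is usually too coarse for per-frame timing: it only reports whole seconds, so two samples taken one frame apart almost always come back identical.

#include <ctime>
#include <iostream>

int main()
{
    // std::time() has whole-second resolution, so the difference between
    // two samples taken within the same frame is almost always zero.
    std::time_t frameStart = std::time(NULL);
    std::time_t frameEnd = std::time(NULL); // imagine this is one frame later
    std::cout << (frameEnd - frameStart) << " seconds elapsed\n"; // almost always prints 0
    return 0;
}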

Thanks. I played a lot with it and there are some things I don't understand.

I can't compile the source from there:

LARGE_INTEGER tim, freq;
double seconds;

QueryPerformanceCounter(&tim);
QueryPerformanceFrequency(&freq);
seconds = (double)tim / (double)freq;

This causes errors ('union LARGE_INTEGER' used where a floating point value was expected).

With the QuadPart

seconds = (double)tim.QuadPart / (double) freq.QuadPart;

I can compile it but it doesn't really look accurate enough to me.
(Example: seconds = 85471.4)

Does anybody know what is happening and why?

Well, here's my timer code. It's in a header, and I'm aware it's kind of ugly; I hacked it together to play with stuff and I'm planning on cleaning it up later (I'm playing with my program's design at the moment).

I'm using the define and the #ifdefs to separate the different systems out. I should really pass the define to the compiler. I put the T in front of it because I'm not sure what default defines the compiler may use (or if it has them at all).

I took some of the timer code from the SDL base code on NeHe's site, and the Linux code (which is untested) is from the article. Hopefully you can extract whatever you need from my (it's mostly others') code.


#ifndef HEADER_TIMER
#define HEADER_TIMER

//LINUX NOT tested!!!!!

#define TWIN32
//#define TLINUX



#if defined(TWIN32)
#include <windows.h>
#elif defined(TLINUX)
#include <sys/time.h>
#include <unistd.h>
#else
#include <SDL.h>
#endif


class timer{ //may try to move into window manager, since platform specific


double New;
double Old;
double DeltaSec; // milliseconds are computed in the DeltaMS() accessor below, so the member doesn't clash with the function name

#if defined(TWIN32)
struct{ // Create A Structure For The Timer Information

__int64 frequency; // Timer Frequency

float resolution; // Timer Resolution

unsigned long mm_timer_start; // Multimedia Timer Start Value

unsigned long mm_timer_elapsed; // Multimedia Timer Elapsed Time

bool performance_timer; // Using The Performance Timer?

__int64 performance_timer_start; // Performance Timer Start Value

__int64 performance_timer_elapsed; // Performance Timer Elapsed Time

} stimer; // Structure Is Named stimer


//Timer stuff goes here

void initTimer(){
memset(&stimer, 0, sizeof(stimer)); // Clear Our Timer Structure


// Check To See If A Performance Counter Is Available

// If One Is Available The Timer Frequency Will Be Updated

if (!QueryPerformanceFrequency((LARGE_INTEGER *) &stimer.frequency)){
// No Performance Counter Available

stimer.performance_timer = FALSE; // Set Performance Timer To FALSE

stimer.mm_timer_start = timeGetTime(); // Use timeGetTime() To Get Current Time

stimer.resolution = 1.0f/1000.0f; // Set Our Timer Resolution To .001f

stimer.frequency = 1000; // Set Our Timer Frequency To 1000

stimer.mm_timer_elapsed = stimer.mm_timer_start; // Set The Elapsed Time To The Current Time

}else{
// Performance Counter Is Available, Use It Instead Of The Multimedia Timer

// Get The Current Time And Store It In performance_timer_start

QueryPerformanceCounter((LARGE_INTEGER *) &stimer.performance_timer_start);
stimer.performance_timer = TRUE; // Set Performance Timer To TRUE

// Calculate The Timer Resolution Using The Timer Frequency

stimer.resolution = (float) (((double)1.0f)/((double)stimer.frequency));
// Set The Elapsed Time To The Current Time

stimer.performance_timer_elapsed = stimer.performance_timer_start;
}
}

double GetTicks(){
__int64 time; // time Will Hold A 64 Bit Integer


if (stimer.performance_timer){ // Are We Using The Performance Timer?

QueryPerformanceCounter((LARGE_INTEGER *) &time); // Grab The Current Performance Time

// Return The Current Time Minus The Start Time, Multiplied By The Resolution (Result Is In Seconds)

return ( (double) ( time - stimer.performance_timer_start) * stimer.resolution);
}else{
// Return The Current Time Minus The Start Time, Multiplied By The Resolution (Result Is In Seconds)

return( (double) ( timeGetTime() - stimer.mm_timer_start) * stimer.resolution);
}
}
#elif defined(TLINUX)

double GetTicks(){
static struct timeval time;
static struct timezone tz;

gettimeofday(&time,&tz);
return (double)time.tv_sec + (double)time.tv_usec/(1000.0*1000.0);
}
#else
double GetTicks(){
return SDL_GetTicks()*0.001;
}
#endif





public:
timer(){//Should use getticks to set start time. but can't because SDL may not be inited

#ifdef TWIN32
initTimer();
#endif
DeltaSec=New=Old=0.0; //Note Deltas at zero may lead to div by 0 in HZ()

}
void Update(){
Old=New;
New=GetTicks();
DeltaSec=New-Old;
}
const double DeltaMS() const{
return DeltaSec*1000.0; // seconds to milliseconds
}
const double DeltaS() const{
return DeltaSec;
}
const double HZ() const{
return 1.0/DeltaSec;
}
};

#endif
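A hypothetical usage sketch (not part of the original post), assuming the header above is saved as timer.h: construct one timer, call Update() once per frame, then read the elapsed time from the accessors. On Windows the fallback path uses timeGetTime(), so the project also needs to link against winmm.lib.

#include "timer.h" // assuming the header above is saved under this name

int main()
{
    timer frameTimer;
    for (int frame = 0; frame < 100; ++frame)
    {
        // ... update and render the game here ...
        frameTimer.Update();
        double dt = frameTimer.DeltaS(); // seconds since the previous Update()
        double fps = frameTimer.HZ(); // rough frames-per-second estimate
    }
    return 0;
}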


[edited by - Cocalus on May 30, 2004 11:54:54 AM]

Thanks

I don't get what it does yet and can't concentrate enough right now.
But I'll get back to this thread, and (thanks for the tip) I'll also look at the SDL and GLFW source as soon as the timer actually plays a role.

I stole the includes for Linux from your code.
(and the //LINUX NOT tested!!!!! comment)

[edited by - Clueless on May 30, 2004 2:28:10 PM]

Thanks. It is probably because my head is spinning right now ...
but I still can't figure out how to get the time in seconds at the beginning of a frame.

EDIT:

Without warning, it suddenly started working properly.
Thanks, everybody.

[edited by - Clueless on May 30, 2004 3:59:38 PM]

I cleaned up the source a bit and made some functions static, so now you can call timer::GetSecs() without making a timer object. Note that timer::initTimer() should be called before using GetSecs() directly (the constructor handles it for objects). It'll usually work without calling initTimer(), but it can't use the performance counter without it.


#ifndef HEADER_TIMER
#define HEADER_TIMER

//LINUX NOT tested!!!!!

#define TWIN32
//#define TLINUX


#include <iostream>

#if defined(TWIN32)
#include <windows.h>
#elif defined(TLINUX)
#include <sys/time.h>
#include <unistd.h>
#else
#include <SDL.h>
#endif


class timer{ //may try to move into window manager, since platform specific

private:
double New;
double Old;
double Delta;

#if defined(TWIN32)
static double resolution;
static bool usePerformanceCounter;
#endif

public:

#if defined(TWIN32)
static void initTimer(){
static bool inited=false;
__int64 freq;

if (!inited){
usePerformanceCounter=QueryPerformanceFrequency((LARGE_INTEGER *) &freq);

if (usePerformanceCounter)
resolution=1.0/(double)freq;
else
resolution=0.001; // fall back to timeGetTime(), which reports milliseconds

inited=true; // only query the frequency once
}
}

static double GetSecs(){
static __int64 time;
if (usePerformanceCounter){
QueryPerformanceCounter((LARGE_INTEGER *) &time);
return (double)time*resolution;
}else{
return (double)timeGetTime()*resolution;
}
}
#elif defined(TLINUX)
static void initTimer(){}

static double GetSecs(){
static struct timeval time;
static struct timezone tz;

gettimeofday(&time,&tz);
return (double)time.tv_sec + (double)time.tv_usec/(1000.0*1000.0);
}
#else
static void initTimer(){
if(!SDL_WasInit(SDL_INIT_TIMER))
SDL_InitSubSystem(SDL_INIT_TIMER); //Not sure if SDL_Init checks what's already inited, I would hope so

}

static double GetSecs(){
return (double)SDL_GetTicks()*0.001;
}
#endif

timer(){
initTimer();

Delta=New=Old=0.0;
Update();//Note Deltas will be very large until it's been updated again

}
void Update(){
Old=New;
New=GetSecs();
Delta=New-Old;
}
const double DeltaMS() const{
return Delta*1000.0; // Delta is stored in seconds
}
const double DeltaS() const{
return Delta;
}
const double HZ() const{
return 1.0/Delta;
}
};

#if defined(TWIN32)
double timer::resolution=0.001; // sane fallback in case initTimer() is never called
bool timer::usePerformanceCounter=false;
#endif

#endif
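A hypothetical sketch (not part of the original post) of using the static interface directly, which is also one way to get the time in seconds at the beginning of a frame:

#include "timer.h" // assuming the header above is saved under this name

void runOneFrame()
{
    timer::initTimer(); // safe to call more than once

    double frameStart = timer::GetSecs(); // seconds at the start of the frame
    // ... do one frame's worth of work ...
    double frameEnd = timer::GetSecs();

    double deltaSeconds = frameEnd - frameStart; // elapsed time for this frame
}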


[edited by - Cocalus on May 30, 2004 5:44:57 PM]

I edited the post because it is more complicated than I thought.
I just want to repeat: thanks for posting the new version of the code.

[edited by - Clueless on May 31, 2004 6:33:12 PM]

Share this post


Link to post
Share on other sites
Eh, if you're still interested, I can answer your question about why it returns strange values.
QueryPerformanceCounter() counts ticks since the system was started, so dividing it by the value from QueryPerformanceFrequency() gives the number of seconds since boot, which is why you got 85k seconds. To calculate the time per frame, you store the value returned by QueryPerformanceCounter() at the start of the frame, then get the new value at the end, and use the difference between the two to find the elapsed time.
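A minimal Windows-only sketch (not part of the original post) of that approach:

#include <windows.h>
#include <iostream>

int main()
{
    LARGE_INTEGER freq, start, end;
    QueryPerformanceFrequency(&freq);

    QueryPerformanceCounter(&start); // sample at the start of the frame
    Sleep(16);                       // stands in for one frame of work
    QueryPerformanceCounter(&end);   // sample again at the end of the frame

    double seconds = (double)(end.QuadPart - start.QuadPart) / (double)freq.QuadPart;
    std::cout << "frame took " << seconds << " seconds\n";
    return 0;
}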
