

Consistent Game Loops and Clocks


8 replies to this topic

#1 Vero   Members   -  Reputation: 138


Posted 22 June 2012 - 10:17 AM

So, I've been going through a lot of OpenGL tutorials that use C code. Side question: would you recommend coding more complex games in C or C++? I'm not really used to C, because at work, and in most languages I've used before, I rely on objects and classes, if not for utility then just to keep things organized. I'm not sure how I'd keep a C game from amassing huge numbers of global variables and turning into a mishmash of code, but so far the most useful tutorials I've found have been written in C.

I created my first really simple game by following a tutorial, adding improvements here and there, and fixing all the major bugs. The result works, and works pretty solidly, but it feels more like duct tape than an elegant fix.

I think most of my issues stemmed from creating consistent, smooth animation when traversing points on a grid. I did this by incrementing a position struct for the player/enemy, then translating to the current position every draw cycle. This produced movement that was either choppy or too slow. My main question: what implementations would you use to create movement in a 2D ortho view, in C or even C++? Something that stays consistent regardless of CPU power, driven by a clock, so it takes X amount of time to get from A to B. My next game will be a simple maze game, maybe with a maze editor if I get ambitious, but I want to make sure I have these concepts down before I continue.
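A minimal sketch of the clock-driven, fixed-duration grid movement being asked about here (illustrative only; `GridMover` and its fields are hypothetical names, not code from this thread):

```cpp
#include <cstdint>

// Hypothetical sketch: move a piece from one grid cell to the next in a
// fixed amount of wall-clock time, independent of frame rate.
struct GridMover {
    double startX, startY;   // cell we are leaving (grid units)
    double endX, endY;       // cell we are entering
    uint32_t startMs;        // time the move began, in milliseconds
    uint32_t durationMs;     // how long the move should take

    // Interpolated position at time nowMs; clamps at the destination.
    void positionAt(uint32_t nowMs, double &x, double &y) const {
        double t = (double)(nowMs - startMs) / (double)durationMs;
        if (t > 1.0) t = 1.0;               // arrived: stay on the cell
        x = startX + (endX - startX) * t;
        y = startY + (endY - startY) * t;
    }
};
```

Because position is a pure function of elapsed time, the move takes the same wall-clock time whether the loop runs at 30 or 1000 iterations per second.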


#2 BeerNutts   Crossbones+   -  Reputation: 2985


Posted 22 June 2012 - 12:47 PM

In general, you solve these issues by giving your objects a speed in units/time. For a generic PC screen, you could use pixels/second. Basically, you'd do this (pseudocode):
uint32_t CurrentMilliseconds = GetTimeInMS();

while (IsGameRunning) {
  // store last time
  uint32_t PreviousTime = CurrentMilliseconds;

  // get new time
  CurrentMilliseconds = GetTimeInMS();

  // elapsed time since last frame, in seconds
  double ElapsedSeconds = (double)(CurrentMilliseconds - PreviousTime) / 1000.0;

  UpdateObjects(ElapsedSeconds);
  DrawAllObjects();
}

// In an object's Update function
void EnemyObject::Update(double ElapsedSeconds)
{
  // move the object based on its speed; make sure position uses doubles
  mPosition.x += mVelocity.x * ElapsedSeconds;
  mPosition.y += mVelocity.y * ElapsedSeconds;
  mPosition.z += mVelocity.z * ElapsedSeconds;
}

void EnemyObject::Draw()
{
  RenderEngine.Draw(Polygon, mPosition.x, mPosition.y, mPosition.z);
}

There are more elegant methods of doing this (for example, using your world coordinates rather than pixels/second), but this is the general idea.
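As a hypothetical illustration of the world-coordinates variant mentioned above (all names are made up), the object moves in world units per second, and only the draw step converts world units to pixels via a camera scale:

```cpp
// Sketch: velocity lives in world units/second; pixels only appear at
// draw time, so changing resolution or zoom never changes game speed.
struct WorldObject {
    double worldX = 0.0;           // position in world units
    double speed  = 2.5;           // world units per second

    void update(double elapsedSeconds) {
        worldX += speed * elapsedSeconds;
    }

    // pixelsPerUnit is the camera/zoom factor, applied only when drawing
    int screenX(double pixelsPerUnit) const {
        return (int)(worldX * pixelsPerUnit);
    }
};
```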
My Gamedev Journal: 2D Game Making, the Easy Way

---(Old Blog, still has good info): 2dGameMaking
-----
"No one ever posts on that message board; it's too crowded." - Yoga Berra (sorta)

#3 Vero   Members   -  Reputation: 138


Posted 02 July 2012 - 03:27 PM

What do you think is the best way to go about implementing a GetTimeInMS()? I found the code below, but it's from around 2005 or maybe earlier. C++ or C is usable; Windows-only machines. I just want to establish animation with the kind of time consistency described above.

struct
{
    __int64 frequency;                  // timer frequency
    float resolution;                   // time resolution
    unsigned long mm_timer_start;       // multimedia timer start value
    unsigned long mm_timer_elapsed;     // multimedia timer elapsed time
    bool performance_timer;             // using the performance timer?
    __int64 performance_timer_start;    // performance timer start
    __int64 performance_timer_elapsed;
} timer;


void TimerInit(void)
{
    memset(&timer, 0, sizeof(timer));   // clear out the timer structure

    // check whether a performance counter is available;
    // if one is, the timer frequency will be updated
    if (!QueryPerformanceFrequency((LARGE_INTEGER *) &timer.frequency))
    {
        // no performance counter available
        timer.performance_timer = FALSE;
        timer.mm_timer_start    = timeGetTime();
        timer.resolution        = 1.0f / 1000.0f;
        timer.frequency         = 1000;
        timer.mm_timer_elapsed  = timer.mm_timer_start;
    }
    else
    {
        // performance counter is available
        QueryPerformanceCounter((LARGE_INTEGER *) &timer.performance_timer_start);
        timer.performance_timer = TRUE;
        // calculate the timer resolution using the timer frequency
        timer.resolution = (float)(1.0 / (double)timer.frequency);
        // set elapsed time to the current time
        timer.performance_timer_elapsed = timer.performance_timer_start;
    }
}

float TimerGetTime()
{
    __int64 time;    // holds a 64-bit tick count

    if (timer.performance_timer)
    {
        QueryPerformanceCounter((LARGE_INTEGER *) &time);
        // current time minus the start time, multiplied by the resolution
        // and by 1000 to get milliseconds
        return (float)((time - timer.performance_timer_start) * timer.resolution) * 1000.0f;
    }
    else
    {
        return (float)(timeGetTime() - timer.mm_timer_start) * timer.resolution * 1000.0f;
    }
}

To recap: would this be a good method, or are there better implementations out there?

#4 BeerNutts   Crossbones+   -  Reputation: 2985


Posted 02 July 2012 - 07:39 PM

QueryPerformanceCounter is one way to do it, but, as you can see, it's quite involved.

timeGetTime() in Windows is much simpler; it returns the time in milliseconds. link

So, you can just do this:
double GetTimeInMS()
{
  return (double)timeGetTime();   // timeGetTime() already reports milliseconds
}

Edited by BeerNutts, 02 July 2012 - 07:42 PM.


#5 Vero   Members   -  Reputation: 138


Posted 03 July 2012 - 01:08 PM

Thanks BeerNutts, I'm definitely going to keep that snippet of code around for later use. I figured I'd add to the thread to show the outcome of the game loop implementation so far. I'm keeping a pseudo FPS counter; what it really tracks is how many cycles per second the loop runs. I say pseudo because, without throttling to 200 cycles a second, it was getting over 1000 cycles. I have a goal object rendered with some animation, and that was definitely showing some tearing, which might be due to the limitations of the monitor itself. Anyway, here are a few snippets of the timing as implemented in my loop so far:
//condensed the performance query

long long milliseconds_now() {
    static LARGE_INTEGER s_frequency;
    static BOOL s_use_qpc = QueryPerformanceFrequency(&s_frequency);
    if (s_use_qpc) {
        LARGE_INTEGER now;
        QueryPerformanceCounter(&now);
        return (1000LL * now.QuadPart) / s_frequency.QuadPart;
    } else {
        return GetTickCount();
    }
}


while (!done)                                    // Game LOOOP!
{
    if (PeekMessage(&msg, NULL, 0, 0, PM_REMOVE))    // is there a message?
    {   // Peek will not halt the program while it looks for a message
        if (msg.message == WM_QUIT)              // have we received a quit message?
        {
            done = TRUE;
        }
        else    // if not, deal with the window messages
        {
            TranslateMessage(&msg);              // translate the message
            DispatchMessage(&msg);               // dispatch the message
        }
    }
    else    // if there are no messages
    {
        // Draw the scene. Watch for ESC key and quit messages from DrawGLScene()
        if ((active && !DrawGLScene()) || keys[VK_ESCAPE])
        {
            done = TRUE;                         // ESC signalled a quit
        }
        else    // not time to quit; update screen
        {
            // scene was drawn in the if statement above
            while (FPS(start, milliseconds_now()) > 200LL)    // throttle FPS
            {
                /* wait */
            }

            if (fcount < 60)
            {
                fps_sum = fps_sum + FPS(start, milliseconds_now());
                ++fcount;
            }
            else
            {
                fps = fps_sum / fcount;
                fcount = 0;
                fps_sum = 0LL;
            }
            start = milliseconds_now();

            SwapBuffers(hDC);                    // swap buffers (double buffering)
        }
    }

    ...
    // end of loop
}

// drawing and stepping in the goal object:

GLvoid Goal::Step(long long elapse)
{
    if (startTime == 0LL)
        startTime = elapse;    // start a new animation cycle

    double ElapsedSeconds = (double)(elapse - startTime) / 1000.0;

    spin = Velocity.x * ElapsedSeconds;
}

GLvoid Goal::Draw()
{
    glLoadIdentity();                                // reset the modelview matrix
    glTranslatef(fPosition.x, fPosition.y, 0.0f);    // top-left corner
    glColor3f(1.0f, 0.2f, 0.2f);                     // make the goal red
    glBegin(GL_LINES);                               // start drawing the goal
        glVertex2d(0,                                 0 + ((int)spin % interval));        // top point of body
        glVertex2d(interval,                          interval - ((int)spin % interval)); // bottom right
        glVertex2d(interval - ((int)spin % interval), 0);                                 // left point of body
        glVertex2d(0 + ((int)spin % interval),        interval);                          // bottom point of body
    glEnd();                                         // done drawing the goal body
}


This is the first game I've been developing, and one thing I've noticed, with a lot of help, is that people make a lot of assumptions about how something should be implemented.

Basically, with the current timing implementation I can control the frame rate and the animation speed independently, so I can have choppy-fast, smooth-fast, smooth-slow, etc. I'm very happy with the result. Now that this part is mostly set up, I can start making the actual player :D

#6 wolfscaptain   Members   -  Reputation: 200


Posted 04 July 2012 - 12:56 AM

This is worth a read http://gafferongames.com/game-physics/fix-your-timestep/
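For context, the core of that article is a fixed simulation step fed by an accumulator of real frame time; a minimal sketch under assumed names (`FixedStepSim` is illustrative, not the article's code):

```cpp
// Illustrative fixed-timestep accumulator, in the spirit of the linked
// article: the simulation always advances in constant dt steps, while
// rendering happens at whatever rate the machine manages.
struct FixedStepSim {
    double accumulator = 0.0;      // unsimulated real time carried over
    double dt = 1.0 / 60.0;        // fixed simulation step, in seconds
    int steps = 0;                 // number of fixed updates run so far

    // Feed in the measured frame time; runs zero or more fixed updates.
    void advance(double frameSeconds) {
        accumulator += frameSeconds;
        while (accumulator >= dt) {
            ++steps;               // a real game would call Update(dt) here
            accumulator -= dt;
        }
    }
};
```

The leftover in the accumulator is what the article later interpolates with, so slow frames run extra updates instead of making objects move farther per step.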

#7 Matt-D   Crossbones+   -  Reputation: 1467


Posted 07 July 2012 - 02:30 PM

What do you think the best way to go about implementing a GetTimeInMS()?


It's already implemented in the (current) C++ standard library -- in particular, in the <chrono> header:
http://en.cppreferen...om/w/cpp/chrono

Depending on what you need, use either std::chrono::steady_clock or std::chrono::high_resolution_clock:
http://en.cppreferen...no/steady_clock
http://en.cppreferen...esolution_clock
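As a sketch, a GetTimeInMS equivalent built on <chrono> could look like this (steady_clock is monotonic, so it never jumps backwards if the system clock is adjusted, which makes it the safer choice for frame timing):

```cpp
#include <chrono>
#include <cstdint>

// Milliseconds since the first call, using the monotonic steady_clock.
int64_t GetTimeInMS()
{
    using namespace std::chrono;
    static const steady_clock::time_point start = steady_clock::now();
    return duration_cast<milliseconds>(steady_clock::now() - start).count();
}
```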

Edited by Matt-D, 07 July 2012 - 02:31 PM.


#8 JWBaker   Members   -  Reputation: 235


Posted 10 July 2012 - 02:37 PM

This is worth a read http://gafferongames...-your-timestep/


This is the way to go. Buckle down and get into it from the beginning; it's a bit confusing at first, but your results will be better in the long run.

#9 turch   Members   -  Reputation: 590


Posted 11 July 2012 - 08:49 AM

Note that the precision of timeGetTime "can be five milliseconds or more" (MSDN), which is almost a third of an ideal frame.
Visual Studio 11 had a known bug last time I checked (a few months ago) where std::chrono::high_resolution_clock had a precision of only about 4 milliseconds.
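One rough way to sanity-check a clock's effective precision on your own machine (an illustrative sketch, not code from this thread) is to spin until the reported time changes and record the smallest jump seen:

```cpp
#include <chrono>
#include <cstdint>

// Estimate a clock's effective granularity: spin until steady_clock's
// reported time changes, and keep the smallest observed step.
int64_t smallestTickNs(int samples)
{
    using namespace std::chrono;
    int64_t best = INT64_MAX;
    for (int i = 0; i < samples; ++i) {
        auto a = steady_clock::now();
        auto b = a;
        while (b == a)                  // spin until the clock advances
            b = steady_clock::now();
        int64_t step = duration_cast<nanoseconds>(b - a).count();
        if (step < best)
            best = step;
    }
    return best;    // best-case nanoseconds between distinct readings
}
```

On a clock with the 4 ms problem described above, this would report a step on the order of milliseconds rather than the sub-microsecond values a good QPC-backed clock gives.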



