# Smooth animation needed


## Recommended Posts

This is my current game loop. I tried to recreate the Gaffer timestep idea, but I think I'm missing something, as my movement still has choppiness issues on some computers.
```cpp
void cGame::Run()
{
    double accumulator = 0;
    double interpolation = 0;
    double dt = 1;
    while (m_IsRunning && m_Window.IsOpen())
    {
        accumulator += m_Timer.GetDeltaTime();
        while (accumulator >= dt)
        {
            std::vector<cGameState*>::iterator it;
            for (it = m_States.begin(); it != m_States.end(); ++it)
            {
                (*it)->Update();
            }
            accumulator -= dt;
        }
        interpolation = accumulator / dt;
        m_Window.ClearScreen();
        std::vector<cGameState*>::iterator it;
        for (it = m_States.begin(); it != m_States.end(); ++it)
        {
            (*it)->Draw(interpolation);
        }
        m_Window.SwapBuffer();
    }
}
```


I thought I covered everything, but the Gaffer article seems to pass dt as well as some other t value to its update function, and I can't work out what that's used for. m_Timer.GetDeltaTime() returns milliseconds, in case that matters. I can post more code if it's needed. Thanks!

##### Share on other sites
It's hard to say without seeing the rest of the code (ie m_Timer), but generally you want to follow a process like this:

1.) Calculate the time delta of the current frame from the last one.

2.) Use this time delta to drive your animations. Give each frame a length, and subtract the time delta from that length. When it is zero or less, move to the next frame. I suppose it's possible that if the length of each frame is too short, it could cause it to "hang" on a frame, although I've never encountered that problem.

3.) Update the "time since app started" variable that you're using to calculate time deltas.

One thing I noticed in your code is that you're calling Update without passing anything about the time, which seems odd. I see you pass something for Draw, but it would seem to me that you'd do the animation's frame logic in an Update call.
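Step 2 above can be sketched in code. This is a minimal illustration, not the poster's actual classes; all names (`Animation`, `frameLength`, and so on) are made up for the example.

```cpp
#include <cstddef>
#include <vector>

// Sketch of step 2: each animation frame has a length in milliseconds;
// subtract the frame's time delta and advance when the remaining time
// reaches zero. All names here are illustrative.
struct Animation {
    std::vector<int> frames;   // frame indices into a sprite sheet
    double frameLength;        // how long each frame is shown, in ms
    double timeLeft;           // time remaining on the current frame
    std::size_t current = 0;

    void Update(double deltaMs) {
        timeLeft -= deltaMs;
        // Loop in case a large delta skips more than one frame, which
        // avoids "hanging" when frameLength is very short.
        while (timeLeft <= 0.0) {
            timeLeft += frameLength;
            current = (current + 1) % frames.size();
        }
    }

    int CurrentFrame() const { return frames[current]; }
};
```

Note the `while` loop: if the delta is larger than one frame length, the animation skips ahead the right number of frames instead of stalling.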

##### Share on other sites
What I've done for my game, which appears to work well, is to skip passing time to the update function. Instead, each physics update is quantized (i.e. represents some arbitrary amount of time), and I do a physics update any time my accumulator gains enough time. For example, say I target doing 20 physics updates per second. That means that any time my accumulator has more than 1/20th of a second, I do a physics update, and subtract 1/20 from the accumulator. I spend the rest of the time drawing and sleeping. The draw code takes as an argument the amount of time since the last physics update, expressed as a ratio with 1/20 (for example, if it's been 1/40th of a second since the last physics update, then the ratio would be .5). All of my game objects store their last two positions. They use the ratio to interpolate between the two positions, which creates the desired smooth motion.

Make sense?
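A rough sketch of the scheme described above, for one axis of one object (the names are illustrative, not from anyone's real code): each object keeps its previous and current physics positions, and the draw code blends them with the accumulator ratio.

```cpp
// Each physics tick moves the object and remembers where it was; Draw
// blends the last two positions using the ratio described above.
struct Object {
    double prevX = 0.0, currX = 0.0;

    void PhysicsUpdate(double step) {
        prevX = currX;   // remember last tick's position
        currX += step;   // fixed-size move per physics tick
    }

    // ratio in [0, 1): how far we are between the last two ticks
    double DrawX(double ratio) const {
        return prevX + (currX - prevX) * ratio;
    }
};
```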

##### Share on other sites
Quote:
 but I think I'm missing something, as my movement still has choppiness issues on some computers.

You are aware that you run your update once every millisecond? I'm guessing some/most computers can't keep up with that.

Quote:
 I spend the rest of the time drawing and sleeping

Doesn't that mean that, if you are unlucky, your input may take 1/20th of a second before the game recognizes it? Personally I update the physics with a fixed timestep, and the UI with a variable one, like this.
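The split being described can be sketched like this (a rough illustration, with made-up names, not the linked code): input is polled every rendered frame, while physics runs in fixed-size steps driven by an accumulator.

```cpp
// Fixed-rate physics, variable-rate input/UI: input is handled once per
// rendered frame, physics only when the accumulator has a full tick.
struct Loop {
    double accumulator = 0.0;
    const double tick = 50.0;   // 20 physics updates/s, in ms
    int physicsSteps = 0;
    int inputPolls = 0;

    void Frame(double deltaMs) {
        inputPolls++;                     // poll input every frame
        accumulator += deltaMs;
        while (accumulator >= tick) {     // fixed-rate physics
            physicsSteps++;
            accumulator -= tick;
        }
        // render here, interpolating with accumulator / tick
    }
};
```

With this structure, input latency is bounded by the frame time rather than the physics tick.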

##### Share on other sites
sirGustav, yeah, I know it's every millisecond, but even slowed down it has an issue. I'll try the code you gave and see if it helps.

##### Share on other sites
Quote:
 Original post by schragnasher: This is my current game loop. I tried to recreate the Gaffer timestep idea but I think I'm missing something, as my movement still has choppiness issues on some computers. [...] *** Source Snippet Removed ***

This is a nice article on game loops that addresses the same problem:

http://dewitters.koonsolo.com/gameloop.html

##### Share on other sites
My timer code:

```cpp
// Requires <windows.h> and <mmsystem.h>, and linking against winmm.lib
// for timeGetTime().
class cTimer
{
    public:
        cTimer() {}
        ~cTimer() {}
        void StartTimer()
        {
            m_OldTime = timeGetTime();
        }
        double GetDeltaTime()
        {
            double NewTime = timeGetTime();
            m_DeltaTime = NewTime - m_OldTime;
            m_OldTime = NewTime;
            return m_DeltaTime;
        }
    private:
        double m_OldTime;
        double m_DeltaTime;
};
```

In case it helps; I'm still trying to get this stuff to work.

##### Share on other sites
http://dewitters.koonsolo.com/gameloop.html
This is helping. I seem to have an odd anomaly with my laser beams when they loop around the screen, but the animation seems smooth as butter. Thanks.

Now I've got his useless timer though, lol, since his loop just grabs the current time and not the delta time.

##### Share on other sites
timeGetTime() is not particularly accurate by default. You need to call timeBeginPeriod(1) at startup to improve its accuracy (although I believe this can seriously increase your CPU usage, not something I've ever worried about for a game) and call timeEndPeriod(1), with the same value, at the end of your program.

Direct3D will set this as well if you request certain behaviour in the present parameters.

Quote:
 MSDNFull-screen mode supports similar usage as windowed mode by supporting D3DPRESENT_INTERVAL_IMMEDIATE regardless of the refresh rate or swap effect. D3DPRESENT_INTERVAL_DEFAULT uses the default system timer resolution whereas the D3DPRESENT_INTERVAL_ONE calls timeBeginPeriod to enhance system timer resolution. This improves the quality of vertical sync, but consumes slightly more processing time. Both parameters attempt to synchronize vertically.

The alternative is to base a timer on QueryPerformanceCounter(), but I believe there are still some issues with this playing up on certain multi-CPU systems without fix patches installed.
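For what it's worth, a portable alternative to both timeGetTime() and QueryPerformanceCounter() is std::chrono::steady_clock (C++11), which is monotonic and typically high-resolution. The sketch below mirrors the cTimer interface from earlier in the thread, but it is an illustration rather than a drop-in replacement for the poster's class.

```cpp
#include <chrono>

// A steady_clock-based delta timer: monotonic, no timeBeginPeriod
// bookkeeping, and none of the multi-CPU issues mentioned above.
class ChronoTimer
{
    using Clock = std::chrono::steady_clock;
    Clock::time_point m_Old = Clock::now();

public:
    void StartTimer() { m_Old = Clock::now(); }

    // Milliseconds elapsed since the previous call (or StartTimer).
    double GetDeltaTime()
    {
        auto now = Clock::now();
        std::chrono::duration<double, std::milli> delta = now - m_Old;
        m_Old = now;
        return delta.count();
    }
};
```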

##### Share on other sites
OK, so I've been messing around some more and came to the conclusion that my original loop was just as good as the dewitters one; I get the same jitter, but the Gaffer article was easier to understand.

I've pared it down to make it simple to read and understand, and set the updates to 25/second.

```cpp
double accumulator = 0;
const double TICK_TIME = 1000 / 25;   // 40 ms per update
while (m_IsRunning && m_Window.IsOpen())
{
    accumulator += m_Timer.GetDeltaTime();
    while (accumulator >= TICK_TIME)
    {
        UpdateGame();
        accumulator -= TICK_TIME;
    }
    DrawGame(accumulator / TICK_TIME);
}
```

The timer class looks like this; I've added the timeBeginPeriod(1) and timeEndPeriod(1) calls to it:
```cpp
class cTimer
{
    public:
        cTimer() {}
        ~cTimer() { timeEndPeriod(1); }
        void StartTimer()
        {
            timeBeginPeriod(1);
            m_OldTime = timeGetTime();
        }
        double GetDeltaTime()
        {
            double NewTime = timeGetTime();
            m_DeltaTime = NewTime - m_OldTime;
            m_OldTime = NewTime;
            return m_DeltaTime;
        }
        double GetTime()
        {
            return timeGetTime();
        }
    private:
        double m_OldTime;
        double m_DeltaTime;
};
```

I'm wondering if the jitter I'm seeing is caused by incorrect interpolation. Basically, I'm taking the value I get here and adjusting the values when I draw my ship, like so:

```cpp
void cPlayerShip::Draw(cResourceManager* ResourceManager, double Interpolation)
{
    ResourceManager->DrawTexture(m_ShipImage,
                                 m_X + (m_Vx * Interpolation),
                                 m_Y + (m_Vy * Interpolation),
                                 m_R + (m_Vr * Interpolation));
}
```

Basically, I add (speed * interpolation) to each value when drawing. Is this correct?

Or perhaps I'm just seeing something that isn't there, lol.
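For comparison, the Draw call above *extrapolates* forward from the current position, while the Gaffer article *interpolates* between the previous and current tick's positions. Both can look smooth; extrapolation can visibly overshoot when velocity changes between ticks. A one-axis sketch of the two styles, assuming m_Vx holds the ship's displacement per physics tick (illustrative names, not the thread's real classes):

```cpp
// One axis of a ship: Tick() is the fixed physics step; the two draw
// helpers show extrapolation vs. interpolation for a blend factor
// alpha = accumulator / TICK_TIME in [0, 1).
struct ShipX {
    double prev = 0.0, curr = 0.0, vx = 0.0;

    void Tick() { prev = curr; curr += vx; }

    // Extrapolation (what the Draw call above does): predict forward.
    double Extrapolated(double alpha) const { return curr + vx * alpha; }

    // Interpolation (what the Gaffer article does): blend the last two
    // known positions; lags one tick but never guesses wrong.
    double Interpolated(double alpha) const {
        return curr * alpha + prev * (1.0 - alpha);
    }
};
```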
