Smooth animation needed

Quote:Original post by schragnasher
Basically I add (speed * interpolation) to each value when drawing. Is this correct?


Not sure exactly what you mean, but the basic idea is:

Store the previous position from the last update and the new position after the most recent update:

vec2 last;

void update(float dt) // dt is the fixed logic timestep
{
    last = pos;
    pos += speed * dt;
}


When rendering, use the following formula to interpolate between the last and current positions based on the interpolation value:

vec2 interpolate(vec2 pos, vec2 last, float blend)
{
    return (pos * blend) + (last * (1.0f - blend));
}

void render(float blend)
{
    vec2 p = interpolate(pos, last, blend);
    render_me_at(p);
}


Basically, the blend factor represents the fraction of a timestep left sitting in the accumulator that there wasn't enough time to consume with another full update, so you render that far between the previous and current timesteps.
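
Put together, the loop looks something like this (an untested sketch; get_time_ms, update and render are placeholder names, not code from this thread):

const float dt = 1.0f / 60.0f;              // fixed logic timestep in seconds
float accumulator = 0.0f;
unsigned int prev = get_time_ms();

while (running)
{
    unsigned int now = get_time_ms();
    accumulator += (now - prev) / 1000.0f;  // add this frame's time, in seconds
    prev = now;

    while (accumulator >= dt)               // consume as many whole timesteps as fit
    {
        update(dt);                         // stores last, then advances pos
        accumulator -= dt;
    }

    float blend = accumulator / dt;         // leftover fraction of a timestep, in [0,1)
    render(blend);                          // interpolates between last and pos
}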
Hmmm, the interpolation is definitely my problem. I'll give it another try. Thanks for the explanation.
Quote:Original post by schragnasher
Hmmm, the interpolation is definitely my problem. I'll give it another try. Thanks for the explanation.


You're welcome.

BTW, in my experience float is perfectly okay as a datatype for this sort of thing. I don't think you gain anything visually from using doubles and there could be some performance tradeoffs.
Oddly enough, I seem to have fixed my main issue without changing my timing code. I had another problem with my window when running the program on Windows Vista. I fixed that issue by turning off Aero, which also seems to have been causing a lot of the stuttering. My Vista box was the machine I had seen a good deal of stutter on; the XP machines had seemed much smoother. So yeah, it seems it was Aero...
Using floating point variables to store the result of timeGetTime() or other similar functions can have some nasty side effects, especially if they are only floats and not doubles. However, note that under D3D9 all floating point maths is effectively done at float precision anyway (unless you specify D3DCREATE_FPU_PRESERVE).

The problem is that floats only have about 24 bits (6-7 decimal digits) of precision. Let's say the computer has been switched on for a day or so, and timeGetTime() returns 100,000,000. The call on the next frame returns 100,000,033 (running at 30 FPS). However, because you only have those 7 digits of precision, the subtraction no longer gives you the real 33 ms; the result is badly quantised, and once the timer value grows large enough small deltas round all the way down to zero!
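
You can see it with a couple of lines of C++ (a hypothetical example, numbers chosen purely for illustration):

#include <cstdio>

int main()
{
    float a = 100000000.0f;  // previous timeGetTime() result, ~a day of uptime
    float b = 100000033.0f;  // next frame, ~33 ms later

    // Floats near 100,000,000 can only step in multiples of 8, so the true
    // 33 ms delta is quantised; near the 32-bit wrap the step is 256 ms and
    // small deltas collapse to 0.
    printf("%f\n", b - a);   // prints 32.000000, not 33
    return 0;
}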

It gets even worse when the timer wraps round (which happens roughly every 49 days for timeGetTime()). You get a time delta of around 4 billion milliseconds, which, unless you clamp it, means updating that one frame will take a rather long time with a fixed time step...

To fix this, always store and subtract absolute time values using an integer type; you can then convert the small time delta to floating point safely. I'd also recommend clamping the frame delta to a maximum value of, say, a tenth of a second (see the sketch below). That way pauses for things like loading won't cause weird issues, and the game will still be somewhat playable at really low frame rates (like you might get in debug builds, or when putting breakpoints in).
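
Something along these lines (an untested sketch assuming the Windows timeGetTime() API; the 0.1 second clamp and the tick() wrapper are just illustrative):

#include <windows.h>   // timeGetTime(), link with winmm.lib
#include <algorithm>

unsigned int prevTime = timeGetTime();       // keep absolute times as integers

void tick()
{
    unsigned int now = timeGetTime();
    unsigned int deltaMs = now - prevTime;   // unsigned subtraction also gives the
                                             // right answer across the 49-day wrap
    prevTime = now;

    float dt = deltaMs / 1000.0f;            // the small delta converts to float safely
    dt = std::min(dt, 0.1f);                 // clamp to a tenth of a second

    // feed dt into the fixed-timestep accumulator as usual
}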

This topic is closed to new replies.
