Camera Interpolation

Started by
23 comments, last by BlueSpud 10 years, 7 months ago

So you don't use time for movement, and that is really bad, since s = v * t (distance travelled equals velocity multiplied by time).

intp is time. intp is the time between ticks, which run at a constant rate of 20 ticks per second, give or take a bit here and there. Really, the problem at hand is why the camera keeps accelerating and decelerating.
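To make the intp discussion concrete, here is a minimal sketch of a fixed-timestep loop with an interpolation factor, assuming the 20 ticks per second mentioned above. All names (State, tick, renderPosition, SPEED) are illustrative, not from the thread's actual code.

```cpp
#include <cassert>
#include <cmath>

// Illustrative state: one axis of position is enough to show the idea.
struct State { float x = 0.0f; };

const float TICK_DT = 1.0f / 20.0f;   // seconds per logic tick (20 ticks/sec)
const float SPEED   = 10.0f;          // units per second

// One logic tick: s = v * t with a fixed t.
void tick(State& s) { s.x += SPEED * TICK_DT; }

// intp in [0,1): fraction of the current tick that has elapsed at render time.
// The renderer draws at this blended position, not at the raw tick position.
float renderPosition(const State& prev, const State& curr, float intp) {
    return prev.x + (curr.x - prev.x) * intp;
}
```

With this structure, logic runs at a fixed 20 Hz while rendering can run at any frame rate, sliding smoothly between the previous and current tick states.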


Is your camera tied to the player?

Does your player only update on logical ticks?

If so, does that mean your camera only updates its position on logical ticks?

If so, we have found the problem.

The camera should update every frame based on the interpolated position of the player—the position used for rendering every frame.

If this is not your problem, you will need to provide more information.
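The advice above can be sketched as follows: the camera samples the same interpolated player position the renderer uses, once per frame. The Vec3 type and function names are illustrative assumptions, not the thread's actual API.

```cpp
#include <cassert>
#include <cmath>

// Minimal vector type for illustration.
struct Vec3 { float x, y, z; };

Vec3 lerp(const Vec3& a, const Vec3& b, float t) {
    return { a.x + (b.x - a.x) * t,
             a.y + (b.y - a.y) * t,
             a.z + (b.z - a.z) * t };
}

// Called once per rendered frame, with the same interpolation factor used to
// draw the player. prevTick/currTick are the player's last two logic states.
Vec3 cameraTarget(const Vec3& prevTick, const Vec3& currTick, float intp) {
    return lerp(prevTick, currTick, intp);
}
```

If the camera instead reads the player's raw tick position, it only moves 20 times per second, which shows up as stutter at higher frame rates.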

L. Spiro

I restore Nintendo 64 video-game OST’s into HD! https://www.youtube.com/channel/UCCtX_wedtZ5BoyQBXEhnVZw/playlists?view=1&sort=lad&flow=grid


The camera, right now, is bound to the player. I have narrowed it down to using a higher-resolution counter to make it look smooth. My problem now is that it constantly accelerates and decelerates. The movement is on an interpolated tick.
@BlueSpud
I do it like this:

float frameTime = GetFrameTime();
float t = frameTime * 2.0f;

Vector3 vecStart = vecPos; // current camera position
Vector3 vecEnd = player->GetCamOffset() * player->GetWorldMatrix();

if (Length(vecEnd - vecStart) > 0.0f)
{
    // interpolate
    vecPos = vecStart + (vecEnd - vecStart) * t;

    // camera basis vectors
    vecX = player->GetRotMatrix().GetX();
    vecY = player->GetRotMatrix().GetY();
    vecZ = player->GetRotMatrix().GetZ();

    // camera view matrix
    matView = MatrixLookAt(vecPos, player->GetPosition(), Vector3(0, 1, 0));
}

Update the camera position by half of the frame time, based on the camera offset.
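One caveat about the snippet above: scaling the lerp factor linearly by frame time (t = frameTime * 2) converges at different rates at different frame rates, and overshoots if t ever exceeds 1. A common frame-rate-independent alternative is exponential smoothing; this is a sketch under that assumption, with lambda as an illustrative stiffness parameter, not something from the thread.

```cpp
#include <cassert>
#include <cmath>

// Move `current` toward `target` such that the remaining gap decays
// exponentially with elapsed time dt. Two half-steps give exactly the same
// result as one full step, so the behavior does not depend on frame rate.
float smoothToward(float current, float target, float lambda, float dt) {
    return target + (current - target) * std::exp(-lambda * dt);
}
```

A larger lambda makes the camera snap to the target faster; a smaller one makes it lazier, but the feel stays the same at 30 FPS and 300 FPS.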


I ended up using elapsed time instead of the time between ticks, but thanks anyway. To anyone else who stumbles upon this topic, here is what I ended up doing:

- I used SDL_GetPerformanceCounter() because it has much higher precision.

- I used elapsed time, not the interpolation that deWiTTeRS' game loop entails. The time between ticks is not a good basis because it can stutter badly, and even though the game loop is "frame rate independent," the values can vary and not end up being what you want.
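The elapsed-time approach can be sketched as below. The poster used SDL_GetPerformanceCounter(); std::chrono::steady_clock is a standard-library equivalent used here for illustration, and elapsedSeconds is an assumed helper name.

```cpp
#include <chrono>

using Clock = std::chrono::steady_clock;

// Seconds between two clock samples, as a float dt suitable for s = v * t
// style movement. steady_clock is monotonic and high-resolution, so dt never
// jumps backward the way a wall-clock-based timer can.
float elapsedSeconds(Clock::time_point start, Clock::time_point end) {
    return std::chrono::duration<float>(end - start).count();
}
```

Sample the clock once per frame, feed the resulting dt into movement, and the simulation scales correctly regardless of frame rate.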

The game and view are much smoother now; thanks to everyone who helped.

This topic is closed to new replies.
