Ookami

2D camera - Smoothly sliding to target regardless of speed


Hello, I wasn't sure which forum to put this in, but since it involves more math/physics than gameplay, I stuck it in here. I've been throwing together a 2D camera class. The basic features were extremely simple to implement, but now I'm working on more advanced behavior: smoothly interpolating toward the target's position. There are three conditions I want:

1. Define a radius around the center of the camera in which the target can move freely. While the target is not resting at the center, the camera smoothly interpolates toward the target's position.
2. The camera must not let the target go beyond this radius.
3. The camera must interpolate at the same speed regardless of the target's speed.

I've tried several different approaches, and each had its own set of issues.

First, I took the delta between the camera's position and the target's position and divided it by a constant before adding it to the camera's position. This gives a smooth scroll, but if the target's speed is set extremely high, it shoots off the edge of the screen before the camera can catch up, and at low speeds the target is almost always locked to the center of the screen.

Second, I tried a more physics-based approach, using the equation a = (2 * (d - v * t)) / t^2, where d is the delta between camera and target, v is the camera's current velocity, and t is the time it should take to close that distance. The first problem is that because the equation is re-evaluated every frame with the same t, the camera always predicts it needs t more seconds to arrive, so it never actually converges on schedule. The second problem was that the camera occasionally started to jitter wildly back and forth.

So my question is: does anyone have suggestions for what to try next? I hope I've explained the problem well enough for people to help.
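For reference, here are minimal sketches of the two approaches I described, assuming a per-frame update with delta time dt (all names and constants are illustrative, not my actual code):

    #include <cmath>

    struct Vec2 { float x = 0, y = 0; };

    Vec2 cameraPos;
    Vec2 cameraVel;

    // Approach 1: close a fixed fraction of the gap each frame.
    // Smooth, but a fast target outruns it and a slow target
    // stays pinned to the center of the screen.
    void UpdateCameraLerp(const Vec2& target)
    {
        const float k = 0.1f; // fraction of the delta closed per frame
        cameraPos.x += (target.x - cameraPos.x) * k;
        cameraPos.y += (target.y - cameraPos.y) * k;
    }

    // Approach 2: solve a = (2 * (d - v*t)) / t^2 for a desired
    // arrival time t. Because d, v, and t are re-fed every frame,
    // the predicted arrival keeps sliding forward, and the sign
    // flips near the target can make the camera jitter.
    void UpdateCameraKinematic(const Vec2& target, float dt)
    {
        const float t = 0.5f; // desired seconds to reach the target
        float ax = (2.0f * ((target.x - cameraPos.x) - cameraVel.x * t)) / (t * t);
        float ay = (2.0f * ((target.y - cameraPos.y) - cameraVel.y * t)) / (t * t);
        cameraVel.x += ax * dt;
        cameraVel.y += ay * dt;
        cameraPos.x += cameraVel.x * dt;
        cameraPos.y += cameraVel.y * dt;
    }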

Call the vector from the center of the camera to the player R, with a maximum allowed length of D.

I would say the camera should move at whichever speed is greater: |R| or |R|/D * PlayerSpeed.

That way, at |R| = D the camera moves at the player's speed, and when the player stops, the camera moves at speed |R|.

Edit:
In response to your question below, the algorithm stated above is actually:
CameraSpeed = Max(R, R/D*PlayerSpeed);

If the player's speed goes to zero, R is the greater term, so the camera keeps moving until it is centered on the player. It also arrives smoothly, because |R|, and therefore the camera's speed, decreases as it closes in.
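A minimal sketch of that rule, assuming a fixed-timestep update (identifiers and the D value are illustrative, not from this thread):

    #include <algorithm>
    #include <cmath>

    struct Vec2 { float x, y; };

    void UpdateCamera(Vec2& cameraPos, const Vec2& playerPos,
                      float playerSpeed, float dt)
    {
        const float D = 100.0f; // freeplay radius: max allowed |R|

        float rx = playerPos.x - cameraPos.x;
        float ry = playerPos.y - cameraPos.y;
        float r  = std::sqrt(rx * rx + ry * ry); // |R|
        if (r < 1e-6f) return; // already centered on the player

        // CameraSpeed = Max(R, R/D * PlayerSpeed): when the player
        // stops, speed falls back to r, which shrinks toward zero.
        float cameraSpeed = std::max(r, r / D * playerSpeed);

        // Step along R without overshooting the player.
        float step = std::min(cameraSpeed * dt, r);
        cameraPos.x += rx / r * step;
        cameraPos.y += ry / r * step;
    }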

[Edited by - bzroom on November 15, 2009 10:34:24 PM]

There are two major problems I see with that algorithm. The first is that the camera doesn't move at a consistent speed when the target's speed changes. The second is that while the target is outside the freeplay radius, the camera's speed depends on the target's speed, which is unreliable. For example, with R/D*PlayerSpeed, if the player somehow reaches velocity 0 just after leaving the camera's freeplay area, the camera simply gets stuck. I believe the solution I am looking for will not need to know the player's speed at all.
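To make the stuck case concrete (a hypothetical snippet, showing the rule without the Max):

    // With camera speed purely proportional to player speed, a player
    // that stops just outside the freeplay radius freezes the camera:
    float CameraSpeedProportional(float r, float D, float playerSpeed)
    {
        return r / D * playerSpeed; // playerSpeed == 0 -> speed == 0, camera stuck
    }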

Thanks for taking the time to reply, but I'm still looking for assistance with this problem.

