Alright, in our game whenever the player jumps we calculate when they should start falling based on how long they have been jumping. This causes issues when the game is running at the fastest setting, since deltaTime is added more frequently. I tried normalizing the vector returned by multiplying by deltaTime, but that just made the jump return a very tiny movement vector every time. So, how could I go about normalizing for deltaTime?
Here's our code:
private Vector3 OnPress()
{
    jumpTimer = 0f;
    jumpVelocity = new Vector3(0f, 11.5f, 0f);
    return jumpVelocity;
}

private Vector3 Rising()
{
    jumpTimer += Time.deltaTime;
    jumpVelocity = (gravity * jumpTimer) + jumpVelocity;
    return jumpVelocity;
}

private Vector3 Dropping()
{
    jumpTimer += Time.deltaTime;
    jumpVelocity = gravity * jumpTimer;
    return jumpVelocity;
}

private Vector3 Landing()
{
    jumpVelocity = Vector3.zero;
    jumpTimer = 0f;
    return jumpVelocity;
}
I believe the issue is within Dropping and Rising. Thanks.
Rewriting your current code, it's pretty much this:
onJump:
    acceleration = 0;
    velocity = (0, 11.5, 0);
rising:
    acceleration += deltaTime * gravity;
    velocity += acceleration; // n.b. deltaTime not used here!
dropping:
    acceleration += deltaTime * gravity;
    velocity = acceleration; // n.b. = used, not += ???
onLand:
    velocity = (0, 0, 0);
    acceleration = 0;
...which shows that your equations of motion are wrong. Velocity is updated without any reference to delta time, which is why it varies greatly with framerate.
Also, I'm not sure why rising and dropping need different update functions. You should be able to just use something like this:
onJump:
    acceleration = 0;
    velocity = (0, 11.5, 0);
rising:
dropping:
    acceleration += deltaTime * gravity;
    velocity += deltaTime * acceleration;
onLand:
    velocity = (0, 0, 0);
    acceleration = 0;
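Translated back into your own C#, that combined version would look something like this (a sketch, assuming `gravity` is a downward `Vector3` and that you add a new `jumpAcceleration` field; `Airborne()` is a hypothetical name replacing both `Rising()` and `Dropping()`):

```csharp
private Vector3 OnPress()
{
    jumpAcceleration = Vector3.zero;
    jumpVelocity = new Vector3(0f, 11.5f, 0f);
    return jumpVelocity;
}

// One method replaces both Rising() and Dropping().
private Vector3 Airborne()
{
    jumpAcceleration += gravity * Time.deltaTime;
    jumpVelocity += jumpAcceleration * Time.deltaTime; // deltaTime applied in BOTH updates
    return jumpVelocity;
}

private Vector3 Landing()
{
    jumpVelocity = Vector3.zero;
    jumpAcceleration = Vector3.zero;
    return jumpVelocity;
}
```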
Or an alternate version:
onJump:
    onGround = false;
    impulse = (0, 11.5, 0);
onLand:
    onGround = true;
    acceleration = 0; // reset, so velocity stops changing while grounded
    impulse = (0, -velocity.y, 0);
update:
    if( !onGround )
        acceleration += gravity * deltaTime;
    velocity += acceleration * deltaTime + impulse;
    impulse = (0, 0, 0);
^^Both of those versions update velocity using deltaTime, so they should be more stable. However, they will still give slightly different results at different frame-rates, because they are an approximate numerical integration of the motion curve.
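For reference, the impulse variant might look like this as Unity-style C# (a sketch, assuming these are fields on your controller class, `gravity` is a downward `Vector3`, and the event method names are hypothetical):

```csharp
private bool onGround = true;
private Vector3 acceleration, velocity, impulse;

private void OnJump()
{
    onGround = false;
    impulse = new Vector3(0f, 11.5f, 0f);
}

private void OnLand()
{
    onGround = true;
    acceleration = Vector3.zero;                 // stop integrating gravity
    impulse = new Vector3(0f, -velocity.y, 0f);  // cancel remaining fall speed
}

private void Update()
{
    if (!onGround)
        acceleration += gravity * Time.deltaTime;
    // Impulses are applied once, NOT scaled by deltaTime.
    velocity += acceleration * Time.deltaTime + impulse;
    impulse = Vector3.zero;
}
```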
To solve that, you can either use a fixed timestep (as mentioned by everyone above), or you can use a version that is based on absolute time values, instead of delta time values, which makes it perfectly deterministic and always works the same regardless of framerate:
onJump:
    onGround = false;
    initialJumpHeight = height;
    initialJumpVelocity = 11.5;
    timeAtJump = timeNow;
onLand:
    onGround = true;
update:
    if( !onGround ) {
        timeSinceJump = timeNow - timeAtJump;
        // motion under constant acceleration: o' = o + ut + att/2
        // (o' = new pos, o = initial pos, u = initial velocity,
        //  t = time since initial conditions, a = acceleration)
        height = initialJumpHeight + initialJumpVelocity*timeSinceJump + 0.5*gravity*timeSinceJump*timeSinceJump;
    }
    return height;
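In Unity that absolute-time version could be sketched with `Time.time` as the clock (assumptions: `gravity` here is a negative scalar such as -9.81f, `height` is the player's vertical position, and the method names are hypothetical):

```csharp
private bool onGround = true;
private float height, initialJumpHeight, timeAtJump;
private const float initialJumpVelocity = 11.5f;

private void OnJump()
{
    onGround = false;
    initialJumpHeight = height;
    timeAtJump = Time.time; // absolute time of the jump, not a delta
}

private void OnLand()
{
    onGround = true;
}

private float UpdateHeight()
{
    if (!onGround)
    {
        float t = Time.time - timeAtJump;
        // Closed-form motion under constant acceleration: h = h0 + u*t + a*t*t/2.
        // Same result every frame-rate, since no per-frame accumulation happens.
        height = initialJumpHeight + initialJumpVelocity * t + 0.5f * gravity * t * t;
    }
    return height;
}
```

Because the height is computed directly from the elapsed time rather than accumulated frame by frame, no integration error can build up, which is what makes it deterministic.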