Can't solve interpolation problem!

4 comments, last by Crayz92 5 years, 1 month ago

My interpolation renders x frames in the past; when a packet comes in, I buffer up the state with the current local time.  I think my issue is caused by network jitter: when packets come in late or early, the current time isn't a good enough timestamp.  I implemented a queue for packets which arrive early or at the same time.  With a fixed send rate (30 times per second), I assume clients have a stable enough connection that if a packet comes in early, I can dequeue it 1/30th of a second after the last packet came in.  I'm still getting visual jitters from what I believe are packets coming in late.

I'm trying to figure out how to assign interpolation times to reduce visual jitters as much as possible.  Been struggling with this for a while, and thought it was time to ask for help.  The attached video shows these jitters at the very beginning.

Entity update; this executes when a packet comes in, and the interval can vary:


protected override void OnEntityUpdated(NetEntity entity, double remoteTime)
{
	// Snapshot the entity's state, stamped with both the remote send time
	// and the local receive time.
	var nextFrame = new EntityFrame()
	{
		PacketTime = remoteTime,
		CurTime = Time.time,
		Position = entity.Origin,
		Angles = entity.Angles,
		Velocity = entity.Velocity
	};

	// Add the frame to this entity's interpolation history.
	_history[unlag].TryAddFrame(nextFrame);
}

TryAddFrame:


public void TryAddFrame(EntityFrame frame)
{
	// Compare the local arrival time against the newest buffered frame.
	var diff = frame.CurTime - _frames[0].CurTime;
	var updateRate = 1f / FSLog.GetVariable<int>("net.entityrate");
	if (diff < updateRate)
	{
		// Packet arrived early: delay insertion so frames stay evenly spaced.
		_insertDelay = updateRate - (float)diff;
	}
	else if (diff > updateRate * 1.25f)
	{
		// Packet arrived noticeably late.
		Debug.Log("JITTER");
	}

	// If we're still waiting out a delay, or frames are already queued,
	// queue this frame too so ordering is preserved.
	if (_insertDelay > 0
		|| _jitterBuffer.Count > 0)
	{
		_jitterBuffer.Enqueue(frame);
		return;
	}

	AddFrame(frame);
}
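
For reference, the dequeue side described above (releasing a queued frame one update interval after the previous one) isn't shown; below is a minimal sketch of that, assuming it runs once per rendered frame and reuses _insertDelay, _jitterBuffer and AddFrame (the method name and deltaTime parameter are placeholders):

public void UpdateJitterBuffer(float deltaTime)
{
	if (_insertDelay > 0)
	{
		// Still waiting out the spacing delay from the last early packet.
		_insertDelay -= deltaTime;
		return;
	}

	if (_jitterBuffer.Count > 0)
	{
		// Release one queued frame, then hold the next one for a full
		// update interval (1/30 s at the default entity rate).
		AddFrame(_jitterBuffer.Dequeue());
		_insertDelay = 1f / FSLog.GetVariable<int>("net.entityrate");
	}
}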

Interpolation loop, executed each frame:


// Render this many update intervals in the past.
double latency = frameBuffer * updateRate;
var renderTimestamp = Time.time - latency;

if (interp[0].CurTime > renderTimestamp)
{
	// Find the newest frame that is older than the render timestamp and
	// interpolate from it toward the frame just after it.
	for (int i = 0; i < interp.FrameCount; i++)
	{
		if (interp[i].CurTime <= renderTimestamp)
		{
			var toFrame = interp[Mathf.Max(i - 1, 0)];
			var fromFrame = interp[i];
			var t = (renderTimestamp - fromFrame.CurTime) / (toFrame.CurTime - fromFrame.CurTime);
			var lerpedPos = Vector3.Lerp(fromFrame.Position, toFrame.Position, (float)t);
			var lerpedAngle = Quaternion.Lerp(Quaternion.Euler(fromFrame.Angles), Quaternion.Euler(toFrame.Angles), (float)t);
			unlag.Model.Set(lerpedPos, lerpedAngle.eulerAngles);
			break;
		}
	}
}
else
{
	// No frame newer than the render timestamp: the buffer has run dry.
	// extrapolate?
}
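
The else branch could be filled in with something like the sketch below, since EntityFrame already carries Velocity: project the newest frame forward, capped at an arbitrary 0.25 s so a long gap can't fling the model far off.

// Sketch only: simple velocity-based extrapolation for the else branch.
var newest = interp[0];
var extrapTime = Mathf.Min((float)(renderTimestamp - newest.CurTime), 0.25f);
unlag.Model.Set(newest.Position + newest.Velocity * extrapTime, newest.Angles);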

 

(Attached video: 2019-03-24_01-03-20.mp4)

Do you pay attention to "when" the remote entity update said it was for? (you should)

Do you measure game time in "ticks since start" or "milliseconds/seconds" ? (you should ideally use ticks, with a fixed time step)

Is your code structured so you can just run the interpolation code in its own unit tests, and send it updates with various "remote time" and "current time" and "position" values, so you can study what it does in different situations? (it should be!)
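
For example, here is a minimal sketch (the class and method names are made up) of an interpolator that takes the clock as a parameter instead of reading Time.time, so a test can drive it directly:

// Sketch: an interpolator with no hidden engine time; the caller supplies
// the clock, so tests can feed arbitrary frames and timestamps.
public class TestableInterpolator
{
	private readonly List<EntityFrame> _frames = new List<EntityFrame>();

	// Frames are kept newest-first, like interp[] above.
	public void AddFrame(EntityFrame frame) => _frames.Insert(0, frame);

	public Vector3 Sample(double currentTime, double latency)
	{
		var renderTimestamp = currentTime - latency;
		for (int i = 1; i < _frames.Count; i++)
		{
			if (_frames[i].CurTime <= renderTimestamp)
			{
				var from = _frames[i];
				var to = _frames[i - 1];
				var t = (renderTimestamp - from.CurTime) / (to.CurTime - from.CurTime);
				return Vector3.Lerp(from.Position, to.Position, (float)t);
			}
		}
		// Nothing old enough to interpolate from; hold the oldest known position.
		return _frames.Count > 0 ? _frames[_frames.Count - 1].Position : Vector3.zero;
	}
}

A test can then AddFrame a few hand-built EntityFrames and assert on what Sample returns for different times and latencies.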

enum Bool { True, False, FileNotFound };
3 hours ago, hplus0603 said:

Do you pay attention to "when" the remote entity update said it was for? (you should)

Do you measure game time in "ticks since start" or "milliseconds/seconds" ? (you should ideally use ticks, with a fixed time step)

Is your code structured so you can just run the interpolation code in its own unit tests, and send it updates with various "remote time" and "current time" and "position" values, so you can study what it does in different situations? (it should be!)

Nope, milliseconds/seconds, and nope :(

I made some changes though.  I updated my entity packet to include the server's tick at which the entity was sent, and synced ticks on the client and server with an offset.

Now what I'm trying is this: when an entity update comes in, I subtract the offset from the remote tick for that entity to convert it to a local tick.  My interpolation latency is a fixed amount (0.2 s) divided by the fixed timestep interval (0.01 s), so the render time is 20 ticks in the past.  Now I need to calculate T for interpolation.  Since going off whole ticks would give large steps between each frame, and packets come in at unreliable intervals, my only idea is to store a float for each entity which increments per frame and gets reset to 0 when the "to frame" changes.  The problem I'm having now is keeping the tick offset in sync without affecting which frame the interpolator is going for.  Maybe I should set a constant offset for each entity, or something?
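
Roughly what I have in mind, as a sketch only (the Tick field on EntityFrame, localTick, lastFixedStepTime and tickInterval are placeholder names for values my fixed-step loop would provide):

// Sketch: tick-based render time with a fractional part taken from how far
// the client is into the current fixed step, instead of a per-frame counter.
double GetRenderTick(int localTick, double lastFixedStepTime, double tickInterval)
{
	const int interpDelayTicks = 20; // 0.2 s of delay / 0.01 s per tick
	double tickFraction = (Time.time - lastFixedStepTime) / tickInterval;
	return (localTick + tickFraction) - interpDelayTicks;
}

// Interpolation factor between two buffered frames whose ticks have already
// been converted from remote to local via the synced offset.
double GetLerpFactor(double renderTick, EntityFrame fromFrame, EntityFrame toFrame)
{
	return (renderTick - fromFrame.Tick) / (double)(toFrame.Tick - fromFrame.Tick);
}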

Could this be a valid solution?

Assuming that the server uses the same clock for all entities it sends, you want the same offset for all entities you've seen from the server. This offset may need to be occasionally adjusted, for changing conditions, but it should be OK to "set and forget it" most of the time.
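
For example, a minimal sketch of a single shared offset (the field and method names are placeholders), set once and then only nudged slowly:

// One clock offset for the whole connection: set it from the first packet,
// then drift it gently toward new samples.
private double _clockOffset;
private bool _hasOffset;

void OnServerTime(double remoteTime)
{
	double sample = remoteTime - Time.time;
	if (!_hasOffset)
	{
		_clockOffset = sample;
		_hasOffset = true;
	}
	else
	{
		// Small correction factor so changing conditions are tracked without
		// snapping the interpolation timeline around.
		_clockOffset += (sample - _clockOffset) * 0.05;
	}
}

// Convert a packet's remote timestamp onto the local timeline.
double ToLocalTime(double remoteTime) => remoteTime - _clockOffset;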

enum Bool { True, False, FileNotFound };

Well, you led me down the right path; I think I solved it.  I was using Lidgren's timer and clock sync, which produced jittery results, and because I assumed Lidgren was correct and I was wrong, I started making things harder than they had to be.  After experimenting with syncing by tick and keeping track of my own offset, I thought I'd give syncing by time one more shot, but with my own clock sync, and the results so far are good.  When a packet comes in, I set its CurTime to remoteTime - Offset, and I completely removed the jitter buffer.

Not sure what's up with Lidgren's time; maybe it updates at too slow an interval.

This topic is closed to new replies.
