
Angus Hollands

Member Since 06 Apr 2012

Topics I've Started

Car transmission model

16 September 2015 - 04:22 PM

Hi everyone, 


I'm reviewing a car transmission model for a basic racing game, and I've noticed a slight discrepancy between what I would expect of the macroscopic properties of the model (when changing gear) and the numerical results.


Here is the basic model:

from math import pi
import numpy as np

class Car:

	gearRatios = []
	maxTorqueArray = []
	rpmAtMaxTorque = []

	wheelRadius = 0.0
	differentialRatio = 0.0
	Cd = 0.0
	Crr = 0.0
	transmissionEfficiency = 0.0
	mass = 0

	def __init__(self):
		self.speed = 0.0
		self.acceleration = 0.0
		self.currentGear = 1

	def update(self, dt):
		throttle = 1.0

		# Gear ratios
		assert self.currentGear > 0
		currentGearRatio = self.gearRatios[self.currentGear - 1]
		effectiveGearRatio = self.differentialRatio * currentGearRatio
		transmission = effectiveGearRatio * self.transmissionEfficiency

		wheelCircumference = 2 * pi * self.wheelRadius

		# Find RPM
		wheelRPM = (self.speed / wheelCircumference) * 60
		engineRPM = wheelRPM * transmission
		engineRPM = min(max(engineRPM, self.rpmAtMaxTorque[0]), self.rpmAtMaxTorque[-1])

		# Find torque at RPM
		maxTorque = np.interp(engineRPM, self.rpmAtMaxTorque, self.maxTorqueArray)
		engineTorque = maxTorque * throttle
		wheelTorque = engineTorque * transmission
		tractionForce = wheelTorque / self.wheelRadius

		# Resolve forces
		dragForce = -self.Cd * self.speed * abs(self.speed)
		# Rolling resistance approximated as Crr ~= 30 * Cd
		# (note: the declared self.Crr attribute is not used here)
		rollingResistanceForce = -30 * self.Cd * self.speed

		# Resultant
		resultantForce = tractionForce + dragForce + rollingResistanceForce

		self.acceleration = resultantForce / self.mass
		self.speed += self.acceleration * dt

		self.currentForce = resultantForce
		self.currentRPM = engineRPM
		self.currentSpeed = self.speed
		self.currentTorque = wheelTorque

class BoxsterS2004(Car):

	gearRatios = [3.82, 2.20, 1.52, 1.22, 1.02, 0.84]
	maxTorqueArray = [220, 220, 310, 220]
	rpmAtMaxTorque = [0, 1000, 4500, 6600]

	wheelRadius = 0.345
	differentialRatio = 3.42
	Cd = 0.32
	transmissionEfficiency = 0.7
	mass = 1300

if __name__ == "__main__":
	car = BoxsterS2004()
	dt = 1 / 60
	while True:
		car.update(dt)

Now, in its actual implementation, I run the update function in a fixed-step game loop, which lets me inspect the car state and shift gear at runtime.


Now, the behaviour:

I would expect the driving force of the car to drop with speed for the same power.

It is also to be expected that the car cannot reach as high a top speed in the lower gears.


Yet, for these parameters, the top speed can only be reached in 1st gear, and the car decelerates as the gears are shifted higher.

I'm not sure why this happens. It makes sense that the acceleration is lower at higher speeds, because the drag force is proportional to v^2, and the driving force at the wheels falls with higher gears; but my resultant force is negative in the higher gears at speeds where it is still positive in the lower gears. Shifting up at any point reduces the maximum achievable speed.
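To make this concrete, here is a quick standalone check I put together (my own sketch, reusing the BoxsterS2004 numbers and the same Crr ~= 30 * Cd rolling-resistance approximation from the code above) that compares traction force against total resistance in each gear at a fixed speed:

from math import pi
import numpy as np

gearRatios = [3.82, 2.20, 1.52, 1.22, 1.02, 0.84]
maxTorqueArray = [220, 220, 310, 220]
rpmAtMaxTorque = [0, 1000, 4500, 6600]
wheelRadius = 0.345
differentialRatio = 3.42
Cd = 0.32
transmissionEfficiency = 0.7

speed = 60.0  # m/s, a sample speed
wheelRPM = speed / (2 * pi * wheelRadius) * 60
resistance = Cd * speed * abs(speed) + 30 * Cd * speed

for gear, ratio in enumerate(gearRatios, start=1):
	transmission = differentialRatio * ratio * transmissionEfficiency
	engineRPM = min(max(wheelRPM * transmission, rpmAtMaxTorque[0]), rpmAtMaxTorque[-1])
	torque = np.interp(engineRPM, rpmAtMaxTorque, maxTorqueArray)
	traction = torque * transmission / wheelRadius
	print(f"gear {gear}: {engineRPM:6.0f} RPM, traction {traction:6.0f} N vs resistance {resistance:6.0f} N")

At 60 m/s this shows 1st gear still producing roughly 5800 N of traction against roughly 1700 N of resistance, because the min/max clamp pins the engine at 6600 RPM and keeps supplying full torque, while the higher gears produce less traction than the resistance. So it looks like the clamp acts only as a bound on the torque-curve lookup and never as an actual rev limiter on road speed, which may be why 1st gear "wins".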


Evidently I don't quite understand the model well enough; could anyone point me in the right direction?

To my mind, given that the driving force of the car falls off at higher engine speeds, shifting up should raise the top speed, not lower it.

Concern over input jitter

11 May 2014 - 09:17 AM

Hi everyone. Recently I moved over to a sort of dejitter buffer, which has been working wonderfully in LAN play tests; my previous implementation wasn't so happy there due to slight timing problems.

consume_move = self.buffer.popleft

# When we run out of moves, wait until we have enough
buffer_length = len(self.buffer)

# Ensure we don't overfill due to network jitter
buffer_limit = self.buffer_length + self.buffer_padding

# If we have 0 moves in the buffer, start refilling
if not buffer_length:
	print("Waiting for enough inputs ...")
	self.buffer_filling = True

# Prevent too many items filling the buffer
# Otherwise we slowly drift into the past, which causes long-term issues
elif buffer_length > buffer_limit:
	print("Received too many inputs, dropping ...")
	for _ in range(buffer_length - self.buffer_length):
		consume_move()

# Clear the buffer-filling status when we have enough moves again
if self.buffer_filling:
	if len(self.buffer) < self.buffer_length:
		return

	self.buffer_filling = False

	# New debug feature, needs clamping at a safe maximum
	# Ping is the RTT, and we add 1 to ensure the time is at least 1 tick
	self.buffer_length += WorldInfo.to_ticks(self.info.ping / 2) + 1

try:
	buffered_move = self.buffer[0]

except IndexError:
	print("Ran out of moves!")
	return

move_tick = buffered_move.tick

# Run the move at the present time (it's valid)
The above is my update code in the main update function for a player (controller).


When running over the internet (using my NAT's external IP and connecting from the same machine), it isn't happy with 100 ms of dejittering, and when I added the simple increment in the "if self.buffer_filling" branch, it seems happy at around 13–16 ticks, which is around 400 ms. Surely that can't be reasonable?


This seems far too high for a) my internet connection and b) most internet connections. I have some reason to suspect my provider, as they're not the best in my opinion, but it seems unusual that so many packets are delayed, given that each is sent individually.
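An alternative I'm considering is sizing the buffer from measured inter-arrival jitter rather than from ping; a minimal sketch (the 60 Hz tick rate and all names here are my placeholders, not from the codebase above):

import time
from collections import deque

TICK_RATE = 60  # Hz, assumed

class JitterEstimator:
	"""Track inter-arrival times of moves and suggest a buffer size."""

	def __init__(self, window=128):
		self.arrivals = deque(maxlen=window)
		self.last_time = None

	def on_move_received(self):
		now = time.monotonic()
		if self.last_time is not None:
			self.arrivals.append(now - self.last_time)
		self.last_time = now

	def suggested_buffer_ticks(self):
		if not self.arrivals:
			return 1
		mean = sum(self.arrivals) / len(self.arrivals)
		# Jitter as mean absolute deviation from the mean inter-arrival time
		jitter = sum(abs(dt - mean) for dt in self.arrivals) / len(self.arrivals)
		# Buffer enough ticks to absorb ~2x the measured jitter, at least 1 tick
		return max(1, round(2 * jitter * TICK_RATE) + 1)

That way the buffer only grows when arrivals are actually erratic, instead of growing with round-trip time.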

I printed out the number of items in the buffer each tick and it would read something like:

No moves!

Also, I do seem to notice that every so often a packet is dropped. What would be an expected packet-loss rate for a 4 Mb/s, 60 ms latency internet connection in the UK?


I'm trying to determine if it is some deeper level network code issue (in my codebase) or just life.

Network Tick rates

18 January 2014 - 03:00 PM

I have a few questions regarding the principle of operating a network tick rate.
Firstly, I do fix the tick rate on clients. However, I cannot guarantee that, if the animation code or other system code takes too long, the client will actually run at that frame rate.
To calculate the network tick, I would simply use "(self.elapsed - self.clock_correction) * self.simulation_speed", assuming the correction is the server time plus the downstream latency.

However, if the client runs at 42 fps and the simulation speed is 60 fps (or the server runs at 42 and the simulation at 60), I will eventually calculate the same frame several times in a row if I round the result of that equation. (This was an assumption; it seems unlikely in practice, but it will still occur whenever I correct the clock.) How should one handle this?
Furthermore, should the simulation speed be the same as the server's fixed tick rate, for simplicity?
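For the duplicate-tick problem, the pattern I'm considering (a sketch with assumed names, not yet in my codebase) is to drive the simulation from a time accumulator instead of rounding a clock, so each tick is consumed exactly once however the frame rate fluctuates:

SIMULATION_RATE = 60  # ticks per second
TICK_DT = 1 / SIMULATION_RATE

class TickDriver:
	"""Advance a fixed-rate simulation from a variable frame rate."""

	def __init__(self):
		self.accumulator = 0.0
		self.current_tick = 0

	def update(self, frame_dt, simulate):
		# Bank the frame time, then consume it in whole fixed ticks
		self.accumulator += frame_dt
		while self.accumulator >= TICK_DT:
			self.accumulator -= TICK_DT
			simulate(self.current_tick)
			self.current_tick += 1

At 42 fps this simply runs the simulate callback twice on some frames rather than recomputing the same tick.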


One last question:

If I send an RPC call (or simply the input state, as it would be) to the server every client tick (which I guarantee runs at a tick rate lower than or equal to the server's), then I believe I should simply maintain an internal "current tick" on the server entity, and every time the game tick equals (last_run_tick + (latest_packet_tick - last_packet_tick)) on the server, pop the latest packet and apply the inputs. That way, if the client runs at 30 fps and the server at 60, the server would apply the inputs every 2nd frame.

However, if the client's packet arrives late for whatever reason, what is the best approach? Should I introduce an artificial buffer on the server? Or should I perform a rewind (which is undesirable for me as I am using Bullet Physics, so I would have to "step physics, rewind other entities", and hence any collisions between rigid bodies and client entities would be cleared)? If I do not handle this and use the aforementioned model, I will eventually build up an accumulation of states, and the server will drift behind the client.
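To illustrate the artificial-buffer option (a sketch with placeholder names, not my actual code): the server holds a short per-client move queue, waits for a small cushion, then consumes one move per tick, repeating the last inputs when a packet is late:

from collections import deque

class ServerInputBuffer:
	"""Per-client move queue; waits for a small cushion, then pops one per tick."""

	def __init__(self, padding_ticks=2):
		self.pending = deque()  # (tick, inputs), in tick order
		self.padding_ticks = padding_ticks
		self.started = False
		self.last_inputs = None

	def push(self, tick, inputs):
		self.pending.append((tick, inputs))

	def pop(self):
		# Wait until a small cushion of moves has built up before consuming
		if not self.started:
			if len(self.pending) <= self.padding_ticks:
				return self.last_inputs
			self.started = True
		if self.pending:
			_, self.last_inputs = self.pending.popleft()
		# A late packet means we repeat the last known inputs this tick
		return self.last_inputs

The cushion trades a couple of ticks of extra latency for never having to rewind, which is the trade-off I'm weighing against Bullet's lack of cheap rollback.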
Regards, Angus

Algorithms / techniques to determine available bandwidth

10 December 2013 - 03:33 PM

Hi everyone,

As the question states, I was wondering how best to determine the available bandwidth of a connection. I intend to determine the update frequency of a networked entity based upon its network priority and the available bandwidth.
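For concreteness, here is a rough sketch of the kind of scheduler I have in mind (all names and numbers are placeholders; the bandwidth estimate itself is the part I'm asking about):

class ReplicationScheduler:
	"""Split an estimated bandwidth budget across entities by priority."""

	def __init__(self, bandwidth_bps):
		self.bandwidth_bps = bandwidth_bps  # estimated available bandwidth

	def updates_per_second(self, entity_priority, total_priority, update_size_bits):
		# Each entity gets a share of the budget proportional to its priority
		share = self.bandwidth_bps * (entity_priority / total_priority)
		return share / update_size_bits

scheduler = ReplicationScheduler(bandwidth_bps=256_000)
# e.g. a priority-2 entity out of a total priority of 10, with 400-bit updates
print(scheduler.updates_per_second(2, 10, 400))  # -> 128.0 updates/s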

Any ideas?



Where to render object positions and animations

15 May 2013 - 09:01 AM

Hey everyone,


I'm a little indecisive when choosing how to render the gamestate on the client. 

I can extrapolate the other clients forward to where they would be on the server, but then things like animations would need to be fast-forwarded to compensate. As a result, an animation would "jump" to a later frame; if the latency were half a second and the animation ran at 60 frames per second, 30 frames would be skipped.
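Roughly what I mean (a sketch; the entity fields and animation handling are placeholders for my actual API):

def extrapolate_remote_entity(entity, downstream_latency, anim_fps=60):
	"""Project a remote entity to its estimated present state on the server."""
	# Dead-reckon the position forward by the downstream latency
	entity.position += entity.velocity * downstream_latency
	# Fast-forward the animation by the same interval, skipping frames
	entity.animation_frame += downstream_latency * anim_fps  # 0.5 s * 60 fps = 30 frames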


How should I deal with the timing issues?

  1. Extrapolate forwards by the downstream latency to get the current position of the object
  2. Render at time of receipt but still run client itself ahead using prediction

Forgetting about the jitter buffer, I'm concerned about hit detection: the first option could be off by some margin, while the second would technically be incorrect, as the player would be aiming at other clients' old positions (see the rewind sketch below). Any thoughts?
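For option 2, the approach I've seen described is server-side rewind for hit detection; a minimal sketch of my understanding of it (WorldInfo.to_ticks is from my own codebase, the rest are placeholder names):

from collections import deque

class PositionHistory:
	"""Ring buffer of (tick, position) samples for server-side rewind."""

	def __init__(self, max_ticks=120):
		self.samples = deque(maxlen=max_ticks)

	def record(self, tick, position):
		self.samples.append((tick, position))

	def position_at(self, tick):
		# Return the most recent sample at or before the requested tick
		for sample_tick, position in reversed(self.samples):
			if sample_tick <= tick:
				return position
		return None

# On a shot, test the hit against where targets were when the shooter fired:
# target_pos = history.position_at(server_tick - WorldInfo.to_ticks(shooter_latency))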