    T = max_rotational_velocity_konst * StickInput[-1..1] - w * C

C is the drag constant; it is calculated from the 'max_rotational_velocity_konst' constant and the 'time_to_reach_max_vel' constant, so that the designers can model a plane that reaches a certain angular velocity in a certain time (if the stick is held at maximum for that long). Next, the angular velocity is the integral of the torque:

    w = (T / (inv_mass * inertia)) * dt;

And finally, the angle of rotation:

    alpha = w * dt;

Unfortunately, several stick movements need to be input each frame (so that behaviors can be tweened: ground avoidance, for one, should increase in priority as the plane gets closer to the ground), and I can't change this. My AI therefore inputs stick commands to roll the plane toward the desired orientation by measuring the angle between its current rotation and the desired rotation: I take the cross product of the two 'Up' vectors, which gives me the sine of that angle, and I use this value as the StickInput, so the input reaches 0 when the plane is at the right angle. Thus:

    T = max_rot_speed * sin(alpha) - w * C
    w = integral(T)
    alpha = integral(w)

What I want is to find the angle as a function of time, in order to determine how long it will take the aircraft to achieve a certain rotation. I've already found a way to do it for a constant force from here, but I don't know how to use it for a varying force. Any ideas?
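For reference, here's a minimal, self-contained sketch of both models above (Euler integration at a fixed small step; the constants are made-up placeholders, not my real tuning values). The first function checks the open-loop model, where holding the stick at 1 makes w approach max_rotational_velocity_konst / C. The second steps the sin(alpha) controller, reading alpha as the remaining error angle (so the plane's rotation shrinks it), and records when the error first drops below a tolerance; that time is the quantity I'd like to get analytically instead of by simulating:

```python
import math

# Assumed, illustrative constants (real values are design-tuned).
MAX_ROT_VEL = 2.0   # 'max_rotational_velocity_konst' / 'max_rot_speed', rad/s
C = 1.0             # drag constant
INV_MASS = 1.0
INERTIA = 1.0
DT = 1.0 / 600.0    # small fixed step for the offline check

def open_loop_velocity(stick, t_end):
    """First model: stick held constant; returns w at time t_end."""
    w = 0.0
    for _ in range(int(t_end / DT)):
        T = MAX_ROT_VEL * stick - w * C
        w += (T / (INV_MASS * INERTIA)) * DT   # integrate torque
    return w

def time_to_align(alpha0, tol=1e-2, t_max=60.0):
    """Second model: StickInput = sin(alpha), alpha = remaining error angle.
    Returns the time at which |alpha| first falls below tol, or None."""
    alpha, w, t = alpha0, 0.0, 0.0
    while abs(alpha) > tol:
        if t >= t_max:
            return None                        # never got there in time
        T = MAX_ROT_VEL * math.sin(alpha) - w * C
        w += (T / (INV_MASS * INERTIA)) * DT   # integrate torque
        alpha -= w * DT                        # rotating reduces the error
        t += DT
    return t

print(open_loop_velocity(1.0, 10.0))   # approaches MAX_ROT_VEL / C
print(time_to_align(math.radians(90)))
```

(With this sign convention the second model is a damped-pendulum-type equation, alpha'' + (C/I)*alpha' + (k/I)*sin(alpha) = 0; for small alpha, sin(alpha) ≈ alpha gives a damped harmonic oscillator with a closed-form alpha(t), but it's the full sin(alpha) case I'm stuck on.)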