I'm looking to create a threaded engine with three threads: one for OS input events (which I buffer between ticks), one for game logic, and a third for rendering. I'm aiming for a constant tick rate with a variable frame rate, with the renderer interpolating between game states when the tick rate is lower than the frame rate.
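To be concrete, here's a rough single-threaded sketch of the loop structure I have in mind (the accumulator / interpolation pattern); `integrate`, `run`, and the constant-velocity state are just placeholders I made up for illustration, not my actual engine code:

```python
# Fixed-tick simulation with render-time interpolation, collapsed into one
# thread for illustration. The real engine would split the tick loop and
# the render loop across threads.

TICK_DT = 1.0 / 30.0  # fixed simulation step (30 Hz ticks)

def integrate(pos, vel, dt):
    # Trivial stand-in for one game-logic tick.
    return pos + vel * dt

def run(frame_times, vel=10.0):
    """frame_times: wall-clock timestamps of successive render frames.
    Returns the interpolated position rendered at each frame."""
    prev_pos = curr_pos = 0.0
    accumulator = 0.0
    last_time = frame_times[0]
    rendered = []
    for now in frame_times[1:]:
        accumulator += now - last_time
        last_time = now
        # Run as many fixed ticks as the elapsed time covers.
        while accumulator >= TICK_DT:
            prev_pos, curr_pos = curr_pos, integrate(curr_pos, vel, TICK_DT)
            accumulator -= TICK_DT
        # Interpolate between the two most recent tick states; the rendered
        # state effectively trails real time by up to one tick.
        alpha = accumulator / TICK_DT
        rendered.append(prev_pos + (curr_pos - prev_pos) * alpha)
    return rendered
```

The point of the interpolation is that rendered motion stays smooth at any frame rate, at the cost of the rendered state lagging the simulation by up to one tick.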
Which thread should handle camera rotation, etc.? I imagine that if the tick frequency is lower than the render frequency, then interpreting mouse input and rotating the camera on the game thread would cause camera stutter and lag, but if the tick frequency is equal or higher it should be fine? I'm just wondering what the standard approach is, and whether I'm overlooking a way to make camera control independent of both tick and frame rates with minimal lag.
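The alternative I'm considering is letting the render thread own the camera orientation and apply mouse deltas every frame, with the game thread only reading a snapshot once per tick. A hedged sketch of what I mean (the `Camera` class and its method names are hypothetical, and a real engine might use atomics instead of a lock):

```python
import threading

class Camera:
    """Render-thread-owned camera; game thread only takes snapshots."""

    def __init__(self):
        self._lock = threading.Lock()
        self._yaw = 0.0
        self._pitch = 0.0

    def apply_mouse_delta(self, dx, dy, sensitivity=0.1):
        # Called once per *frame* on the render thread, so rotation latency
        # is bounded by frame time rather than tick time.
        with self._lock:
            self._yaw += dx * sensitivity
            # Clamp pitch so the view can't flip over the poles.
            self._pitch = max(-89.0, min(89.0, self._pitch - dy * sensitivity))

    def snapshot(self):
        # Called once per *tick* on the game thread, e.g. for aiming logic,
        # so gameplay still sees a consistent per-tick orientation.
        with self._lock:
            return self._yaw, self._pitch
```

This way the camera never goes through the tick-rate interpolation path at all, which is what I suspect is needed to avoid the stutter, but I'd like confirmation that this is the standard pattern.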
I'm also looking to support the Oculus Rift, but I imagine head-tracked camera control will need the same latency-reduction approach as the mouse.