disabled vsync and camera jerkiness



OK, when I have vsync enabled my camera updates smoothly and predictably at 60 fps. However, when I disable vsync, all hell breaks loose and the camera becomes very choppy. The framerate also varies from 142 fps to 200 fps - I think this wildly inconsistent frame rate is the source of the choppiness. Am I right? My solution would be to update the camera (as well as the entities in the game) on a fixed timestep and then interpolate between fixed-timestep updates. I was wondering whether this is a valid way to get a smooth camera back, or is there an easier way (perhaps capping the framerate)? Thanks for anyone's help.
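For what it's worth, the fixed-timestep-plus-interpolation idea described above usually looks something like the sketch below. All names (`State`, `update`, `stepFrame`) and the specific step size are illustrative assumptions, not from this thread; the state is a 1-D position for brevity.

```cpp
// Fixed-timestep simulation with render interpolation: the simulation
// always advances in fixedDt-sized steps, and rendering blends between
// the two most recent simulation states.
struct State {
    double position = 0.0;
    double velocity = 50.0; // units per second (assumed)
};

// Advance the simulation by one fixed step.
void update(State& s, double dt) {
    s.position += s.velocity * dt;
}

// Blend previous and current states by alpha in [0, 1] so the camera
// and entities appear to move smoothly between fixed updates.
double interpolate(const State& prev, const State& curr, double alpha) {
    return prev.position * (1.0 - alpha) + curr.position * alpha;
}

// One iteration of the outer loop: consume the measured frame time in
// fixed steps, then return the interpolated position to render.
double stepFrame(State& prev, State& curr, double& accumulator,
                 double frameTime, double fixedDt) {
    accumulator += frameTime;
    while (accumulator >= fixedDt) {
        prev = curr;
        update(curr, fixedDt);
        accumulator -= fixedDt;
    }
    double alpha = accumulator / fixedDt;
    return interpolate(prev, curr, alpha);
}
```

The leftover `accumulator` is carried into the next frame, which is what keeps the simulation rate independent of the (varying) render rate.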

Hmm,

my guess is that you don't use the frame time as the timestep in your animation/simulation, so your camera is updated more often when you have a higher framerate.

Your solution doesn't sound too good to me, because with a fixed timestep for camera updates you have to update the camera asynchronously to the render loop, which is difficult at best.

Yes, that's it exactly: I use a fixed timestep to update the animations and entities, but the camera is updated per frame using the frame time. Is there any way to resolve this? I figured it should still be smoother since the frames take much less time, but that doesn't seem to be the case.

Quote:
Original post by gcard28
Also the framerate varies from 142fps-200fps.


Just FYI - that's actually a really small difference in frame time. Remember, framerate is "frames per second", so the render times for different framerates work out like this:

FPS: 4
- Render Time: 250ms
FPS: 5
- Render Time: 200ms

So a change of 1 FPS between 4 and 5 fps actually means 50ms more render time per frame. Now your framerate is:

FPS: 142
- Render Time: ~7ms
FPS: 200
- Render Time: 5ms

So your rendering varies by about 2ms between its max/min frame times - actually a really small difference. As for your problem, it depends a lot on what type of camera you have. Basically you need to take the time between frames into account when deciding how far the camera moves - if you just do position += camSpeed; then it'll definitely break. You need something more like position += camSpeed * nDT;
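A minimal illustration of that last point, assuming `nDT` is the measured frame time in seconds and `camSpeed` is in units per second (the `Camera` struct and the values below are made up for the example):

```cpp
// Frame-rate-independent camera movement: scaling the speed by the
// frame's delta time makes total movement depend on elapsed time,
// not on how many frames were rendered.
struct Camera {
    double position = 0.0;
};

void moveCamera(Camera& cam, double camSpeed, double nDT) {
    cam.position += camSpeed * nDT; // not: cam.position += camSpeed;
}
```

With this, one second of movement at 142 fps (142 frames of ~7ms) and at 200 fps (200 frames of 5ms) covers the same distance; without the `* nDT`, the 200 fps run would travel about 40% further.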
