Camera Jitter

24 comments, last by Brother Erryn 20 years ago
EUREKA!

I found that ALL timer calls (GetTickCount, timeGetTime, and QueryPerformanceCounter) would sometimes return the same result in subsequent frames. Almost like it was "forgetting" to count at all. I changed this:

lngNow = GetTickCount

To this:

' Busy-wait until the timer actually advances past last frame's value
Do Until lngNow <> m_lngLastTick
    lngNow = GetTickCount
Loop

No hit to framerate, but animation is silky smooth now!
Doing work when no real time has gone by is a bad idea.

I meant splitting a large 150 ms tick into, say, five 30 ms ticks, but only rendering one frame. This would only affect physics that depends on the time step, though. I think it's called "numerical instability", and it often results in jittering.

My approach to the timing would go something like this:


lngNow = GetTickCount
Do While Running
    lngPrv = lngNow
    lngNow = GetTickCount
    TimeDelta = lngNow - lngPrv

    ' Advance the simulation in fixed 10 ms steps
    Do While TimeDelta > 10
        Tick 10
        TimeDelta = TimeDelta - 10
    Loop

    ' Consume whatever remainder is left, then draw a single frame
    If TimeDelta > 0 Then
        Tick TimeDelta
    End If
    Render
Loop

[edited by - PyroSA on March 14, 2004 4:59:50 PM]
So you're talking about doing the actual processing in (for example) 10 ms increments? Wouldn't that still be jerky if the actual render only happens after 150 ms?

A curious side-effect of waiting for a real value from timeGetTime/GetTickCount that I've noticed: the framerate is locked at 64. It's not really a problem, as I can draw a LOT more on the screen per frame and still get that speed. I loaded the scene with twice the polygons I should ever have in the actual game, and it was still 64. Odd number.
Your frame rate is probably locked because of the way you set up your D3D device. Make sure the PresentationInterval parameter of your D3DPRESENT_PARAMETERS structure is set to D3DPRESENT_INTERVAL_IMMEDIATE.

Otherwise, your framerate will be locked to the refresh rate of the monitor.
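For reference, a rough sketch of where that flag is set before creating the device. This assumes the DirectX 8 for VB type library, where the field is named FullScreen_PresentationInterval (D3D9 calls it simply PresentationInterval); Device, D3D and frmMain are placeholder names, not anything from the poster's project:

Dim d3dpp As D3DPRESENT_PARAMETERS

' ...fill in the rest of the presentation parameters as usual...
d3dpp.SwapEffect = D3DSWAPEFFECT_DISCARD
' Present frames as soon as they are ready instead of waiting for the vertical refresh
d3dpp.FullScreen_PresentationInterval = D3DPRESENT_INTERVAL_IMMEDIATE

Set Device = D3D.CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, frmMain.hWnd, _
                              D3DCREATE_SOFTWARE_VERTEXPROCESSING, d3dpp)

Note that in DirectX 8 this member only takes effect for fullscreen devices; windowed swap chains use the default presentation behaviour.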

neneboricua
It's set to D3DPRESENT_INTERVAL_IMMEDIATE for fullscreen, but I've been running it windowed. It performs at 64, precisely, in either mode. Hence the term "strange".
Thought I'd add a post explaining a few things I figured out, and the ultimate (and improved) solution I used, for the benefit of anyone else encountering the same problem.

First of all, the problem I was having was a simple matter of timer precision. On the machine in question, timeGetTime normally has a resolution of only 15.625 ms (1000 ms / 15.625 ms = 64, which is exactly the 64 FPS lock). If a frame finished in less than 15.625 ms, it would read the same time as the previous frame. On a slower machine (with a resolution around 30 ms) this made things jumpy, as it might be 60 ms before an additional animation stage was calculated.
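A quick way to see this resolution for yourself (a small VB6 sketch, not from the original thread) is to spin until the reported time changes and print the size of the step:

Private Declare Function timeGetTime Lib "winmm.dll" () As Long

' Waits for timeGetTime to advance and reports the step size,
' which is the effective resolution of the timer (e.g. ~15.625 ms).
Public Sub MeasureTimerResolution()
    Dim lngStart As Long, lngNow As Long
    lngStart = timeGetTime
    Do
        lngNow = timeGetTime
    Loop While lngNow = lngStart
    Debug.Print "timeGetTime stepped by " & (lngNow - lngStart) & " ms"
End Sub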

The solution was to use two new API calls:
Public Declare Function timeBeginPeriod Lib "winmm.dll" (ByVal uPeriod As Long) As Long
Public Declare Function timeEndPeriod Lib "winmm.dll" (ByVal uPeriod As Long) As Long

At the beginning of my code, I place a "timeBeginPeriod 1", which sets the accuracy of timeGetTime to 1 ms. "timeEndPeriod 1" restores the nominal accuracy at the end of the code. With this, I got 1 ms precision out of the timeGetTime call...a trick I've still never seen mentioned anywhere, so I thought I'd share.
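As a rough illustration of how the calls bracket the program (RunGameLoop is a placeholder for whatever drives your main loop, not something from the original post):

Public Declare Function timeBeginPeriod Lib "winmm.dll" (ByVal uPeriod As Long) As Long
Public Declare Function timeEndPeriod Lib "winmm.dll" (ByVal uPeriod As Long) As Long

Public Sub Main()
    timeBeginPeriod 1      ' request 1 ms resolution from the multimedia timer
    RunGameLoop            ' placeholder for the game's main loop
    timeEndPeriod 1        ' restore the default resolution on the way out
End Sub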

