Camera Jitter


When tracking a mesh object with the camera, it seems to "hop" around. If I don't move the camera, movement is very smooth. If I move the camera, the moving object acts like it's trying to be in two places simultaneously. The object and camera are panning along the X and Y axes, and Z is constant. What sorts of things can cause this behavior? I have no idea where to even start right now.

I'd also like to add that I HAVE read about precision problems, but that this happens very close to the origin as well. I've modified it so that the object being tracked stays at the origin, while the rest of the world moves. The object being tracked no longer shakes...but the entire world does, in concert.

Not easily...it's spread around many classes at the moment. I can provide a little structural info, though.

It is a space sim, locked in an overhead view. Z_NEAR is 1.0 and Z_FAR is 5000.0, but lowering Z_FAR to 100.0 (or less) makes no difference.

Each model has a D3DVECTOR for its "real" position, a D3DVECTORD for its position (a copy of the D3DVECTOR, but with doubles), and a D3DVECTORD for its movement vector, which contains the X and Y travel distance per second.

The way the translation is calculated is roughly:

REAL POSITION = (float)(POSITION + FRACTION OF MOVEMENT - TRACKING OBJECT POSITION)

So everything up to the actual translation is done with doubles, then dropped to float. FPU_PRESERVE is on.
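
In code terms it's roughly this (just a sketch; D3DVECTORD is my double-precision user-defined type as described above, and the routine name is made up):

' Double-precision vector (per the description above)
Private Type D3DVECTORD
    X As Double
    Y As Double
    Z As Double
End Type

' Build the render position relative to the tracked object in doubles,
' dropping to Single only at the very end
Private Function RenderPosition(Pos As D3DVECTORD, Movement As D3DVECTORD, _
                                ByVal FracSec As Double, _
                                Tracked As D3DVECTORD) As D3DVECTOR
    RenderPosition.x = CSng(Pos.X + Movement.X * FracSec - Tracked.X)
    RenderPosition.y = CSng(Pos.Y + Movement.Y * FracSec - Tracked.Y)
    RenderPosition.z = CSng(Pos.Z)
End Function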

The tracked object (the player's ship) is locked in the center of the screen. The world moves reasonably well, but every now and then it "hops" a pixel or so.

I CAN actually post some code, but it's a lot to wade through at the moment, especially since I'm probably just doing something obviously wrong.

You have to think of your camera as a physical object. In the real world, movement is usually continuous.

What I'm trying to say is that your camera should have a little inertia-like property, so when you attach your camera to an object, the camera will try to follow it in a smooth and continuous way. The frequency of jitters is greatly reduced by this method; it acts like the suspension of your car.
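
Something like this (just a sketch; the names are made up, CamX/CamY/TargetX/TargetY are module-level variables, and the stiffness constant is something you tune):

' Ease the camera toward its target each frame, scaled by elapsed
' time so one long frame cannot yank it a whole step at once
Const STIFFNESS As Double = 5#       ' higher = tighter follow

Public Sub FollowTarget(ByVal ElapsedSec As Double)
    Dim t As Double
    t = STIFFNESS * ElapsedSec
    If t > 1# Then t = 1#            ' clamp so the camera never overshoots
    CamX = CamX + (TargetX - CamX) * t
    CamY = CamY + (TargetY - CamY) * t
End Sub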

[edited by - DirectxXx on March 8, 2004 3:00:49 PM]

Actually, the camera is fixed, but yeah, mathematically it's moving right along. I'm using time-based movement, and the numbers coming out look good. It should be nice and smooth. I even coded it to follow a delayed spline to smooth it out, but that made no difference at all (in this case I was actually moving the camera, "chasing" after the ship...and the ship was shaking horribly).

Well, this just gets more interesting all the time.

In my efforts to track this down, I commented out the code that actually computes the time-based animation, as well as the actual rendering code. Now I have a tight loop of (pseudo-code):

Do
    Begin Scene
    Calculate and dump ms since last frame (using timeGetTime)
    End Scene
    DoEvents
Loop


Guess I should mention that this is in VB6.

Runs pretty quickly (doing nothing). The per-frame ms times dump out like this:

7 8 8 6 7 10 75 2 2 1 1 3 8 6 8

The run from the 10 through the 3 (the 75 ms spike and the fast frames right after it) is curious, and pops up every quarter to half second or so.

If I add a Sleep(5) in there somewhere (anywhere), it comes out like this:

19 18 19 40 18 18 19 18 27 20 19 18 1 18 19

Better, but still wrong. Something's chewing up cycles, but what and where? Should I be setting the app priority?

I hate to bump this up, but I hate being lost even more.

Could these numbers be the result of using timeGetTime? I don't think so, since I've also used QueryPerformanceCounter and GetTickCount, with the same results.

I don't think the problem directly relates to timing accuracy.

You should smooth your timing samples, so one bad frame time won't cause hiccups. Hard drive/CPU activity will cause timing hiccups, and that's unavoidable.

Maybe you should try using the average of the last 3, 5, or 10 frame times instead of the single last frame. This will definitely smooth it out some... I'm not sure if you will get other unwanted side-effects, though. Sounds like it would be worth a try in any case.

Dwiel


Time-based animation should keep things looking smooth, shouldn't it? But if you normally have 8-9 ms per frame and then get a 75 ms frame, it's going to look jerky no matter what mathematical tricks you pull. There's something wrong that this is even happening in the first place, and I'm stumped trying to figure out what it is or what to do about it.

There are no extraneous processes running. I set the priority to maximum as a test. It ran incredibly smoothly (the catch was that the priority was SO high that I couldn't even do a reset...I had to shut the power off to get out of it). Is changing the priority a real solution? Is it the only solution?

I don't know if this will help, but are you by any chance rotating the cam to face the mesh after the page has been flipped? I had a problem with this same thing (in OpenGL, but...). Basically I was:
rotating the cam to face the mesh,
placing the cam,
moving the mesh,
drawing,
then looping.
This gives a nasty jitter. To fix it (if this is your problem): move the mesh, then face the cam at it, then draw. Hope this helps. Klaus
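
In pseudo-VB, the order I mean (routine names made up):

Do While Running
    MoveMesh ElapsedSec            ' 1. advance the object first
    AimCameraAt MeshX, MeshY       ' 2. then position the camera
    RenderScene                    ' 3. then draw
    DoEvents
Loop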

Thanks for the idea, but the camera's facing vector never changes (it's perpendicular to the X-Y plane, looking straight "down", as it were). The camera pans and moves in and out, but always faces the same direction.

Let me see if I have this straight. You're creating a top-down perspective game, and you want your camera to always be on top of a certain vehicle or something?

It's a long thread and I may have totally misunderstood, but if you just want the camera on top of a particular object, the solution is pretty simple. Just set the camera's x and y coordinates to be the same as the object you're following, and set the z coordinate to some constant value or a predefined distance above the object.

Again, I may have totally misunderstood, but I hope this helps,
neneboricua

You DO understand what I'm doing correctly.

The problem is that when I do that, the ship I'm following with the camera jitters around, as if its coordinates are slightly off (even though they're not). Something in the matrix transformations seems to be screwing it up, which research suggested was a precision problem. The extra catch is that the "precision problems" appear very near (0,0,0)...where they shouldn't. At least the way I'm doing it now ensures that nothing is drawn far from the origin, which should keep precision good and stop the jitter...but it doesn't.

Hi...

Camera 'jitters' are common in scrolling games. They're caused by the variations in your elapsed 'ms' time, ESPECIALLY SINCE YOU'RE USING VISUAL BASIC. I used to make some games with VB, and the elapsed ms jumps around a lot more in VB than in C++.

Even C++ has a similar problem; depending on the machine and the stuff running in the background, the ms will jump around, and that will make your camera 'jitter'.

A few things you can do:

1) Use QueryPerformanceCounter to measure your time in microseconds; this helps a lot with getting a more accurate elapsed time (don't use GetTickCount in VB). A sketch of this, together with the averaging from point 2, follows this list.

2) Take an average of the last 10-100 elapsed times (depending on your FPS; I usually average half a second, so if my FPS is 100, I average 50 frames). Your average elapsed time will then stay almost constant, and one outburst of 75 ms against a run of 10-12 ms frames won't make any difference at all.

Averaging the elapsed times is, I think, a necessary thing to do if you want smooth animation based on your last frame time. The best way I've found to average these numbers: say you average 100 frames (that's what I'm doing now). Create an array of 100 integers, and another variable that points to the first element of the array (var = 0). Every new elapsed time you get, store it in the array (array(var) = ms), then increment var so it points to the next element; when var passes the last element (99), set var back to 0. Then, every frame, get the average from this array: loop through all the array elements, add them together, and divide by the number of elements. This gives you the average of the last 100 frames. The number stays smooth and there's much, much less jitter.

3) Camera controls are very important for smooth scrolling. Get the 2D vector from the camera location to the character location (char.pos - cam.pos), then move the camera along that vector: cam.pos = cam.pos + vector * (ms / 1000). This makes the camera follow the character and move relative to the character's distance from the camera, which creates a smooth camera that accelerates and decelerates.

4) You can use a totally different timing system: run your game logic at a fixed 100Hz. I don't mean cap your framerate at 100 fps; you execute logic every 10 ms and render every frame. If the elapsed time is 10 ms, your logic updates once per frame; on a slow 100 ms frame, your logic executes 10 times in one loop and you render once. This still gives you framerate-independent movement and gameplay, but every logic step sees a constant 10 ms, so the update loop is very smooth and it doesn't matter if the elapsed time varies a lot.
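
Here is a rough VB6 sketch of points 1 and 2 together (the Declare lines use the usual Currency trick for the 64-bit counter; FRAMES and all the names are illustrative, and the average ramps up over the first FRAMES frames because the buffer starts zeroed):

' 64-bit counter via Currency (scaled by 10000; the scale cancels
' out when counts are divided by the frequency)
Private Declare Function QueryPerformanceCounter Lib "kernel32" _
    (lpPerformanceCount As Currency) As Long
Private Declare Function QueryPerformanceFrequency Lib "kernel32" _
    (lpFrequency As Currency) As Long

Private Const FRAMES As Long = 100       ' how many samples to average
Private m_Samples(0 To FRAMES - 1) As Double
Private m_Index As Long
Private m_Freq As Currency
Private m_Last As Currency

Public Sub InitTimer()
    QueryPerformanceFrequency m_Freq
    QueryPerformanceCounter m_Last
End Sub

' Call once per frame; returns the averaged elapsed time in ms
Public Function SmoothedElapsedMs() As Double
    Dim cNow As Currency, i As Long, Sum As Double
    QueryPerformanceCounter cNow
    m_Samples(m_Index) = (cNow - m_Last) / m_Freq * 1000#
    m_Last = cNow
    m_Index = (m_Index + 1) Mod FRAMES   ' ring buffer wrap-around
    For i = 0 To FRAMES - 1
        Sum = Sum + m_Samples(i)
    Next i
    SmoothedElapsedMs = Sum / FRAMES
End Function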

Hope this helps you get your game playing smoothly.
Dima

I follow what you're saying about averaging frame times, but something seems amiss. Check my thinking here: if I set the camera X,Y to the tracked object's X,Y every frame, then the tracked object shouldn't jitter. Now, the rest of the universe might, but the tracked object shouldn't. VB, C++, ASM...it shouldn't matter, right?

As for the averaging: if there is a 100-150 ms delay between a Flip and a BeginScene (as has been happening), all the averaging in the world won't make the animation look smooth. The user will see a hop. Upping the priority seemed to help that, and I'll grant that it may be a VB problem exclusively (at least on that particular machine).

So ideally, I could just hover the camera over the player's ship, and IT, at least, shouldn't shake. The rest of the universe might if the timing's bad, but one thing at a time. By moving the world around and keeping the camera at the origin, at least that much is working right.

VB or not, those frame times seem squirrely. And I'm not using GetTickCount (I'm using timeGetTime), and I have tried QueryPerformanceCounter, but it didn't help, because when a slow frame pops up it's REALLY slow.

I have noticed that this doesn't happen on my GeForce at home, but it does happen on the ATI 7000 here. I need to try it on a couple of additional machines and see what's what.

Keep a separate 'simulation time', and decide on a maximum tick time; then, if the actual tick time is more than your max tick time, split it into two or more ticks. 1 tick = 1 frame isn't the best way of doing things either, so separate the two if possible.

Guest Anonymous Poster
I think you're updating the camera coordinates before updating the vehicle coordinates, like this:

Camera.SetPosition( Vehicle.x, Vehicle.y )
Vehicle.SetPosition( xxxx, yyyy )

Anonymous: True, the camera position is set first, but then the ship is set to the camera position before it is drawn. They should match up, I'd think.

PyroSA: I'm not sure what you mean by "split ticks". Right now it's time-based. If an object is moving 10 units per second (.01 units per tick, where a tick is one millisecond), then when an 8-tick frame shows up, the object is moved .08 units. If the next frame starts 45 ticks later, the object is moved .45 units, and so on. The problem is that occasionally a frame may take 150 ticks, so everything seems to move very drastically after that .15 second pause. I think setting the thread priority will help this, but not the camera tracking problem.
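
In sketch form, that update is just this (ShipX/ShipY and the speed variables are illustrative module-level names):

' Time-based movement: per-second speed scaled by the elapsed
' fraction of a second
Public Sub UpdateShip(ByVal ElapsedMs As Double)
    ShipX = ShipX + SpeedX * (ElapsedMs / 1000#)
    ShipY = ShipY + SpeedY * (ElapsedMs / 1000#)
End Sub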

I'm just confused. If I do this:

Camera.SetPosition X,Y
Ship.SetPosition X,Y
Ship.Render

What in the world would cause the ship to bounce around, particularly if it's always close to the origin?

EUREKA!

I found that ALL timer calls (GetTickCount, timeGetTime, and QueryPerformanceCounter) would sometimes return the same result in subsequent frames. Almost like it was "forgetting" to count at all. I changed this:

lngNow = GetTickCount

To this:

Do Until lngNow <> m_lngLastTick
    lngNow = GetTickCount
Loop
m_lngLastTick = lngNow    ' (m_lngLastTick also has to be stored each frame for the wait above to work)

No hit to framerate, but animation is silky smooth now!

Doing something when no real time has gone by is a bad idea.

I meant splitting a large 150 ms tick into, say, 5 x 30 ms ticks, but only rendering 1 frame. This would only affect physics-type things that are sensitive to the time step, though. I think they call it 'numerical instability' or something, and it often results in jittering.

My way for the timing would go something like this:


lngNow = GetTickCount
Do While Running
    lngPrv = lngNow
    lngNow = GetTickCount
    TimeDelta = lngNow - lngPrv

    ' Split one long frame into several fixed-size simulation ticks
    Do While TimeDelta > 10
        Tick 10
        TimeDelta = TimeDelta - 10
    Loop

    ' Simulate whatever remainder is left (under 10 ms)
    If TimeDelta > 0 Then
        Tick TimeDelta
    End If

    Render
Loop

[edited by - PyroSA on March 14, 2004 4:59:50 PM]

So you're talking about doing the actual processing in (for example) 10 ms increments? Wouldn't that still be jerky if the actual render comes after 150?

A curious side-effect of waiting for a fresh value from timeGetTime/GetTickCount that I've noticed: the framerate is locked at exactly 64. It's not really a problem, as I can draw a LOT more on the screen per frame and still get that speed. I loaded the scene with twice the polygons I should ever get in the actual game, and it was still 64. Odd number.

The reason your frame rate is locked is probably the way you set up your D3D device. Make sure the PresentationInterval member of your D3DPRESENT_PARAMETERS structure is set to D3DPRESENT_INTERVAL_IMMEDIATE.

Otherwise, your framerate will be locked to the refresh rate of the monitor.
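
For what it's worth, a fragment of that setup in VB6 with the DirectX 8 type library (if I remember right, the DX8 member is named FullScreen_PresentationInterval; D3D9 renamed it to PresentationInterval, and DX8 only honors it for fullscreen devices):

Dim d3dpp As D3DPRESENT_PARAMETERS
' ...back buffer size, format, etc. set up as before...
d3dpp.SwapEffect = D3DSWAPEFFECT_DISCARD
' Present as soon as the frame is ready instead of waiting for vsync
d3dpp.FullScreen_PresentationInterval = D3DPRESENT_INTERVAL_IMMEDIATE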

neneboricua

It's set to D3DPRESENT_INTERVAL_IMMEDIATE for fullscreen, but I've been running it windowed. It performs at precisely 64, in either mode. Hence the term "strange".
