Time elapsed since the last frame

Started by
6 comments, last by Enokh 20 years, 4 months ago
Here's something you're all probably familiar with:

// runs every frame

CurrentFrame = timeGetTime() * 0.001f;    // milliseconds -> seconds
DeltaFrame = CurrentFrame - LastFrame;    // how long the previous frame took
LastFrame = CurrentFrame;
Then you multiply DeltaFrame into the position updates of your different objects: the player's, the bullets', etc.

Recently I've been implementing a bullet-player collision system that has to work at high speeds, since the bullets travel extremely fast. Fast enough, in fact, that a bullet can jump over the player: it's on one side of the player in one frame and on the other side in the next. So I developed a system that creates a line from the bullet's current position to where it will be in the next frame, which means the line's length is basically the distance the bullet covers that frame. Then I create a number of sample points along the line, depending on its length, and check each of those points against the player. That way the bullet can never go 'through' the player.

Here's the strange part: if the shooter stood still and kept firing, the lengths of the bullets' lines kept varying, even though no parameter affecting the length had changed (the shooter stood still, kept the same angle, and the bullet's velocity never changed since the shooter didn't change weapons). The only parameter left to change the line's length was DeltaFrame. In fact it changed so much that sometimes the line was so close to zero-length that only one point was created on it, and that particular bullet could indeed go through the player. It does make sense for DeltaFrame to change, since the FPS varies slightly with what's happening on screen and so on.

How about this: keep a moving average of DeltaFrame. Save the last 10 deltas in an array, and use the average of all 10 cells as the DeltaFrame you actually plug into the simulation. This smooths everything out, from the movement of the player to the bullets and everything else that uses DeltaFrame, since changes to that variable are no longer so drastic.

But it still responds to what's happening on screen, as it should (10 frames is only a sixth of a second if your engine is running at 60 FPS, for example). Not only that, it completely solved my problem with the bullet-player collision detection: the line length is now very close to constant, and bullets never go through the player anymore. What do you guys think of using the moving-average method for your DeltaFrame variable?
Could you not simply use the line (whatever length it may be) and test whether it intersects the bounding box of the player?


-=[ Megahertz ]=-
Averaging (plain or "weighted") of the elapsed time has worked well in the places I've used it in the past.

If you draw a graph of frame-to-frame deltas in a PC game that's doing a decent amount of work and isn't using VSYNC, you'll notice it can be very noisy and spiky. A lot of that is just the nature of PCs and of running on top of a multitasking OS...

If you think about it, it does make sense to average in some way. If you're using the elapsed time for your simulation (movement etc.), that time was measured over the PREVIOUS frame and might be totally wrong for the CURRENT frame (which hasn't finished yet).
The previous frame may have executed really quickly, giving you a low elapsed time, while the current frame could be completely different and have a really high one; the network card may have received some packets and interrupted the CPU to announce the fact, for example. The visible result is stuttering.

The average elapsed time smooths out sudden "spikes".


Something else to do is keep a cap on elapsed times: define a "maximum allowable" elapsed time and clamp to it whenever the current (averaged) elapsed time goes over. Otherwise, if something really bad happens to the previous frame(s), you get a huge elapsed value which, when plugged into your simulation, sends things shooting through walls etc.

--
Simon O'Connor
3D Game Programmer &
Microsoft DirectX MVP

Simon O'Connor | Technical Director (Newcastle) Lockwood Publishing | LinkedIn | Personal site

Another thing to note here is that, if I recall correctly, timeGetTime() doesn't return completely accurate readings. I forget the details... but I'm pretty sure that's the case. If you use QueryPerformanceCounter instead, I believe you'll get much more accurate time readings.

Also, as MHz said, testing for collision against the line itself will work much better than testing a bunch of sampled points along it. With samples, it's inevitable that you'll run into errors every now and then.

-John
So how do people render things like bullet time?
One cheap way is deltaTime_in_calculations = deltaTime_between_frames / the_bigger_the_slower (a slow-motion factor: the bigger it is, the slower everything runs).

[edited by - Enselic on December 1, 2003 7:04:28 AM]
[s]--------------------------------------------------------[/s]chromecode.com - software with source code
Enokh: Smoothing your frame timer will give you a smoother frame timer, but I think relying on something like that is a sign your algorithm might be flawed. After all, if a frame is rendered very quickly, your bullet line SHOULD be shorter. Making it longer will sometimes return bad results, as the bullet will register as colliding with objects it hasn't actually reached yet. As another poster suggested, I would look into calculating the intersection of your objects with the bullet line itself, instead of a limited number of points ON that line. It's pretty simple, and probably a lot faster to compute as well.

Enselic: I think that's a pretty *good* way of doing it!
--Riley
Didn't Quake use that method for its weapons? Trace (hitscan) weapons for guns and others where you don't see the projectile, and projectile-type weapons, like the grenade launcher, where you can actually see the projectile?
