
Framerate-dependent movement


What I'm trying to do is keep the framerate as high as it can go, but make the movement of my sprites depend on it in case it changes, so it doesn't look choppy. What I thought I would do is:

extern float FrameRate;
#define FRAMEDELAY (GetTickCount() - FrameDelay > 0 ? GetTickCount() - FrameDelay : 1)

At the beginning of every frame I do FrameDelay = GetTickCount();. Then, whenever I want to move a sprite, I multiply the step value by FRAMEDELAY, but sometimes my sprite moves so fast it looks like it does the step value a hundred times in one frame... Am I doing this the wrong way?

GetTickCount has low resolution; I tried it on my computer and could only squeeze out a little over 60 updates per second. The documentation says it has millisecond precision, but it doesn't; in practice its resolution is typically in the 10-16 ms range.

You may want to check out a thread I created a while ago about this and about high-resolution performance counters: Link

I don't really understand that... What type are your timeslice, currenttime, and all that? You have other members in there like lowdword; what class is that? And can you explain what you're doing in that code?

Don't pay too much attention to my pseudocode in the OP; I did some low-level arithmetic that cluttered the code, but as someone said in that same thread, it was unnecessary.

Check your SDK documentation for QueryPerformanceCounter, QueryPerformanceFrequency, and LARGE_INTEGER. You should figure out what the types and members are in no time, and should be able to come up with a better solution than the crappy pseudocode in the OP.

I got everything to work; instead of using GetTickCount() I'm just calling timeGetTime() and taking the difference from that. But is there a reason why my FPS is capped at 61 FPS?

Never mind, I found out why that is: because I had vsync on (_IMMEDIATE instead of _ONE). But now everything still looks choppy...

Guest Anonymous Poster
Yup, vsync. If you're in fullscreen, most likely your video card is locked to 60 FPS max. You'll have to either turn off vsync or switch to windowed mode.

Is there anything else I'm doing wrong? I'm just trying to get the maximum output of frames (_IMMEDIATE) but still have the movement depend on the framerate.

OK, I have everything running correctly with the frame-dependent movement using timeGetTime(), but things still look choppy at times... Is there any other way, or something I'm doing wrong?

Quote:
Original post by EvilKnuckles666
OK, I have everything running correctly with the frame-dependent movement using timeGetTime(), but things still look choppy at times... Is there any other way, or something I'm doing wrong?


What do you mean by choppy?

I'm not quite sure how your define is supposed to work. The further through your code you go from the start of the main loop, the bigger FRAMEDELAY is going to be, so all the objects you process last will move more than the objects you process first.



Well, of course I set FrameDelay = timeGetTime() at the beginning of every frame. And it looks choppy in that it moves fast, then slow, then fast, then slow, and I can actually see that happening. It's not really the frames; it's just not moving smoothly like I want.

Quote:
Original post by EvilKnuckles666
Well, of course I set FrameDelay = timeGetTime() at the beginning of every frame. And it looks choppy in that it moves fast, then slow, then fast, then slow, and I can actually see that happening. It's not really the frames; it's just not moving smoothly like I want.


I know, but say you have 10 sprites: you're going to be multiplying each one by FRAMEDELAY, which will probably always be GetTickCount()-FrameDelay.

so it will be:

Sprite 1:  step value * (GetTickCount() - FrameDelay)
Sprite 2:  step value * (GetTickCount() - FrameDelay)
...
Sprite 10: step value * (GetTickCount() - FrameDelay)

But by the time you get to sprite 10, GetTickCount will be returning a greater number than at the start, so (GetTickCount()-FrameDelay) works out to be a bigger number at the end of the frame than at the start.

What you need to do is measure the difference between frame times, so at the start of each frame have:

oldframetime = newframetime;
newframetime = GetTickCount();
framedelay = newframetime - oldframetime;

Awesome, everything works great now, but only in fullscreen mode; in windowed mode things still look a little choppy. The game is going to be fullscreen anyway, but for editing purposes it's in windowed mode. Is the choppiness just something I'll have to deal with in windowed mode, or can I fix that too somehow?

... theoretically, shouldn't it never be choppy, because I'm multiplying the difference from frame to frame by the step???

When creating a game, you usually want movement that is independent of the framerate, not the opposite. Framerate-independent movement allows the game to run at the same (physical) speed on computers with different processing power.
If you want smooth movement in your game, you will have to choose between:

1. upgrading your computer.

2. using the same computer configuration, but without the GetTickCount() function.

Manipulating the GetTickCount function will not give you a higher or lower FPS. It will only allow you to speed up character/camera movement in your game (accomplished by skipping several frames on each movement step).

P.S. You can also improve your FPS by changing the screen refresh rate in Windows, but I would not recommend that.

Edit: To rephrase what I have said:
Don't use the GetTickCount function at all if you want smooth sprite movement without upgrading.

[Edited by - Calin on August 4, 2005 8:41:37 PM]

Soooo... should I just have the sprites move their assigned step, so on slower computers they move slowly and on faster computers they move fast?? o.O

Is there a way to NOT use GetTickCount() and have sprites move independent of the speed of the computer, but still have the game run at its top speed?

Let's suppose you have a game with the following (pseudo)code:

if ( "Up Arrow" is being pressed )
    Soldier.Xposition += 5; // moves the soldier 5 units on the X axis

Now suppose you run your game on a computer with a 1000 MHz processor. Also suppose that while you hold down the "Up Arrow" for 1 second your character walks 200 units. This means that during that second your computer has run 40 times (200 / 5) through the game loop. This also means that the game runs at 40 FPS.

Now, if you take your game and run it on a 3000 MHz computer, the game will run roughly three times faster: when the "Up Arrow" is pressed for a second, the character will walk 600 units instead of 200.

If you want to make your soldier walk 200 units/second in the second situation too, you will have to reduce the character's motion speed to compensate for the greater processing power of the 3000 MHz computer. To put it another way, you will have to find a way to tie the soldier's movement speed to the computer's speed. This is where the GetTickCount (or better, QueryPerformanceCounter) function comes in useful. These functions help you find the amount of time needed for the computer to run once through the game loop.

Here you can find a short and easy tutorial on using the QueryPerformanceCounter function.

Quote:
Is there a way to NOT use GetTickCount() and have sprites move independent of the speed of the computer, but still have the game run at its top speed?


The game should run at top speed no matter whether you are using GetTickCount() or not. As I mentioned before, the functions in question affect only the character/object motion speed, not the frame rate of the game. GetTickCount() is not used to increase/decrease FPS. For instance, in the example above, you can make your character move at 600 units per second on the 1000 MHz computer by changing Soldier.Xposition += 5 to Soldier.Xposition += 15; however, this does not cause an FPS increase. The FPS will remain the same (40 in our case); the only difference will be a much choppier animation (the computer will use 1 frame for every 15 motion units instead of 3 frames for the same motion distance).

I tried to be as explicit as possible.
One last note: there might be 1000+ things that can cause a low FPS. The best advice I can give you is to run your game on another (better) computer. If you still get low framerates, most likely the problem lies in your game code.

[Edited by - Calin on August 4, 2005 9:43:30 PM]

First off, as far as I can tell the "choppiness" is due to a mismatch between the number of frames your app is trying to draw per second and the refresh rate of your monitor (if v-sync is on). Let's say your monitor refresh is 60 Hz and your timer is not all that accurate (as it sounds in your case), so you're actually drawing 59.5 frames per second. Then you're halfway through processing a frame when the monitor finishes its 60th, so that frame may end up getting thrown away, not drawn at all, or delayed. Because it's probably quite hard to match any system timer with the monitor's refresh timer, there may be an amount of phase difference that causes the frame skips. The solution is to turn off v-sync. Perhaps.

Wow, I can't believe no one has yet suggested using some simple maths to solve your movement problem. Remember that if you travel 1 metre in one second, your speed is 1 m/s? So why not give all your objects velocities, then calculate the actual distance covered:

v = d / t [EDIT: had * instead of /, oops]

so

d = v * t

You know t, don't you, because you are controlling how often frames get drawn, using a timer, so it should all end up being independent of how fast the computer is.

[Edited by - bobason456 on August 5, 2005 10:18:45 PM]

That would work too, but I wouldn't know t, because I want the CPU to output the maximum number of frames. I know that for my computer, but I wouldn't know it for anyone else's computer, so I would need to calculate it.

OK, I kind of edited that class you showed me in that tutorial. Does this seem correct? Because it's working, but when I output the "time elapsed" I'm getting huge numbers!

class cFrameRate
{
private:
    LARGE_INTEGER m_TicksPerSecond;
    LARGE_INTEGER m_CurrentTick;
    LARGE_INTEGER m_LastTick;

    float m_FPS;
    double m_TimeElapsed;

public:
    void Init();
    void Update();

    double GetTimeElapsed() { return m_TimeElapsed; }
    float GetFPS() { return m_FPS; }
};

void cFrameRate::Init()
{
    QueryPerformanceCounter(&m_LastTick);
    QueryPerformanceFrequency(&m_TicksPerSecond);
}

void cFrameRate::Update()
{
    QueryPerformanceCounter(&m_CurrentTick);

    // This frame's length out of desired length
    m_TimeElapsed = (double)(m_CurrentTick.QuadPart - m_LastTick.QuadPart) / (double)m_TicksPerSecond.QuadPart;
    m_FPS = 1000 / (float)(m_CurrentTick.QuadPart - m_LastTick.QuadPart);
    if (m_TimeElapsed < 0.001)
        m_TimeElapsed = 0.001;

    m_LastTick = m_CurrentTick;
}

I'm thinking the problem has something to do with the float and double being different or something...

Choppiness may be caused by round-off errors when doing math with time displacements.

Check out this thread: Link

Here is another good link on the topic: link

I just added such timers to my engine and everything is running perfectly smoothly, so it just goes to show that my original thoughts on vsync were wrong :)

I looked at that link and it's the same as what I'm doing; the numerator and denominator are both (double):

m_TimeElapsed = (double)(m_CurrentTick.QuadPart - m_LastTick.QuadPart)/(double)m_TicksPerSecond.QuadPart;

How would I figure out what the FPS is? At first I thought it was:
m_FPS = 1000 / (float)(m_CurrentTick.QuadPart - m_LastTick.QuadPart);

then I thought it was:
m_FPS = (float)(m_TicksPerSecond.QuadPart / m_TimeElapsed);

but I keep getting really huge numbers...

1. How do I fix the little jitteriness that I can still see?
2. How do I calculate FPS from the info that I have?
