
[.net] Timer for sprite animation


p997    145
Hi, I've incorporated dxmutmisc.cs from the DirectX SDK into my program so that I can use the high-resolution timer (the FrameworkTimer class) to time my sprite animation. In my code I used a float, animationRate, to determine the frame rate:

```csharp
public void Run()
{
    InitializeGraphics();
    InitializeResources();
    FrameworkTimer.QueryPerformanceFrequency(ref timerFrequency);
    while (gameRunning)
    {
        Show();
        Application.DoEvents();
        FrameworkTimer.QueryPerformanceCounter(ref timeStart);
        Render();
        FrameworkTimer.QueryPerformanceCounter(ref timeEnd);
        animationRate = ((float)timeEnd - (float)timeStart) / timerFrequency;
    }
}
```

However, animationRate comes out below 0.02xxx on some computers and above 0.07xxx on others. Now I have two questions: how do I use this animationRate to control my animation speed (changing sprite frames) when the number is so small? And how do I make the timer less dependent on CPU speed, so that my program runs on different computers and still has the same effect? Thanks in advance for any comments.

Daaark    3553
Use delta timing.

Every update you can add 0.1 to a float variable, and switch frames when it reaches 1.0. The trick is that you scale by the delta, so you only add (0.1 * delta) each update. The delta is the time elapsed since the last update. This way every PC goes from 0.0 to 1.0 at more or less the same real-time rate, regardless of how fast it runs.

Here. This article talks about physics programming, but it's the same idea. Use your high-res timer to read the time every update, then compute the delta between consecutive readings.
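A minimal sketch of that idea in C#. Note this is illustrative, not the SDK's API: GetSeconds() is a hypothetical wrapper that reads the high-resolution counter and divides by the timer frequency, and currentFrame/frameCount stand in for your own sprite state.

```csharp
// Delta-timed sprite animation (sketch, with assumed helper names).
float accumulator = 0.0f;       // advances from 0.0 toward 1.0
float framesPerSecond = 10.0f;  // advance 10 sprite frames per second
double lastTime = GetSeconds(); // hypothetical: counter / frequency, in seconds

void Update()
{
    double now = GetSeconds();
    float delta = (float)(now - lastTime); // seconds since the last update
    lastTime = now;

    accumulator += framesPerSecond * delta;
    while (accumulator >= 1.0f)            // switch frames at 1.0
    {
        currentFrame = (currentFrame + 1) % frameCount;
        accumulator -= 1.0f;               // keep the fractional remainder
    }
}
```

Because delta is measured in real seconds, a fast PC takes many small steps and a slow PC takes fewer large ones, but both cross 1.0 (and switch frames) at the same wall-clock rate.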

