Stupid time-based movement issues - help!


This is a follow-up to my last thread about accurate timing. At a low framerate (60 fps) my camera moves slowly; at a higher framerate (1000 fps) it moves faster, despite these steps:

1. Calculate the time delta between frames
// Get the timer frequency and the frame start time
QueryPerformanceFrequency((LARGE_INTEGER*)&freq);
QueryPerformanceCounter((LARGE_INTEGER*)&start);

DrawGLScene();
SwapBuffers(hDC);

// Get the frame end time and convert the elapsed ticks to seconds
QueryPerformanceCounter((LARGE_INTEGER*)&end);
tdelta = float(end - start) / (float)freq;
2. Increment the camera velocity by a constant times tdelta

Cam.Move(1.0f * tdelta);


where

void Camera::Move(float v) { vel += v; }
3. Update the camera

	// Always decelerate a small amount so the camera stops smoothly
	if (vel > 0.0f)
		vel -= 0.5f * tdelta;
	else if (vel < 0.0f)
		vel += 0.5f * tdelta;

	// Make sure we stick to the speed limit
	if (fabs(vel) > maxvel)
	{
		if (vel < 0.0f)
			vel = -maxvel;
		else
			vel = maxvel;
	}

	// Scale the direction vector by the velocity
	dir *= vel;

	// Increment the position by the dir vector
	pos.x += dir.i;
	pos.y += dir.j;
	pos.z += dir.k;
I'm thinking there must be a bug in step 3, unless I'm forgetting to scale something else. I'm fairly sure the dir vector is calculated correctly, so I doubt the problem is there. Any help would be appreciated.

[edited by - Silex on May 31, 2004 5:19:14 PM]

I'll post the class I made for timing; it may help you discover what's wrong.


class CTimer
{
protected:

	double m_Frequency;    // 1.0 / ticks per second

	__int64 m_StartClock;  // counter value at Init()

	float m_FrameTime;     // length of the last frame, in seconds
	float m_FrameStart;
	float m_FrameEnd;

	float m_FpsCount;
	float m_FpsUpdate;
	float m_Fps;

public:

	float GetFrameTime() { return m_FrameTime; }

	float GetFps() { return m_Fps; }

	double GetTime();

	void Init();
	void Update();
};



#include <windows.h>
#include "timer.h"

double CTimer::GetTime()
{
	__int64 EndClock;

	QueryPerformanceCounter((LARGE_INTEGER*)&EndClock);

	return (double)(EndClock - m_StartClock) * m_Frequency;
}

void CTimer::Init()
{
	__int64 rate;

	// Get the performance frequency
	QueryPerformanceFrequency((LARGE_INTEGER*)&rate);

	// Invert it so we can multiply instead of divide
	m_Frequency = 1.0 / (double)rate;

	// Get the start time
	QueryPerformanceCounter((LARGE_INTEGER*)&m_StartClock);

	m_FrameTime = 0.0f;
	m_FrameStart = (float)GetTime();
	m_FrameEnd = 0.0f;
	m_FpsCount = 0.0f;
	m_FpsUpdate = 0.0f;
	m_Fps = 0.0f;
}

void CTimer::Update()
{
	// Update the timing
	m_FrameEnd = (float)GetTime();
	m_FrameTime = m_FrameEnd - m_FrameStart;
	m_FrameStart = m_FrameEnd;

	// Increase the Fps counter
	m_FpsCount++;

	// Update the Fps once per second
	if( (m_FrameStart - m_FpsUpdate) > 1.0f )
	{
		m_FpsUpdate = m_FrameStart;
		m_Fps = m_FpsCount;
		m_FpsCount = 0.0f;
	}
}
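
A rough sketch of how it slots into a main loop (running, hDC, DrawGLScene and Cam are assumed from your setup, not part of the class):

CTimer timer;
timer.Init();

while (running)
{
	// Measure the last frame and use it as this frame's time step
	timer.Update();
	float tdelta = timer.GetFrameTime(); // already in seconds

	Cam.Move(1.0f * tdelta);

	DrawGLScene();
	SwapBuffers(hDC);
}

Note that GetFrameTime() already returns seconds, so there's no milliseconds conversion to get wrong.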

I think I know what the problem is: you're clamping your velocity to maxvel AFTER applying the time scaling. So once the camera reaches maximum velocity, the distance it moves per frame is the same irrespective of the framerate, and more frames per second means faster movement.

The velocity of an object should not be related to the framerate in any way. Clamp the velocity itself to the maximum, and only then scale the resulting velocity vector by tdelta when you move the object. Something like the sketch below.
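
A minimal sketch of what I mean, assuming vel and maxvel are in units per second and dir stays a unit vector (the Update signature and the stop-at-zero deceleration are my own additions, not from your post):

void Camera::Update(float tdelta)
{
	// Decelerate towards zero without overshooting it
	float decel = 0.5f * tdelta;
	if (vel > decel)
		vel -= decel;
	else if (vel < -decel)
		vel += decel;
	else
		vel = 0.0f;

	// Clamp the velocity itself (units per second), not the per-frame step
	if (vel > maxvel)
		vel = maxvel;
	else if (vel < -maxvel)
		vel = -maxvel;

	// Apply tdelta only when moving, and leave dir itself untouched,
	// since dir *= vel every frame compounds the scaling
	float step = vel * tdelta;
	pos.x += dir.i * step;
	pos.y += dir.j * step;
	pos.z += dir.k * step;
}

With that, Cam.Move(1.0f * tdelta) gives an acceleration of 1 unit/s², and the camera covers the same distance per second whether you run at 60 or 1000 fps.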

[edited by - MumbleFuzz on June 1, 2004 7:18:29 AM]
