framerate-problem

Started by
3 comments, last by hajkr 22 years, 5 months ago
Hi, I tried to limit the framerate to 60 fps in the Direct3D samples of the DX8 SDK (in the file d3dapp.cpp). I added the following code in Render3DEnvironment():

    // Get the app's time, in seconds.
    FLOAT fAppTime        = DXUtil_Timer( TIMER_GETAPPTIME );
    FLOAT fElapsedAppTime = DXUtil_Timer( TIMER_GETELAPSEDTIME );
    if( ( 0.0f == fElapsedAppTime ) && m_bFrameMoving )
        return S_OK;

    // !!! Code added to d3dapp.cpp
    // Force a given framerate
    // Beginning of added code
    static float fNextTime;
    if( fAppTime >= fNextTime )
    {
        fNextTime = fAppTime + 1.0f / m_fFramerate;
    }
    else
    {
        return S_OK;
    }
    // end of added code

    // FrameMove (animate) the scene
    if( m_bFrameMoving || m_bSingleStep )
    {
        // Store the time for the app
        m_fTime        = fAppTime;
        m_fElapsedTime = fElapsedAppTime;

        // Frame move the scene
        if( FAILED( hr = FrameMove() ) )
            return hr;

        m_bSingleStep = FALSE;
    }

    // Render the scene as normal
    if( FAILED( hr = Render() ) )
        return hr;

    // Keep track of the frame count
    {
        static FLOAT fLastTime = 0.0f;
        static DWORD dwFrames  = 0L;
        FLOAT fTime = DXUtil_Timer( TIMER_GETABSOLUTETIME );
        ++dwFrames;

        // Update the scene stats once per second
        if( fTime - fLastTime > 1.0f )
        {
            m_fFPS    = dwFrames / (fTime - fLastTime);
            fLastTime = fTime;
            dwFrames  = 0L;
            // ...

My problem is: the framerate in windowed mode of the app is a little less than 60 fps (59.8), but if I change to REF or to fullscreen it is much less (30 fps). Can anybody please help?
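For comparison, a minimal frame-pacing sketch, assuming the same DXUtil_Timer() helper and an m_fFramerate member as in the code above; advancing the target time by a fixed step, rather than re-basing it on the current time each frame, keeps the intervals between rendered frames even:

    // Frame-pacing sketch: render only when the app time reaches fNextTime,
    // then advance fNextTime by a fixed step of 1/framerate.
    static FLOAT fNextTime = 0.0f;
    const FLOAT  fStep     = 1.0f / m_fFramerate;

    FLOAT fAppTime = DXUtil_Timer( TIMER_GETAPPTIME );
    if( fAppTime < fNextTime )
        return S_OK;                     // not time for a new frame yet

    fNextTime += fStep;                  // schedule the next frame
    if( fNextTime < fAppTime )           // fell behind (e.g. after a long stall)
        fNextTime = fAppTime + fStep;    // resynchronize instead of bursting frames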
I'm not sure what the question is - yes, your framerate is going to be much slower with the REF device - it is not meant to be used for anything other than testing (read the docs...)

Why are you trying to limit framerate, especially when your imposed limit is higher than what your hardware can do??
Author, "Real Time Rendering Tricks and Techniques in DirectX", "Focus on Curves and Surfaces", A third book on advanced lighting and materials
Oh, yes, you are right.
I changed something in my code for the camera:

    if( FAILED( hr = m_pInput->GetKeyboardInput( ) ) )
        return DisplayErrorMsg( hr, MSGERR_APPMUSTEXIT );

    if( KEYDOWN( m_pInput->m_cKeyboardbuffer, DIK_W ) )
        m_pCamera->Translate( 0.0f, 0.0f, 10.0f * m_fElapsedTime );

    if( KEYDOWN( m_pInput->m_cKeyboardbuffer, DIK_S ) )
        m_pCamera->Translate( 0.0f, 0.0f, -10.0f * m_fElapsedTime );

// ...

I expected the camera to translate smoothly, but it lags,
even with the HAL device. If I remove the "m_fElapsedTime" factor in the Translate call it translates smoothly, but then I get a different game speed on different computers.
Does anybody have a solution to this problem?
Translate based on time, not framerate. Instead of saying "move x units every frame", say "move x units every second", and then see how much time has passed each frame.
Author, "Real Time Rendering Tricks and Techniques in DirectX", "Focus on Curves and Surfaces", A third book on advanced lighting and materials
I thought I was already translating based on time, because it is
10.0f * m_fElapsedTime. I don't understand this.
If I only wrote Translate( 10.0f ); then it would be a translation based on frames, or is that not right?
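The two cases side by side, using the same hypothetical Translate() and m_fElapsedTime as above:

    // Frame-based: 10 units every rendered frame, so the effective speed
    // changes with the framerate (a faster machine moves the camera faster).
    m_pCamera->Translate( 0.0f, 0.0f, 10.0f );

    // Time-based: 10 units per second of real time, independent of how
    // many frames were rendered in that second.
    m_pCamera->Translate( 0.0f, 0.0f, 10.0f * m_fElapsedTime );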

This topic is closed to new replies.
