Why unstable frame times?



I try to render the number of trees the user chooses, e.g. when the user chooses 100, I render 100 trees. To simplify the implementation, I create the instance vertex buffer (VB) with the maximum number of trees in an initialization stage that runs only once:

    CreateVertexBuffer(treeMaxNum * sizeof(CreateTree::TreeInstance),
                       D3DUSAGE_DYNAMIC, 0, D3DPOOL_DEFAULT,
                       &m_pTreeInstanceVB, 0);

In the rendering stage, I render the number of trees the user chose (treeNumber) with hardware instancing:

    pDevice->SetStreamSource(0, m_pTreeVB, 0, D3DXGetDeclVertexSize(declTreeInstance, 0));
    pDevice->SetStreamSourceFreq(0, D3DSTREAMSOURCE_INDEXEDDATA | treeNumber);
    pDevice->SetStreamSource(1, m_pTreeInstanceVB, 0, D3DXGetDeclVertexSize(declTreeInstance, 1));
    pDevice->SetStreamSourceFreq(1, D3DSTREAMSOURCE_INSTANCEDATA | 1UL);

Why does the time spent on each frame differ so much: 2.92969ms, 1.46484ms, 0.976563ms, 0.488281ms, 1.46484ms, 0.976563ms, 10.7422ms, 402.344ms, 23.4375ms, ...? I don't know what is wrong; the frame times vary so much that the animation is not smooth. I measure the time spent on each frame as follows:

    void renderScene()
    {
        double currentTime, preTime;
        preTime = GetTime();

        // here is the code to render the scene

        pDevice->Present(NULL, NULL, NULL, NULL);
        currentTime = GetTime(); // the difference (currentTime - preTime) is the time spent on this frame
    }

Any advice is appreciated.
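For context, the rest of my instanced draw looks roughly like this (a sketch; m_pTreeIB, m_pTreeDecl, treeInstances, treeVertexNum and treeFaceNum are placeholder names, not the exact code):

    // Refill the dynamic instance VB with D3DLOCK_DISCARD so the driver can
    // hand back fresh memory instead of stalling until the GPU is done with it.
    void *pData = 0;
    if (SUCCEEDED(m_pTreeInstanceVB->Lock(0, treeNumber * sizeof(CreateTree::TreeInstance),
                                          &pData, D3DLOCK_DISCARD)))
    {
        memcpy(pData, treeInstances, treeNumber * sizeof(CreateTree::TreeInstance));
        m_pTreeInstanceVB->Unlock();
    }

    pDevice->SetVertexDeclaration(m_pTreeDecl);

    // Stream 0: tree geometry, drawn treeNumber times.
    pDevice->SetStreamSource(0, m_pTreeVB, 0, D3DXGetDeclVertexSize(declTreeInstance, 0));
    pDevice->SetStreamSourceFreq(0, D3DSTREAMSOURCE_INDEXEDDATA | treeNumber);

    // Stream 1: one TreeInstance element per drawn instance.
    pDevice->SetStreamSource(1, m_pTreeInstanceVB, 0, D3DXGetDeclVertexSize(declTreeInstance, 1));
    pDevice->SetStreamSourceFreq(1, D3DSTREAMSOURCE_INSTANCEDATA | 1UL);

    pDevice->SetIndices(m_pTreeIB);
    pDevice->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0, treeVertexNum, 0, treeFaceNum);

    // Reset the stream frequencies so later non-instanced draws are unaffected.
    pDevice->SetStreamSourceFreq(0, 1);
    pDevice->SetStreamSourceFreq(1, 1);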

1. What function are you using to measure time?
2. Hardware interrupts, the OS doing time slicing differently, the video driver doing things differently on different frames, all sorts of things can cause slight variations.

The usual way to handle this is to keep track of the frame times for the last few (say 10) frames, average them, and use that averaged time to run your animation and other time-dependent code.
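Something like this (a quick sketch; the window size of 10 and the names are arbitrary):

    // Average the last N frame times and use the result to advance animation.
    const int N = 10;
    double frameTimes[N] = { 0 };
    int frameIndex = 0;
    int framesRecorded = 0;

    double smoothFrameTime(double lastFrameTime)
    {
        // Store the newest sample in a ring buffer.
        frameTimes[frameIndex] = lastFrameTime;
        frameIndex = (frameIndex + 1) % N;
        if (framesRecorded < N)
            ++framesRecorded;

        // Return the mean of the samples collected so far.
        double sum = 0.0;
        for (int i = 0; i < framesRecorded; ++i)
            sum += frameTimes[i];
        return sum / framesRecorded;
    }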

Quote:
Original post by Evil Steve
1. What function are you using to measure time?

I use QueryPerformanceFrequency and QueryPerformanceCounter to get the time. The detailed code is:

inline double GetTime()
{
    static BOOL init = FALSE;
    static BOOL hires = FALSE;
    static unsigned __int64 pf = 1;
    unsigned __int64 baseTime_;

    if(!init)
    {
        // Query the counter frequency once; fall back to GetTickCount
        // (1000 ticks per second) if no high-resolution counter exists.
        hires = QueryPerformanceFrequency((LARGE_INTEGER *)&pf);
        if(!hires)
            pf = 1000;
        init = TRUE;
    }

    if(hires)
        QueryPerformanceCounter((LARGE_INTEGER *)&baseTime_);
    else
        baseTime_ = GetTickCount();

    // Convert ticks to seconds.
    return ((double)baseTime_ / (double)pf);
}
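Note that GetTime returns seconds (ticks divided by the counter frequency), so a frame delta has to be scaled by 1000 to give millisecond figures like the ones I quoted:

    double t0 = GetTime();
    // ... render and Present ...
    double frameMs = (GetTime() - t0) * 1000.0; // frame time in milliseconds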

Quote:

2. Hardware interrupts, the OS doing time slicing differently, the video driver doing things differently on different frames, all sorts of things can cause slight variations.

The variations are not slight: 0.976563ms, 10.7422ms, 402.344ms, 23.4375ms.
Is the way I set up the instance VB correct?
