
hi! is this how you get the FPS for your game?....


DWORD ticknow = 0, tickdiff = 0, ticklast = 0;
DWORD spritetick = 0;

void GameLoop()
{
    ticknow  = timeGetTime();
    tickdiff = ticknow - ticklast;
    ticklast = ticknow;

    // 1000 / tickdiff = FPS

    // to move a sprite once every second:
    spritetick += tickdiff;
    if (spritetick >= 1000)
    {
        // move sprite
        spritetick -= 1000;
    }
}

It doesn't seem to work for me. By the way, my FPS there shows 50 - 70, but my sprite doesn't move every second, it moves faster. Thanks!

Hmm, it should work. timeGetTime isn't very accurate though, so you could try using a high-res timer. Look up QueryPerformanceCounter and QueryPerformanceFrequency on MSDN.
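For reference, a minimal sketch of a high-resolution timer along those lines (the TimerInit/TimerElapsedMs names and the globals are just illustrative, not anyone's actual code):

#include <windows.h>

// Measures elapsed milliseconds between calls using the
// high-resolution performance counter.
LARGE_INTEGER g_freq, g_last;

void TimerInit()
{
    QueryPerformanceFrequency(&g_freq);   // counter ticks per second
    QueryPerformanceCounter(&g_last);
}

// Returns milliseconds elapsed since the previous call.
double TimerElapsedMs()
{
    LARGE_INTEGER now;
    QueryPerformanceCounter(&now);
    double ms = (double)(now.QuadPart - g_last.QuadPart) * 1000.0 / (double)g_freq.QuadPart;
    g_last = now;
    return ms;
}

You would call TimerInit() once at startup and TimerElapsedMs() once per frame in place of the timeGetTime() difference.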

____________________________________________________________
www.elf-stone.com

Sorry, I made a mistake: my sprite doesn't move too fast, it moves very slowly...

so if I do this,

spritetick += tickdiff;

if (spritetick >= 1000)
{
    spritePosX++;
    spritetick -= 1000; // reset sprite tick
}

the sprite doesn't move within one second, it takes around 4 seconds or so,

i.e., the spritetick variable only increments by 10 to 30 at a time,

what is wrong here?

thanks!


Share this post


Link to post
Share on other sites
oh and one more thing,

so is this formula correct?

1000 / tickdiff = Frames per second
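As a quick sanity check (just a worked example): if tickdiff is 16 ms, then 1000 / 16 ≈ 62.5 frames per second, and if tickdiff is 20 ms, 1000 / 20 = 50 FPS, which matches the 50 - 70 range mentioned above.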


  
// FPS calculations

__int64 end_count, start_count, dClockFrequency;
QueryPerformanceCounter( (LARGE_INTEGER*)&start_count);

// Timer related.

QueryPerformanceFrequency( (LARGE_INTEGER*)&dClockFrequency );

//glLoadIdentity();

glClear(GL_DEPTH_BUFFER_BIT | GL_COLOR_BUFFER_BIT);

pScene->Process();

// FPS calculations again

QueryPerformanceCounter((LARGE_INTEGER*)&end_count);
//find the time

int fps = (int)(1.0f / ((float)(end_count - start_count) / (float)dClockFrequency));

// print fps

glColor3f(1, 0, 0);
m_pText->DrawText(20, 20, "%d fps", fps);

SwapBuffers(((XGLSetup*)m_pRendererSetup)->GetDC());


-----------
my quote is under construction


    
// FPS calculations

__int64 end_count, start_count, dClockFrequency;
QueryPerformanceCounter( (LARGE_INTEGER*)&start_count);

// You can move this outside the loop so it isn't called every frame

QueryPerformanceFrequency( (LARGE_INTEGER*)&dClockFrequency );

glClear(GL_DEPTH_BUFFER_BIT | GL_COLOR_BUFFER_BIT);

pScene->Process();

// FPS calculations again

QueryPerformanceCounter((LARGE_INTEGER*)&end_count);
//find the time

int fps = (int)(1.0f / ((float)(end_count - start_count) / (float)dClockFrequency));

// print fps

glColor3f(1, 0, 0);
m_pText->DrawText(20, 20, "%d fps", fps);

SwapBuffers(m_hDC);


sorry for the double post
-----------
my quote is under construction

[edited by - mentat on April 19, 2002 12:23:54 PM]

quote:
Original post by mickey
oh and one more thing,

so is this formula correct?

1000 / tickdiff = Frames per second




Well, you'd commented that bit out, so I assumed it wasn't part of the code. I was talking about the sprite timing bit when I said it should work.

To get FPS you want to divide your frames (1) by your seconds (strangely enough), so

float FPS = 1.0f / ((float)timedif / 1000.0f); // equivalent to 1000.0f / timedif

You'll get vastly inaccurate results if you use timeGetTime for that, though. You could solve that by calculating the FPS across 10 or more frames, or you could use the performance counter (as I suggested above).
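A minimal sketch of that averaging idea, assuming timeGetTime() is used for the timing (include <mmsystem.h> and link winmm.lib; the names are just illustrative):

// Average the frame rate over SAMPLE_FRAMES frames to smooth out
// timeGetTime()'s limited resolution.
const int SAMPLE_FRAMES = 30;
DWORD g_sampleStart = 0;       // set to timeGetTime() once at startup
int   g_frameCount  = 0;
float g_fps         = 0.0f;

void UpdateFPS()
{
    ++g_frameCount;
    if (g_frameCount == SAMPLE_FRAMES)
    {
        DWORD elapsedMs = timeGetTime() - g_sampleStart;
        g_fps = (float)SAMPLE_FRAMES * 1000.0f / (float)elapsedMs;
        g_frameCount  = 0;
        g_sampleStart = timeGetTime();
    }
}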

____________________________________________________________
www.elf-stone.com

quote:
Original post by mentat
sorry for the double post

You saw the edit button, but you didn't see the Delete Post check-box??

John B

Hi guys, I got it working, it was my mistake. Anyway, please, one last question...

Why does this statement return 0?

x = tickDiff / 1000;
// x is now 0

whereas I need to do this for it to give the right result:
x = tickDiff / 1000.0f;

My tickDiff is declared as
DWORD tickDiff;

Thanks, guys.

mentat: thanks for your code too. In what way is it better than using timeGetTime(), when timeGetTime() already returns milliseconds?

quote:
Original post by mickey
mentat: thanks for your code too. In what way is it better than using timeGetTime(), when timeGetTime() already returns milliseconds?

timeGetTime returns a value in milliseconds, but it is not accurate to the millisecond.
And depending on the hardware, QueryPerformanceCounter can be accurate to much better than a millisecond (on my machine, QueryPerformanceFrequency tells me I'm getting 3579545 ticks per second).
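In other words (a small worked example): at 3579545 ticks per second, one tick is roughly 0.28 microseconds, and a span of counter values converts to milliseconds as

elapsed_ms = (end_count - start_count) * 1000.0 / ticks_per_second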

John B

You get zero because you're doing integer division. When tickDiff / 1000 is evaluated, the type of the result is the same as the type of the operands, which is DWORD. Since there's no way of storing fractional values in a DWORD, the fractional part is discarded and you get zero. Only when the assignment is evaluated is the zero converted to a floating-point 0.0. To do floating-point division, you have to make one of the operands a float before the division takes place.
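A couple of lines to illustrate, with a hypothetical tickDiff value:

DWORD tickDiff = 16;              // e.g. 16 ms between frames
float a = tickDiff / 1000;        // integer division: 16 / 1000 == 0, then converted to 0.0f
float b = tickDiff / 1000.0f;     // tickDiff is promoted to float: b == 0.016f
float c = (float)tickDiff / 1000; // same idea, cast first: c == 0.016f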


if (timeGetTime() > LastTime + 1000)
{
    FrameRate    = FrameCounter;
    FrameCounter = 0;
    DeltaTime    = 1.0f / (float)FrameRate; // float; average seconds per frame over the last second
    LastTime     = timeGetTime();
}



use FrameCounter++; just after each page flip or present etc.

SpeedPerFrame = SpeedPerSecond*DeltaTime;

This works through slow-downs and averages out per-frame blips; it's known as ''variable step timing''.
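Tying that back to the sprite from the original post, a minimal sketch of the whole idea (spriteX, spriteSpeed and OnFrame are just illustrative names, and it assumes FrameRate never ends up zero):

DWORD LastTime     = 0;        // set to timeGetTime() once at startup
int   FrameCounter = 0;
float FrameRate    = 60.0f;    // reasonable starting guess
float DeltaTime    = 1.0f / 60.0f;

float spriteX     = 0.0f;
float spriteSpeed = 100.0f;    // pixels per second

void OnFrame()
{
    // render, then flip/present...
    FrameCounter++;

    if (timeGetTime() > LastTime + 1000)
    {
        FrameRate    = (float)FrameCounter;
        FrameCounter = 0;
        DeltaTime    = 1.0f / FrameRate;   // average seconds per frame over the last second
        LastTime     = timeGetTime();
    }

    spriteX += spriteSpeed * DeltaTime;    // SpeedPerFrame = SpeedPerSecond * DeltaTime
}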

,Jay


Should also mention:

The performance counter is buggy on some chipsets (search MSDN for the list); it's to do with the PCI bridge causing it to jump by up to several seconds at once.

The performance counter is not guaranteed to be available on all hardware, although x86 PCs all have one.

I'd use timeGetTime; it's accurate to approximately 10ms (MS says 1ms) and that should be enough for any game.
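One commonly used way to tighten that up (an optional sketch; timeBeginPeriod/timeEndPeriod are real winmm calls, and this assumes you link winmm.lib) is to request 1 ms timer resolution for the lifetime of the game:

#include <windows.h>
#include <mmsystem.h>            // timeGetTime, timeBeginPeriod, timeEndPeriod
#pragma comment(lib, "winmm.lib")

void RunGame()
{
    timeBeginPeriod(1);          // ask for 1 ms resolution for timeGetTime()
    // ... main game loop using timeGetTime() ...
    timeEndPeriod(1);            // restore the previous resolution on exit
}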

,Jay

indirectX, ahh, I see, thanks!

Well, thanks so much for all your help, guys! And yeah, when I compared timeGetTime against a real second, it ran slightly fast.
