hehe - I had a bizarre case once where GDI actually increased my frame rate....
For some reason, whenever I turned my quick and dirty fps counter off, my framerate plummeted to around 1 fps. (i.e., it was so slow you didn't need an fps counter to measure it)
I never did figure out why.... I think it was something to do with my network code.
Quick and Dirty fps display
Alright, for now I'm just dumping the framerate to a file. I'll try c3dfont tonight. Here's what's bothering me though:
I am writing a single triangle to the display, doing practically nothing else (other than checking Windows messages etc.), and only getting ~60-70fps. Is there something in my initialization code that could slow everything down? I have a Radeon VE 32MB, 128MB RAM and a 450MHz processor, so I think it should be able to go faster than that?
I am wondering if there are any flags in initialization that have to be set or cleared to speed up rendering. Some people boast that a complete scene with hundreds of triangles renders at >200fps. What gives?
Hmmm...
Could be how you're computing the fps.
For example, the actual time it takes to get to the screen will be a function of the vertical refresh frequency of the monitor. An LCD runs at 60Hz, so the actual framerate never exceeds 60. If you time just the processing, then you will be timing the processing time of the frame, regardless of the screen's refresh rate.
Take a look at how you are computing the rate. Also, are you using the high-performance timers? That will make a difference as well. I think there was a recent GameDev article called "frame rate independent animation" that talked about the timers.
Another thing to watch out for: don't do any initialisation during your main loop. I know some people have tried loading their graphics etc. every frame, which is, undeniably, slow.
Are you sure you're not running up against the refresh rate and just waiting for the flip with your one-triangle thing?
Jack
Yeah, I was running into the 70Hz refresh rate of the monitor, but only in fullscreen mode. I am now rendering ~33,000 triangles/sec windowed. Is that acceptable?
I don't know how to get an fps rate above 70 (actually 76) in fullscreen 'cause my code syncs with the monitor. Anyone have a workaround for this? It would be nice not to have to wait for the refresh and instead have that time for unit movement etc. Thanks.
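For what it's worth, a hedged sketch of one way to ask the driver not to wait for the retrace, assuming DirectX 8 (the field and constant names are from the DX8 SDK; check your own SDK version, and note the driver control panel can still force VSync on):

```cpp
// Hypothetical DirectX 8 fullscreen setup fragment. Setting the
// presentation interval to IMMEDIATE asks the driver to present
// without waiting for the vertical retrace.
D3DPRESENT_PARAMETERS d3dpp;
ZeroMemory(&d3dpp, sizeof(d3dpp));
d3dpp.Windowed         = FALSE;
d3dpp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
d3dpp.BackBufferWidth  = 640;
d3dpp.BackBufferHeight = 480;
d3dpp.BackBufferFormat = D3DFMT_X8R8G8B8;
// Don't wait for VSync in fullscreen:
d3dpp.FullScreen_PresentationInterval = D3DPRESENT_INTERVAL_IMMEDIATE;
```

Be aware that presenting faster than the refresh rate causes tearing, so this is mostly useful for benchmarking rather than the shipped game.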
Most drivers will allow you to disable the VSync, at least for OGL. Seeing as we're using DX, I'm not quite sure - all I know is that my NVidia Detonator3 drivers don't have VSync enabled and for that I am thankful!
But... you could see how much time it actually took for each frame, find the average, and divide 1 second by it. This way you could see how fast your game *could* have run if it were not for VSync.