FPS drops dramatically... can someone explain why?

8 comments, last by Spartacus 21 years, 9 months ago
Hey! Can someone explain to me why my FPS drops by 200 just because I draw a textured quad that covers all of the screen? Is that normal? So what happens when I start to add sprites and so on (I'm doing a 2D game, btw)? Then my FPS will go down like crazy. Is there anything I can do to optimize this? Thanks, René

Real programmers don't document, if it was hard to write it should be hard to understand

Not sure what you're expecting... Drawing a "textured quad that covers all of the screen" isn't exactly "cheap". The bigger the screen is, the more pixels need to be filled.

No matter which way you do it, the graphics card has to fill up those pixels.
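
To put rough numbers on that, here's a minimal C++ sketch of how the pixel (and byte) count of one full-screen fill grows with resolution. The 32-bit color depth and the resolutions are just example figures, not anything from this thread:

    #include <cstdio>

    int main()
    {
        // One full-screen fill touches every pixel once; at 32bpp
        // that is 4 bytes per pixel (example figures).
        const long bytesPerPixel = 4;
        const long widths[]  = { 640, 800, 1024, 1600 };
        const long heights[] = { 480, 600,  768, 1200 };

        for (int i = 0; i < 4; ++i)
        {
            long pixels = widths[i] * heights[i];
            printf("%4ldx%-4ld: %7ld pixels, %8ld bytes per fill\n",
                   widths[i], heights[i], pixels, pixels * bytesPerPixel);
        }
        return 0;
    }

Going from 640x480 to 1600x1200 is more than six times the pixels, so the same "one quad" costs the card six times the fill work.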


~~~~
Kami no Itte ga ore ni zettai naru! (The Hand of God will surely be mine!)
God does not play dice!
Thanks for the quick reply! I know filling the whole screen is expensive, but I didn't expect it to be that expensive. I just thought 200 fps was a lot, but if that's the way it is, I suppose there is not much I can do about it.

René


Real programmers don't document, if it was hard to write it should be hard to understand



From my personal experience, I lose about 10 fps for every model I add to the scene (just cylinders). The window size is 800x600. With 23 of these things running around, I get about 350 fps.

I started with an empty world (~900 fps), then added a "field", which dropped it to 450 fps.


~~~~
Kami no Itte ga ore ni zettai naru! (The Hand of God will surely be mine!)
God does not play dice!
FPS rates are not exactly a good measure of what's going on. At 1000 fps, a 200 fps drop is about a 0.00025 second increase in rendering time for each frame (25 percent of 0.001 s, the time it takes to render one frame at 1000 fps). At 400 fps, the render-time increase from a 200 fps drop is 0.0025 s (100 percent of 0.0025 s, the per-frame render time at 400 fps) - ten times the per-frame time increase at 1000 fps, and four times the percentage increase. Just quoting the fps delta is really quite worthless, as it's a somewhat hyperbolic measure. The per-frame render time would be more useful in diagnosing the effect of a certain item on how fast things are going.

As a rather extreme example, going from 100 fps to 150 is not nearly as great an increase as going from 10 fps to 60, although the fps delta is the same in both cases.
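
To make that concrete, here's a minimal C++ sketch (using just the illustrative fps figures from above) that converts fps deltas into per-frame time deltas:

    #include <cstdio>

    // Per-frame time in milliseconds for a given frame rate.
    double frameTimeMs(double fps) { return 1000.0 / fps; }

    int main()
    {
        // Both pairs below are "a 200 fps drop", yet the added cost
        // per frame differs by a factor of ten.
        printf("1000 -> 800 fps: +%.3f ms per frame\n",
               frameTimeMs(800.0) - frameTimeMs(1000.0)); // +0.250 ms
        printf(" 400 -> 200 fps: +%.3f ms per frame\n",
               frameTimeMs(200.0) - frameTimeMs(400.0));  // +2.500 ms
        return 0;
    }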

[edited by - Assassin on July 15, 2002 5:53:05 PM]

[edited by - Assassin on July 16, 2002 1:06:30 AM]
Assassin, aka RedBeard. andyc.org
200 fps down from how many?
Well, just look at some very simple numbers.
If your card's fill rate is 3.2 GB/s and you're using 1024x768x32bpp mode, your screen is 1024*768*4 bytes -> ~800k pixels * 4 -> ~3.2 MB in size. So you can perform a simple fill of the whole screen memory about 1000 times per second. That number gets much lower once you notice that achieving 100% of theoretical performance isn't possible in real life, and you add a big penalty for reading/writing the z-buffer... you might end up filling your screen only 500 times per second with a graphics card whose fill rate is 3.2 GB/s. So drawing only 10 fullscreen quads will make you run at about 50 fps. But at the same time you can probably draw 10k quads where 1k of them fill one screen... so in your example you are very much fill-rate limited.
The other limitation is triangle throughput, where you're trying to draw 100k polys per second, and even when they take very little area on screen your graphics card takes a big performance hit... there are many reasons for that (including the quite slow transfer of vertices etc. from system memory).
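
Here's that back-of-the-envelope fill-rate budget as a small C++ sketch, using the same example figures (3.2 GB/s, 1024x768x32, and a guessed ~50% real-world overhead factor):

    #include <cstdio>

    int main()
    {
        // Example figures from the post, not measured values.
        const double bandwidth  = 3.2e9;                  // bytes per second
        const double frameBytes = 1024.0 * 768.0 * 4.0;   // one 32bpp screen
        const double idealFills = bandwidth / frameBytes; // theoretical ceiling
        const double realFills  = idealFills * 0.5;       // after z-buffer etc.

        printf("ideal full-screen fills/sec: %.0f\n", idealFills);      // ~1017
        printf("realistic fills/sec (~50%%): %.0f\n", realFills);       // ~509
        printf("fps at 10 fills per frame:   %.0f\n", realFills / 10.0); // ~51
        return 0;
    }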
With best regards, Mirek Czerwiński
assassin: I think your example should go the other way around. An increase from 100 to 150 fps isn't as great as one from 10 to 60 fps. In the first example you get a 50% fps increase, while in the second you get a 500% increase.
With best regards, Mirek Czerwiński
No, I agree with Assassin. I start out with a screen with nothing on it and I get 300 fps. I draw 1 model on it, I get 260. Draw 2, I get 250; 3, 240. When I have a fully textured and modeled world going, I'm at about 40 fps. Frame rate is like an exponential (or maybe hyperbolic) function: the lower you go, the bigger the difference and the harder it is to go lower. At high frame rates, frames mean nothing; at low frame rates, they mean everything.

"Love all, trust a few. Do wrong to none." - Shakespeare

Dirge - Aurelio Reis
www.CodeFortress.com
Current Causes:
Nissan sues Nissan
"Artificial Intelligence: the art of making computers that behave like the ones in movies."www.CodeFortress.com
dirge: gosh, you have totally missed Assassin's point (hope I didn't!)
The fps changes come from a very simple fact.
Let's say one screen fill takes 2 ms.
Let's also say that clearing the screen and all the other per-frame work the graphics card has to do takes 2 ms as well.
1 ms = 1/1000 s, so 2 ms = 1/500 s.
So just displaying an fps counter might show you 500 fps.
If we fill the whole screen once, we additionally use 2 ms every frame, so our average frame takes 2+2 ms = 4 ms = 1/250 s, and we get 250 fps. A drop of 250 fps!
If we fill the screen twice per frame, we get 2+2*2 ms = 6 ms = 1/166 s - a drop of 84 fps.
If we fill the screen three times per frame, we get 2+2*3 ms = 8 ms = 1/125 s - a drop of 41 fps.
So you see, it's all just some simple math.
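
The same arithmetic as a small C++ sketch (the 2 ms figures are the example values above, not measurements):

    #include <cstdio>

    int main()
    {
        // Fixed per-frame overhead plus a constant cost per
        // full-screen fill. Watch the fps *drop* shrink even
        // though every extra fill costs the same 2 ms.
        const double baseMs = 2.0; // clear + other per-frame work
        const double fillMs = 2.0; // one full-screen fill

        double prevFps = 1000.0 / baseMs; // 500 fps with no fills
        for (int fills = 1; fills <= 3; ++fills)
        {
            double fps = 1000.0 / (baseMs + fillMs * fills);
            printf("%d fill(s): %5.1f fps (drop of %5.1f)\n",
                   fills, fps, prevFps - fps);
            prevFps = fps;
        }
        return 0;
    }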

PS. For many reasons it's far from being as simple as shown here, because it's not easy to predict exactly how the CPU and graphics card will talk to each other and how fast it all will run. Although it's quite possible to get good predictions with simple examples like these, in a real-life game it's very hard to predict everything that happens in a PC, especially because different components - CPU, GPU, chipset, etc. - often behave differently.
With best regards, Mirek Czerwiński
Hey, thanks for the help everyone! This really cleared something up for me! After reading these replies, I really don't find it very strange that my FPS drops by 200 just because I fill the whole screen.

Thanks!
René


Real programmers don't document, if it was hard to write it should be hard to understand



This topic is closed to new replies.
