You can calculate the number of polygons your computer or a system can handle

Published October 04, 2013
Graphics cards have a specification called triangles/clock: the number of triangles the hardware can set up in one clock cycle. The theoretical number of triangles a graphics card can handle per second is the triangles/clock multiplied by the core clock speed of the graphics processor.
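Here is that math as a quick Python sketch. The function names are mine, and the triangles/clock value is something you have to look up for your own card:

def triangles_per_second(tris_per_clock, core_clock_hz):
    # Theoretical peak throughput: setup rate multiplied by clock speed.
    return tris_per_clock * core_clock_hz

def triangles_per_frame(tris_per_clock, core_clock_hz, fps):
    # Per-frame triangle budget at a target frame rate.
    return triangles_per_second(tris_per_clock, core_clock_hz) / fps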

I think my GTX 570 card at stock clock can handle up to 732 million polygons per second, or about 12 million polygons per frame at 60 frames per second.
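That works out like this, assuming a setup rate of 1 triangle/clock (an assumption on my part; the 732 MHz stock core clock is what makes the numbers line up):

# GTX 570: 732 MHz stock core clock, assuming 1 triangle/clock.
per_second = 1 * 732_000_000       # 732 million triangles/second
per_frame = per_second / 60        # ~12.2 million triangles per frame
print(f"{per_second:,} tri/s, {per_frame:,.0f} tri/frame at 60 fps")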

While it can handle 12 million polygons per frame at 60 frames per second, I really shouldn't go that high. I have heard that each render pass you do causes the graphics card to basically process the geometry over again, so a 20k-polygon model drawn in two or three passes might effectively cost 40-60k polygons.
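Treating the pass count as a simple multiplier (a rough model, not an exact one):

# Each extra render pass re-submits the geometry, multiplying its cost.
def effective_polygons(model_polys, passes):
    return model_polys * passes

print(effective_polygons(20_000, 2))   # 40000
print(effective_polygons(20_000, 3))   # 60000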

The Wii U I plan on developing for can handle about 9 million polygons per frame at 60 frames per second, if I have figured correctly, but I plan on using only 1.1 million polygons total.
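Those figures fall out of the same formula, assuming the commonly reported ~550 MHz Wii U GPU clock and 1 triangle/clock (both assumptions on my part):

# Wii U: assuming a 550 MHz GPU clock and 1 triangle/clock.
per_frame = (1 * 550_000_000) / 60   # ~9.17 million triangles per frame
budget = 1_100_000                    # my planned scene budget
print(f"{per_frame:,.0f} tri/frame, {per_frame / budget:.1f}x headroom over my budget")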

You can usually search for how many triangles/clock your graphics card can handle, or a system like the Wii U, but results are limited and searching can be hard. If anyone wants to know what their card or a system can handle, let me know the graphics card or system and I will try to tell you what it can do.

Low-polygon development is best, but there are situations and developers that call for a large number of polygons.

Comments

Bacterius

Yep, those figures sound about right, but are really kind of meaningless in some sense because what they really mean is "this is how many triangles per second you can push onto the screen with a single pass, no shading, no postprocessing, no nothing". In practice you want to stay well below the theoretical maximum if you're doing anything beyond rasterizing white polygons (iirc the original Crysis on "ultra" settings hovered around 2-3 million triangles per frame, which is a good point of reference in my opinion).

Arguably though poly count is not as relevant as it used to be, as I am told most games nowadays are limited by either the processor or the pixel shading stage, and many tricks to fake high resolution models have been developed, but knowing the limits of the graphics card - even as a rough estimate - is still important.

October 05, 2013 01:03 AM