You can estimate the number of polygons your computer or a console can handle.
Graphics cards have a spec called triangles/clock: the peak number of triangles the GPU can set up per clock cycle. Multiply that by the core clock speed of the graphics processor to get the theoretical number of triangles the card can handle per second.
I think my GTX 570 at its 732 MHz stock clock can handle up to 732 million polygons per second, or about 12 million polygons per frame at 60 frames per second.
While it can handle 12 million polygons per frame at 60 frames per second, I really shouldn't go that high. I have heard that each render pass makes the graphics card process the model again, so a 20k-polygon model might effectively cost 40-60k polygons.
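The estimate above can be sketched in a few lines. This is a rough back-of-the-envelope calculation, assuming a setup rate of 1 triangle/clock and the commonly reported clock speeds (732 MHz for the GTX 570, roughly 550 MHz for the Wii U GPU); real throughput depends on the rest of the pipeline, not just triangle setup:

```python
def polys_per_frame(tris_per_clock, clock_hz, fps=60, passes=1):
    """Theoretical polygon budget per frame.

    tris_per_clock: triangle setup rate of the GPU (often 1)
    clock_hz:       core clock speed in Hz
    passes:         each extra render pass re-submits the geometry,
                    multiplying the effective polygon cost
    """
    per_second = tris_per_clock * clock_hz
    return per_second / fps / passes

# GTX 570 at its 732 MHz stock clock: ~12.2 million polygons per frame
print(f"GTX 570: {polys_per_frame(1, 732_000_000) / 1e6:.1f}M polys/frame")

# Wii U GPU at an assumed 550 MHz: ~9.2 million polygons per frame
print(f"Wii U:   {polys_per_frame(1, 550_000_000) / 1e6:.1f}M polys/frame")

# Two passes halve the usable budget per frame
print(f"2 passes: {polys_per_frame(1, 732_000_000, passes=2) / 1e6:.1f}M polys/frame")
```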
If I have figured correctly, the Wii U I plan on developing for can handle about 9 million polygons per frame at 60 frames per second, but I only plan on using about 1.1 million polygons total.
You can usually search for the triangles/clock figure of your graphics card, or of a console like the Wii U, but results are limited, so searching can be hard. If anyone wants to know what their card or console can handle, let me know the model and I will try to work it out.
Low-polygon development is best, but some situations and projects do call for a large polygon count.