
"Standard" Resolution Performance

Tyler Camp    134

One of my professors has made the claim that using a "standard" resolution (e.g. 1920x1080, 1024x768, etc.) will provide better performance than using a "non-standard" resolution (1500x755, etc.). I've never heard of this before and can't seem to find anything to back his claims. I know a lot of console games render to lower (and "non-standard") resolutions and then upscale for better performance, which is the opposite of what he's stated. I don't know what he really means by "resolution" and he hasn't clarified. (The display resolution? The resolution of any framebuffer that is being rendered to?) He didn't say this was specific to any platform, but I was in an XNA-on-Windows class at the time.

 

I've tried benchmarking it myself (basic 2D sprite rendering via XNA, fullscreen with different backbuffer resolutions) and didn't see any performance penalties/gains that were out of the ordinary. Asking him about it just got the response of "well, it only happens for more advanced graphics algorithms." Has anyone else heard anything like this?
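
Something along these lines, by the way (a simplified sketch of the kind of test I mean, not my exact code; the resolution list and the 5-second timing window are just examples):

// Simplified sketch: cycle through back buffer sizes and log the average
// frame rate for each. Sprite drawing goes in Draw() as usual.
using Microsoft.Xna.Framework;

public class ResolutionBench : Game
{
    GraphicsDeviceManager graphics;
    static readonly Point[] Resolutions =
    {
        new Point(1920, 1080),  // "standard"
        new Point(1500, 755),   // "non-standard"
    };
    int current;
    int frames;
    double seconds;

    public ResolutionBench()
    {
        graphics = new GraphicsDeviceManager(this);
        graphics.IsFullScreen = true;
        graphics.SynchronizeWithVerticalRetrace = false;  // uncap the frame rate
        graphics.PreferredBackBufferWidth = Resolutions[0].X;
        graphics.PreferredBackBufferHeight = Resolutions[0].Y;
        IsFixedTimeStep = false;
    }

    protected override void Update(GameTime gameTime)
    {
        seconds += gameTime.ElapsedGameTime.TotalSeconds;
        frames++;
        if (seconds >= 5.0)  // report and switch every 5 seconds
        {
            System.Diagnostics.Debug.WriteLine(
                Resolutions[current] + ": " + (frames / seconds) + " fps");
            current = (current + 1) % Resolutions.Length;
            graphics.PreferredBackBufferWidth = Resolutions[current].X;
            graphics.PreferredBackBufferHeight = Resolutions[current].Y;
            graphics.ApplyChanges();
            seconds = 0;
            frames = 0;
        }
        base.Update(gameTime);
    }
}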


MilchoPenchev    1178

I don't think there's anything special about the 'standard' resolutions. There are a ton of standard resolutions, given all the monitor sizes and aspect ratios.

As far as I know, the graphics pipeline doesn't optimize anything based on what resolution you're rendering at.

 

There are two somewhat related things that I know can sometimes make a difference. One is using textures that are square with dimensions that are a power of 2, which was only a big deal on older graphics cards.
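
For example (just an illustrative helper, nothing from a real API):

// Illustrative helper only: round a texture dimension up to the next
// power of two (e.g. 1500 -> 2048), which is effectively what older
// hardware wanted.
static int NextPowerOfTwo(int n)
{
    int p = 1;
    while (p < n)
        p <<= 1;
    return p;
}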

 

The other thing is that running a game in fullscreen (not the fake borderless-window style, but actual exclusive fullscreen) gives a small performance boost, since the OS can optimize how the frame gets to the screen.
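
In XNA terms that's roughly the following, assuming "graphics" is your GraphicsDeviceManager:

// Exclusive fullscreen in XNA; a borderless window, by contrast,
// is still composited like any other window.
graphics.IsFullScreen = true;
graphics.ApplyChanges();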

cowsarenotevil    3005
The best rule of thumb I can think of here would be to prefer outputting at the monitor's maximum supported resolution (which should be its native resolution).
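
In XNA you could do something like this (a sketch; it assumes the largest supported mode really is the panel's native one, which isn't guaranteed):

// Sketch: pick the adapter's largest supported mode, which on a typical
// single-monitor setup should be the panel's native resolution.
DisplayMode best = GraphicsAdapter.DefaultAdapter.CurrentDisplayMode;
foreach (DisplayMode mode in GraphicsAdapter.DefaultAdapter.SupportedDisplayModes)
{
    if (mode.Width * mode.Height > best.Width * best.Height)
        best = mode;
}
graphics.PreferredBackBufferWidth = best.Width;
graphics.PreferredBackBufferHeight = best.Height;
graphics.IsFullScreen = true;
graphics.ApplyChanges();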

 

Even that's not always perfect, though. I've run into quite a few LCD projectors, for instance, that "support" higher resolutions than they can actually display and simply downsample the input signal. Worse yet, the downsampling was very crude, yet just "good" enough that it was clearly supposed to be a "feature."


Asking him about it just got the response of "well, it only happens for more advanced graphics algorithms."

I'd ask him for an example of such "advanced graphics algorithms".

 

Frankly it sounds like he doesn't know what he's talking about, or accidentally said something he didn't mean and has low enough self-esteem that he can't bring himself to back out of it.

 

It's plausible that some algos would work best for resolutions exhibiting certain qualities, like being divisible by a certain number. And for a final render target that is a certain resolution, some lower resolutions will upscale to it more cleanly than others. But a resolution being "standard" has little to do with it.
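
For example, a hypothetical check for a clean integer upscale factor:

// Hypothetical check: does a lower render resolution upscale to the target
// by a whole-number factor? 960x540 -> 1920x1080 is exactly 2x;
// 1500x755 -> 1920x1080 is not.
static bool ScalesCleanly(int srcW, int srcH, int dstW, int dstH)
{
    return dstW % srcW == 0
        && dstH % srcH == 0
        && dstW / srcW == dstH / srcH;  // same factor on both axes
}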

mhagain    13430

 

Asking him about it just got the response of "well, it only happens for more advanced graphics algorithms."

I'd ask him for an example of such "advanced graphics algorithms".

 

Frankly it sounds like he doesn't know what he's talking about, or accidentally said something he didn't mean and has low enough self-esteem that he can't bring himself to back out of it.

 

Precisely.  This sounds exactly like the kind of theoretical nonsense academics sometimes come out with.  If it was the case then the API documentation would be shouting about it, the hardware vendors would be shouting about it, John Carmack would be tweeting about it, Johan Andersson would be blogging about it.

 

The fact that those who define pretty basic stuff such as how APIs and hardware actually work, and those who have a track record of actually using this stuff in the field for real programs that real people use, are not doing so is evidence enough.  It's nonsense.

MarkS    3502

I remember a time when using a non-standard resolution required the CPU to scale the image to fit the monitor. But this was many years ago. If I'm not mistaken, most LCDs today have built-in hardware to scale the incoming image to the monitor's native resolution.

marcClintDion    435

There are too many different resolutions available for this to be entirely true these days.  Maybe in the past, when there were only a few available resolutions, this would sometimes have been the case.

 

These days there must be at least a dozen or even two dozen common screen sizes available for desktops and laptops, so it seems to me that the best optimizations would be the ones which are resolution independent.

 

GPU manufacturers go out of their way to avoid this issue by abstracting away the notion of pixels.  This is why the fragment processor is called the fragment processor. 

The word 'fragment' is an abstraction of the word 'pixel'.   It implies that the pixel information is unknown before the GPU is plugged into a specific machine.

Fragment is used in place of pixel because 'fragment' is resolution and screen size agnostic.

 

A console may possibly have hardware and driver optimizations for 720p and 1080p, since these are common.

A desktop GPU will certainly make no assumptions about such things.

 

So far as game programmers go??? Holy merd!  This would be like the days before GPU APIs existed, when people had to program for every possible graphics card.

Two-thirds of your software would be conditional switches and branches and code that isn't even being used on the current machine.

There must be at least 100 distinct GPUs by now, with dozens of sub-versions for many of them.

 

That sounds like a nightmare.  I'd rather pluck out all my eyebrow hairs with bolt-cutters than worry about something like this.  Then again, some people like to study what comes out of the rear-ends of animals.  There is something for everyone. 

 
swiftcoder    18432


If I'm not mistaken, most LCDs today have built-in hardware to scale the incoming image to the monitor's native resolution.

Most, but not all by any stretch (pun unintended).

 

I tend to buy those quad-def IPS monitors from S. Korea, and they explicitly do not include a scaler. The downside is that they won't work as a TV, but on the upside you cut down on lag, and desktop GPUs tend to do pretty well at scaling anyway.

Tyler Camp    134

Got a lot more responses than I initially thought I would; the whole thing seemed especially weird to me because another part of his reasoning for my tests not showing anything was that "In 2d (only GDI layer) the resolution is less dependent on hardware specs". I really doubted that XNA used GDI for 2D.

 

Thanks everyone for the info. The part about TV latency sounds like it might have been what he was referring to, and I'll keep that in mind for the future.

mhagain    13430

Got a lot more responses than I initially thought I would; the whole thing seemed especially weird to me because another part of his reasoning for my tests not showing anything was that "In 2d (only GDI layer) the resolution is less dependent on hardware specs". I really doubted that XNA used GDI for 2D.

 

You can be quite certain he's talking nonsense then, as the only real difference between 2D and 3D (assuming the same API is used for both) is the projection matrix used.
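
For instance, in XNA terms the "difference" boils down to something like this (a sketch, assuming you're inside a Game and can grab GraphicsDevice.Viewport):

// Same device, same draw calls; only the projection matrix differs.
Viewport viewport = GraphicsDevice.Viewport;

// Typical "2D" projection: pixel coordinates, no perspective
// (essentially what SpriteBatch sets up internally).
Matrix ortho = Matrix.CreateOrthographicOffCenter(
    0, viewport.Width, viewport.Height, 0, 0, 1);

// Typical "3D" projection:
Matrix persp = Matrix.CreatePerspectiveFieldOfView(
    MathHelper.PiOver4, viewport.AspectRatio, 0.1f, 1000f);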


Thanks everyone for the info. The part about TV latency sounds like it might have been what he was referring to, and I'll keep that in mind for the future.

There's no way he was referring to TV latency. Or if he was, he's even more clueless than he initially sounded, because 2D/3D rendering and the choice of APIs have nothing to do with TV latency.

 

Seriously, ask him what those "advanced graphics algorithms" are. Ask for names. Names of algorithms, names of researchers, names of books or other publications. If he "doesn't remember", ask him exactly what problems those algorithms solve. If you manage to get any answers, we'll look them up.


The other thing is that running a game in fullscreen (not the fake borderless-window style, but actual exclusive fullscreen) gives a small performance boost, since the OS can optimize how the frame gets to the screen.

This had to do with DirectX and GDI constantly conflicting with each other (since GDI had no idea about DirectX at all). Basically, in windowed mode DirectDraw and Direct3D were required to copy the back buffer to the screen no matter what, while in fullscreen they could just swap buffers since GDI wouldn't be drawing anything. They changed that in Vista, and since then the performance difference is gone (the change they made is also why Direct3D 10 can finally keep its resources around when switching out of fullscreen, something earlier versions couldn't).


While I haven't been to college, I do know that those in academia do not like to be challenged

There are individuals who do not like to be challenged. Someone like that teaching higher education is terrible at their job, and in the minority. That kind of attitude runs counter to the whole purpose of higher education, not just in principle but in practice. If it's prevalent at some institution, then that institution is bad.

 

I'm excited whenever a student challenges me. It shows they are engaged and actually trying to digest what I'm telling them. If they turn out to be mistaken, they are giving me a window to their thinking and enabling me to correct whatever they got wrong. If they turn out to be right, then they are doing something far more valuable, helping me plug the holes in my teaching and helping all of the other students.

 

The only times I've encountered any negativity from my own teachers for questioning something have been when those questions revealed that I hadn't done the prep work I was supposed to have done, and asking the questions was essentially wasting everyone's time. Some have standing offers to buy students coffee/candy/whatever if the student manages to find errors in the course materials. The last professor I corrected ended up giving me a job.

MarkS    3502
I apologize. I am not known for very eloquent speech. I should have written that as "...some in academia...". I know that not everyone is like that, but I've heard and read so many horror stories. I did not mean any offense.

I still believe and stand behind my statement that this professor is stuck in the past. There was a time when he would have been correct, but that time ended about a decade ago. To teach this as current fact is puzzling at best.

Tyler Camp    134

I'm hesitant to bring it up since I'm a freshman (from what I've heard/seen he has a "what I say is final in the classroom" sort of thing going on, and I can't imagine that being corrected by a freshman would end well) and the topic was actually brought up months ago.

 

There's a general consensus that what he proposed doesn't make sense, and I'll take that as the answer; I'll correct those who quote him on the subject and will continue to bring questionable ideas to you guys for clarification.

swiftcoder    18432


I'm hesitant to bring it up since I'm a freshman (from what I've heard/seen he has a "what I say is final in the classroom" sort of thing going on, and I can't imagine that being corrected by a freshman would end well)

Good call.

 

If you plan to become one of those students who argues with the professor, better be damn sure you are always right :)

Krypt0n    4721

Some hardware is optimized for the default output resolutions; sometimes it's the memory layout (tile size), and sometimes drivers have weird checks to work around limitations of the API (e.g. some resources on the GPU are limited and you only want to use them for frame buffers, but how do you know something is going to be a framebuffer if the API doesn't pass a flag? You check the resolution).

 

Some of those 'algorithms' might be aligned to 'tiles' as well. For example, deferred shading hardware might work on 32x32 tiles: a 640x480 display maps to 20x15 tiles, but change the resolution slightly (for whatever reason) to 641x470 and you've reduced the pixel count, yet increased the tile count to 21x15, and it might be slower.
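
To make the arithmetic concrete (a sketch with that hypothetical 32x32 tile size):

// Tiles needed to cover the frame buffer, rounding up on each axis.
static int TileCount(int width, int height, int tileSize)
{
    int tilesX = (width + tileSize - 1) / tileSize;
    int tilesY = (height + tileSize - 1) / tileSize;
    return tilesX * tilesY;
}

// TileCount(640, 480, 32) == 20 * 15 == 300 tiles
// TileCount(641, 470, 32) == 21 * 15 == 315 tiles -- fewer pixels, more tiles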

Tyler Camp    134

Looking back, I now remember how he claimed that DirectX would drop back to software-based rendering and stop using the GPU if you didn't use standard resolutions. At this point I don't know if he was just joking or really misinformed, but I'm now much more wary of his statements.
