
"Standard" Resolution Performance


20 replies to this topic

#1 tylercamp   Members   -  Reputation: 119


Posted 19 July 2013 - 08:14 AM

One of my professors has made the claim that using a "standard" resolution (i.e. 1920x1080, 1024x768, etc.) will provide better performance than using a "non-standard" resolution (1500x755, etc.). I've never heard of this before and can't seem to find anything to back his claims. I know a lot of console games render to lower (and "non-standard") resolutions and then upscale for better performance, which is the opposite of what he's stated. I don't know what he really means by "resolution" and he hasn't clarified. (The display resolution? The resolution of any framebuffer that is being rendered to?) He didn't say that this was specific to any platform, but I was in an XNA-on-Windows class at the time.

 

I've tried benchmarking it myself (basic 2D sprite rendering via XNA, fullscreen with different backbuffer resolutions) and didn't see any performance penalties/gains that were out of the ordinary. Asking him about it just got the response of "well, it only happens for more advanced graphics algorithms." Has anyone else heard anything like this?


Edited by tylercamp, 19 July 2013 - 08:16 AM.



#2 Milcho   Crossbones+   -  Reputation: 1171


Posted 19 July 2013 - 08:53 AM

I don't think there's anything special about the 'standard' resolutions. There are a ton of standard resolutions, given all the monitor sizes and aspect ratios.

As far as I know, the graphics pipeline doesn't optimize anywhere based on what resolution you're rendering at.

 

There are two somewhat related things that I know can sometimes make a difference. One is using textures that are square with dimensions that are a power of 2 - which was only a big deal on older graphics cards.
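For illustration, a minimal sketch of the usual "round up to the next power of two" padding those older cards wanted (the helper name is just an example):

    // Round a texture dimension up to the next power of two.
    unsigned NextPowerOfTwo(unsigned v) {
        unsigned p = 1;
        while (p < v) p <<= 1;
        return p;
    }
    // e.g. NextPowerOfTwo(755) == 1024 and NextPowerOfTwo(1500) == 2048, so a 1500x755
    // image would be padded into a 2048x1024 texture on hardware with that restriction.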

 

The other thing is that running a game in fullscreen (not the fake borderless-window style, but actual exclusive fullscreen) gives a small performance boost, since the OS can optimize how the frame is presented instead of compositing it like an ordinary window.



#3 Hodgman   Moderators   -  Reputation: 26967


Posted 19 July 2013 - 09:43 AM

It won't make a difference to performance unless you're on some platform that can only output video signals in a restricted set of resolutions.

e.g. a hypothetical current-gen console might be built to always output at 720p or 1080p, so if you internally render at a lower resolution, you have to explicitly pay the cost of resizing your framebuffer up to 720p. These systems might also provide hardware support to assist in this operation... but nonetheless, you will burn a fraction of a millisecond performing the scaling.

 

If we take what he's saying less literally and replace "performance" with "latency", then his claim might be slightly closer to the truth, at least when displaying on a television.

Some displays are optimized specifically for their native resolution only. If you send them a signal at a different resolution, they may be forced to internally rescale the signal to match the native resolution. This operation, when done by a television, often takes one frame's worth of time -- i.e. a 60Hz signal will have 16.6ms of latency added to it if rescaling is performed inside the TV.

Technically there is no performance penalty, because this rescaling happens "for free" in parallel, inside the television... but your overall input latency is hugely affected.

Modern HDTVs (especially early or cheap models) are often offenders in this category -- sometimes playing current-gen console games that render at 720p on a 1080p TV results in an extra 16.6ms of latency, which is ridiculous.

However, there's no way to query this. Some TVs won't add any extra latency regardless of the resolution. Some will add latency if you don't output a "standard" signal. Some will add latency if you don't output one specific signal (e.g. 1080p only). Some TVs will always add a huge amount of latency for no good reason!

The best rule of thumb I can think of here would be to prefer outputting at the monitor's maximum supported resolution (which should be its native resolution).
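For illustration only, here is a minimal sketch of one way to find that mode on Windows through DXGI: enumerate the first output's display modes and pick the largest, which on a desktop monitor is normally the panel's native resolution. The choice of adapter/output 0 and the R8G8B8A8 format are assumptions, and error handling is kept minimal:

    #include <dxgi.h>
    #include <vector>
    #include <cstdio>
    #pragma comment(lib, "dxgi.lib")

    int main() {
        IDXGIFactory* factory = nullptr;
        IDXGIAdapter* adapter = nullptr;
        IDXGIOutput*  output  = nullptr;
        if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory))) return 1;
        if (FAILED(factory->EnumAdapters(0, &adapter))) return 1;   // first adapter
        if (FAILED(adapter->EnumOutputs(0, &output)))   return 1;   // first monitor

        // Ask how many modes the output exposes for this format, then fetch them all.
        UINT count = 0;
        output->GetDisplayModeList(DXGI_FORMAT_R8G8B8A8_UNORM, 0, &count, nullptr);
        std::vector<DXGI_MODE_DESC> modes(count);
        output->GetDisplayModeList(DXGI_FORMAT_R8G8B8A8_UNORM, 0, &count, modes.data());

        // Pick the mode with the most pixels -- usually the native resolution.
        DXGI_MODE_DESC best = {};
        for (const DXGI_MODE_DESC& m : modes)
            if (m.Width * m.Height > best.Width * best.Height) best = m;

        printf("Preferred fullscreen mode: %ux%u\n", best.Width, best.Height);

        output->Release(); adapter->Release(); factory->Release();
        return 0;
    }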


Edited by Hodgman, 19 July 2013 - 09:47 AM.


#4 cowsarenotevil   Crossbones+   -  Reputation: 1928


Posted 19 July 2013 - 10:32 AM

The best rule of thumb I can think of here would be to prefer outputting at the monitor's maximum supported resolution (which should be its native resolution).

 

Even that's not always perfect, though. I've run into quite a few LCD projectors, for instance, that "support" higher resolutions than they can actually display and simply downsample the input signal. Worse yet, the downsampling was very crude, but just "good" enough that it was obviously intended as a "feature."


-~-The Cow of Darkness-~-

#5 Yrjö P.   Crossbones+   -  Reputation: 1412


Posted 20 July 2013 - 08:49 AM

Asking him about it just got the response of "well, it only happens for more advanced graphics algorithms."

I'd ask him for an example of such "advanced graphics algorithms".

 

Frankly it sounds like he doesn't know what he's talking about, or accidentally said something he didn't mean and has low enough self-esteem that he can't bring himself to back out of it.

 

It's plausible that some algorithms would work best at resolutions with certain properties, like dimensions divisible by a particular number. And for a final render target of a given resolution, some lower resolutions will upscale to it more cleanly than others. But a resolution being "standard" has little to do with it.
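As a rough illustration of the "upscales more cleanly" point, a tiny hypothetical check for whether a render resolution maps onto the output by a single whole-number factor:

    // Does (rw x rh) scale to (ow x oh) by the same whole-number factor on both axes?
    bool ScalesCleanly(unsigned rw, unsigned rh, unsigned ow, unsigned oh) {
        return rw && rh && ow % rw == 0 && oh % rh == 0 && ow / rw == oh / rh;
    }
    // 960x540 -> 1920x1080 is exactly 2x on both axes, so each source pixel maps to a 2x2 block;
    // 1500x755 -> 1920x1080 has non-integer factors, so the scaler has to filter/resample.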



#6 mhagain   Crossbones+   -  Reputation: 7328


Posted 20 July 2013 - 09:50 AM

 

Asking him about it just got the response of "well, it only happens for more advanced graphics algorithms."

I'd ask him for an example of such "advanced graphics algorithms".

 

Frankly it sounds like he doesn't know what he's talking about, or accidentally said something he didn't mean and has low enough self-esteem that he can't bring himself to back out of it.

 

Precisely.  This sounds exactly like the kind of theoretical nonsense academics sometimes come out with.  If it were the case, then the API documentation would be shouting about it, the hardware vendors would be shouting about it, John Carmack would be tweeting about it, and Johan Andersson would be blogging about it.

 

The people who define pretty basic stuff such as how APIs and hardware actually work, and the people with a track record of actually using this stuff in the field for real programs that real people use, are saying nothing of the sort -- and that is evidence enough.  It's nonsense.


It appears that the gentleman thought C++ was extremely difficult and he was overjoyed that the machine was absorbing it; he understood that good C++ is difficult but the best C++ is well-nigh unintelligible.


#7 MarkS   Prime Members   -  Reputation: 875


Posted 20 July 2013 - 10:02 AM

I remember a time when using a non-standard resolution required the CPU to scale the image to fit the monitor. But this was many years ago. If I'm not mistaken, most LCDs today have built-in hardware to scale the incoming image to the monitor's native resolution.



#8 marcClintDion   Members   -  Reputation: 431


Posted 20 July 2013 - 12:39 PM

There are too many different resolutions available for this to be entirely true these days.  Maybe in the past, when there were only a few available resolutions, this would sometimes have been the case.

 

These days there must be at least a dozen or even two dozen common screen sizes available for desktops and laptops, so it seems to me that the best optimizations would be the ones that are resolution independent.

 

GPU manufacturers go out of their way to avoid this issue by abstracting away the notion of pixels.  This is why the fragment processor is called the fragment processor.

The word 'fragment' is an abstraction of the word 'pixel': it implies that the final pixel layout is unknown until the GPU is plugged into a specific machine.

'Fragment' is used in place of 'pixel' because a fragment is resolution and screen-size agnostic.

 

A console may possibly have hardware and driver optimizations for 720p and 1080p, since these are common.

A desktop GPU will certainly make no such assumptions.

 

As far as game programmers go??? Holy merde!  This would be like the days before GPU APIs existed, when people had to program for every possible graphics card.

Two-thirds of your software would be conditional switches, branches, and code that isn't even being used on the current machine.

There must be at least 100 distinct GPUs by now, with dozens of sub-versions for many of them.

 

That sounds like a nightmare.  I'd rather pluck out all my eyebrow hairs with bolt-cutters than worry about something like this.  Then again, some people like to study what comes out of the rear-ends of animals.  There is something for everyone. 

 

Edited by Josh Petrie, 22 July 2013 - 10:19 AM.
rollback

Consider it pure joy, my brothers and sisters, whenever you face trials of many kinds, because you know that the testing of your faith produces perseverance. Let perseverance finish its work so that you may be mature and complete, not lacking anything.


#9 swiftcoder   Senior Moderators   -  Reputation: 9501


Posted 22 July 2013 - 10:27 AM


If I'm not mistaken, most LCDs today have built-in hardware to scale the incoming image to the monitor's native resolution.

Most, but not all by any stretch (pun unintended).

 

I tend to buy those quad-def IPS monitors from South Korea, and they explicitly do not include a scaler. The downside is that they won't work as a TV, but on the upside you cut down the lag, and desktop GPUs tend to do pretty well at scaling anyway.


Tristam MacDonald - Software Engineer @Amazon - [swiftcoding]


#10 tylercamp   Members   -  Reputation: 119


Posted 23 July 2013 - 01:00 PM

Got a lot more responses than I initially thought I would get; the whole thing seemed especially weird to me when another part of his reasoning for my tests not showing anything was that "In 2d (only GDI layer) the resolution is less dependent on hardware specs". I really doubted that XNA used GDI for 2D.

 

Thanks everyone for the info, the part about TV latency sounds like it might have been what he had been referring to and I'll keep that in mind for the future.



#11 mhagain   Crossbones+   -  Reputation: 7328


Posted 23 July 2013 - 06:17 PM

Got a lot more responses than I initially thought I would get; the whole thing seemed especially weird to me when another part of his reasoning for my tests not showing anything was that "In 2d (only GDI layer) the resolution is less dependent on hardware specs". I really doubted that XNA used GDI for 2D.

 

You can be quite certain he's talking nonsense then, as the only real difference between 2D and 3D (assuming the same API is used for both) is the projection matrix used.


It appears that the gentleman thought C++ was extremely difficult and he was overjoyed that the machine was absorbing it; he understood that good C++ is difficult but the best C++ is well-nigh unintelligible.


#12 Yrjö P.   Crossbones+   -  Reputation: 1412


Posted 23 July 2013 - 08:49 PM

Thanks everyone for the info, the part about TV latency sounds like it might have been what he had been referring to and I'll keep that in mind for the future.

There's no way he was referring to TV latency. Or if he was, he's even more clueless than he initially sounded, because 2D/3D rendering and the choice of API have nothing to do with TV latency.

 

Seriously, ask him what those "advanced graphics algorithms" are. Ask for names. Names of algorithms, names of researchers, names of books or other publications. If he "doesn't remember", ask him exactly what problems those algorithms solve. If you manage to get any answers, we'll look them up.



#13 Sik_the_hedgehog   Crossbones+   -  Reputation: 1467


Posted 23 July 2013 - 09:04 PM


The other thing is that running a game in fullscreen (not the fake borderless window style, but actual fullscreen) gives a small performance boost, since there are optimizations done about rendering via the OS. 

This had to do with DirectX and GDI constantly conflicting with each other (since GDI had no idea about DirectX at all). Basically, in windowed mode DirectDraw and Direct3D were required to copy the back buffer to the screen no matter what, while in fullscreen they could just swap buffers, since GDI wouldn't be drawing anything. That changed in Vista, and since then the performance difference is gone (the change they made is also why Direct3D 10 can finally keep its resources around when switching out of fullscreen, something earlier versions couldn't do).
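For context, the windowed-versus-exclusive-fullscreen choice described above lives in the D3D9 present parameters. A hedged sketch, not a complete program: the window handle hwnd and the IDirect3D9* d3d are assumed to have been created elsewhere.

    // Exclusive-fullscreen device setup in D3D9. With Windowed = TRUE, the pre-Vista
    // runtime had to blit the back buffer into the desktop instead of flipping it.
    D3DPRESENT_PARAMETERS pp = {};
    pp.BackBufferWidth      = 1920;
    pp.BackBufferHeight     = 1080;
    pp.BackBufferFormat     = D3DFMT_X8R8G8B8;
    pp.BackBufferCount      = 1;
    pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;    // lets the runtime flip in fullscreen
    pp.Windowed             = FALSE;                     // exclusive fullscreen
    pp.hDeviceWindow        = hwnd;                      // assumed: created elsewhere
    pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE;   // vsync

    IDirect3DDevice9* device = nullptr;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                      D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &device);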


Edited by Sik_the_hedgehog, 23 July 2013 - 09:05 PM.

Don't pay much attention to "the hedgehog" in my nick, it's just because "Sik" was already taken =/ By the way, Sik is pronounced like seek, not like sick.

#14 MarkS   Prime Members   -  Reputation: 875


Posted 23 July 2013 - 09:57 PM

Seriously, ask him what those "advanced graphics algorithms" are. Ask for names. Names of algorithms, names of researchers, names of books or other publications. If he "doesn't remember", ask him exactly what problems those algorithms solve. If you manage to get any answers, we'll look them up.


I would disagree. While I haven't been to college, I do know that those in academia do not like to be challenged and a professor can make your life a living hell. He is clearly stuck in the past on this issue. This is a time where you smile and nod. You gain nothing from challenging him and stand to lose a great deal.

#15 Yrjö P.   Crossbones+   -  Reputation: 1412


Posted 27 July 2013 - 04:33 PM

While I haven't been to college, I do know that those in academia do not like to be challenged

There are individuals who do not like to be challenged. Someone like that teaching higher education is terrible at their job, and in the minority. That kind of attitude runs counter to the whole purpose of higher education, not just in principle but in practice. If it's prevalent at some institution, then that institution is bad.

 

I'm excited whenever a student challenges me. It shows they are engaged and actually trying to digest what I'm telling them. If they turn out to be mistaken, they are giving me a window to their thinking and enabling me to correct whatever they got wrong. If they turn out to be right, then they are doing something far more valuable, helping me plug the holes in my teaching and helping all of the other students.

 

The only times I've encountered any negativity from my own teachers for questioning something has been when those questions revealed that I hadn't done the prep work I was supposed to have done, and asking the questions was essentially wasting everyone's time. Some have standing offers to buy students coffee/candy/whatever if the student manages to find errors in the course materials. The last professor I corrected ended up giving me a job.



#16 MarkS   Prime Members   -  Reputation: 875


Posted 28 July 2013 - 01:05 PM

I apologize. I am not known for very eloquent speech. I should have written that as "...some in academia...". I know that not everyone is like that, but I've heard and read so many horror stories. I did not mean any offense.

I still believe and stand behind my statement that this professor is stuck in the past. There was a time when he would have been correct, but that time ended about a decade ago. To teach this as current fact is puzzling at best.

Edited by MarkS, 28 July 2013 - 01:06 PM.


#17 tylercamp   Members   -  Reputation: 119


Posted 28 July 2013 - 08:35 PM

I'm hesitant to bring it up since I'm a freshman (from what I've heard/seen he has a "what I say is final in the classroom" sort of thing going on, and I can't imagine that being corrected by a freshman would end well) and the topic was actually brought up months ago.

 

There's a general consensus that what he proposed didn't make sense and I'll take that as the answer; I'll correct those that quote him on the subject and will continue to bring questionable ideas to you guys for clarification.



#18 swiftcoder   Senior Moderators   -  Reputation: 9501


Posted 29 July 2013 - 06:00 AM


I'm hesitant to bring it up since I'm a freshman (from what I've heard/seen he has a "what I say is final in the classroom" sort of thing going on, and I can't imagine that being corrected by a freshman would end well)

Good call.

 

If you plan to become one of those students who argues with the professor, better be damn sure you are always right :)


Tristam MacDonald - Software Engineer @Amazon - [swiftcoding]


#19 Krypt0n   Crossbones+   -  Reputation: 2270


Posted 29 July 2013 - 06:25 AM

Some hardware is optimized for the default output resolutions. Sometimes it's the memory layout (tile size); sometimes drivers have weird checks to work around limitations of the API (e.g. some resources on the GPU are limited and you only want to spend them on framebuffers, but how do you know something is going to be a framebuffer if the API doesn't pass a flag? You check the resolution).

 

Some of those 'algorithms' might also be aligned to 'tiles', e.g. deferred shading hardware might work on 32x32 tiles. As a simple example, your display might be 640x480, which is 20x15 tiles. Now change the resolution slightly (for whatever reason) to 641x470: you've effectively reduced the number of pixels, yet you've increased the tile count to 21x15, and it might be slower.
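Spelling out the tile arithmetic from that example (the 32x32 tile size is just the assumption used above):

    // Number of tile x tile blocks needed to cover the screen, rounding partial tiles up.
    unsigned TileCount(unsigned width, unsigned height, unsigned tile = 32) {
        return ((width + tile - 1) / tile) * ((height + tile - 1) / tile);
    }
    // 640x480: 20 x 15 = 300 tiles, 307200 pixels.
    // 641x470: 21 x 15 = 315 tiles, 301270 pixels -- fewer pixels, but more tiles to process.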



#20 tylercamp   Members   -  Reputation: 119


Posted 30 July 2013 - 07:18 PM

Looking back, I now remember how he claimed that DirectX would drop back to software-based rendering and stop using the GPU if you didn't use standard resolutions. At this point I don't know if he was just joking or really misinformed, but I'm now much more wary of his statements.
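For what it's worth, the driver type is chosen explicitly when the Direct3D device is created, and no resolution is involved at that point at all; the backbuffer size only appears later, when the swap chain is set up. A minimal, purely illustrative D3D11 sketch:

    #include <d3d11.h>
    #include <cstdio>
    #pragma comment(lib, "d3d11.lib")

    int main() {
        ID3D11Device*        device  = nullptr;
        ID3D11DeviceContext* context = nullptr;
        D3D_FEATURE_LEVEL    level;
        // Explicitly request a hardware device; the runtime does not silently swap in
        // a software rasterizer because of the backbuffer size you pick later.
        HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                       nullptr, 0, D3D11_SDK_VERSION,
                                       &device, &level, &context);
        if (SUCCEEDED(hr))
            printf("Hardware device created (feature level 0x%x)\n", (unsigned)level);
        else
            printf("No hardware device available (hr = 0x%x)\n", (unsigned)hr);
        if (context) context->Release();
        if (device)  device->Release();
        return 0;
    }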





