Tyler Camp

"Standard" Resolution Performance

20 posts in this topic

One of my professors has made the claim that using a "standard" resolution (e.g. 1920x1080, 1024x768, etc.) will provide better performance than using a "non-standard" resolution (1500x755, etc.). I've never heard of this before and can't seem to find anything to back his claims. I know a lot of games on consoles render to lower (and "non-standard") resolutions and then upscale for better performance, which is the opposite of what he's stated. I don't know what he really means by "resolution" and he hasn't clarified. (The display resolution? The resolution of any framebuffer that is being rendered to?) He didn't say that this was specifically for any platform, but I was in an XNA on Windows class at the time.

 

I've tried benchmarking it myself (basic 2D sprite rendering via XNA, fullscreen with different backbuffer resolutions) and didn't see any performance penalties/gains that were out of the ordinary. Asking him about it just got the response of "well, it only happens for more advanced graphics algorithms." Has anyone else heard anything like this?
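A trimmed-down sketch of that kind of test (not the exact code; the sprite asset name, the sprite count, and the 1500x755 size are just placeholders):

```csharp
using System;
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;

// Rough benchmark sketch: draw a pile of sprites fullscreen at a given
// back buffer resolution and log the average frame time.
public class ResolutionBench : Game
{
    GraphicsDeviceManager graphics;
    SpriteBatch spriteBatch;
    Texture2D sprite;
    int frames;
    double seconds;

    public ResolutionBench(int width, int height)
    {
        graphics = new GraphicsDeviceManager(this);
        graphics.PreferredBackBufferWidth = width;   // resolution under test
        graphics.PreferredBackBufferHeight = height;
        graphics.IsFullScreen = true;
        graphics.SynchronizeWithVerticalRetrace = false; // don't let vsync cap the numbers
        IsFixedTimeStep = false;
        Content.RootDirectory = "Content";
    }

    protected override void LoadContent()
    {
        spriteBatch = new SpriteBatch(GraphicsDevice);
        sprite = Content.Load<Texture2D>("sprite"); // placeholder asset name
    }

    protected override void Draw(GameTime gameTime)
    {
        GraphicsDevice.Clear(Color.Black);

        // Enough sprite work to make any resolution-dependent cost visible.
        spriteBatch.Begin();
        for (int i = 0; i < 5000; i++)
            spriteBatch.Draw(sprite, new Vector2((i * 13) % 1000, (i * 7) % 700), Color.White);
        spriteBatch.End();

        // Crude once-per-second average frame time readout.
        frames++;
        seconds += gameTime.ElapsedGameTime.TotalSeconds;
        if (seconds >= 1.0)
        {
            Console.WriteLine("avg frame time: {0:F2} ms", 1000.0 * seconds / frames);
            frames = 0;
            seconds = 0;
        }

        base.Draw(gameTime);
    }

    static void Main()
    {
        using (var game = new ResolutionBench(1500, 755)) // "non-standard" size under test
            game.Run();
    }
}
```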


I don't think there's anything special about the 'standard' resolutions. There are a ton of standard resolutions, given all the monitor sizes and aspect ratios out there.

As far as I know the graphics pipeline doesn't optimize anywhere based on what resolution you're rendering at.

 

There are two somewhat related things that I know can sometimes make a difference. One is using textures that are square with a size that's a power of 2, which was only a big deal on older graphics cards.
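For what it's worth, that power-of-two point amounts to something like this (a small sketch; the class and helper names are my own, purely for illustration):

```csharp
// Old-style texture size helpers: older GPUs preferred (or required)
// power-of-two texture dimensions.
static class TextureSizeUtil
{
    // True for 1, 2, 4, 8, 16, ... (exactly one bit set).
    public static bool IsPowerOfTwo(int x)
    {
        return x > 0 && (x & (x - 1)) == 0;
    }

    // Smallest power of two that is >= x, e.g. 755 -> 1024.
    public static int NextPowerOfTwo(int x)
    {
        int p = 1;
        while (p < x)
            p <<= 1;
        return p;
    }
}
```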

 

The other thing is that running a game in fullscreen (not the fake borderless window style, but actual fullscreen) gives a small performance boost, since the OS can make presentation optimizations that it can't make for a window.

The best rule of thumb I can think of here would be to prefer outputting at the monitor's maximum supported resolution (which should be its native resolution).

 

Even that's not always perfect, though. I've run into quite a few LCD projectors, for instance, that "support" higher resolutions than they can actually display and simply downsample the input signal. Worse yet, the downsampling was very crude, but just "good" enough that it was obviously meant to be a "feature."


Asking him about it just got the response of "well, it only happens for more advanced graphics algorithms."

I'd ask him for an example of such "advanced graphics algorithms".

 

Frankly it sounds like he doesn't know what he's talking about, or accidentally said something he didn't mean and has low enough self-esteem that he can't bring himself to back out of it.

 

It's plausible that some algorithms would work best for resolutions with certain properties, like being divisible by a certain number. And for a final render target at a given resolution, some lower resolutions will upscale to it more cleanly than others. But a resolution being "standard" has little to do with it.
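As a quick illustration of the upscaling point (all of the numbers below are arbitrary examples), you can check whether a render size blows up to a display size by a whole-number factor:

```csharp
using System;

// Which render sizes upscale to a 1920x1080 target by the same integer
// factor on both axes? (Target and candidates are arbitrary examples.)
static class UpscaleCheck
{
    static bool CleanIntegerUpscale(int w, int h, int targetW, int targetH)
    {
        return targetW % w == 0
            && targetH % h == 0
            && targetW / w == targetH / h;  // same factor horizontally and vertically
    }

    static void Main()
    {
        int targetW = 1920, targetH = 1080;
        int[,] candidates = { { 960, 540 }, { 1280, 720 }, { 1500, 755 } };

        for (int i = 0; i < candidates.GetLength(0); i++)
        {
            int w = candidates[i, 0], h = candidates[i, 1];
            Console.WriteLine("{0}x{1} -> clean integer upscale to {2}x{3}: {4}",
                w, h, targetW, targetH, CleanIntegerUpscale(w, h, targetW, targetH));
        }
    }
}
```

Only 960x540 passes (exactly 2x on both axes); the other two need filtered scaling.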


 

Asking him about it just got the response of "well, it only happens for more advanced graphics algorithms."

I'd ask him for an example of such "advanced graphics algorithms".

 

Frankly it sounds like he doesn't know what he's talking about, or accidentally said something he didn't mean and has low enough self-esteem that he can't bring himself to back out of it.

 

Precisely. This sounds exactly like the kind of theoretical nonsense academics sometimes come out with. If it were the case, the API documentation would be shouting about it, the hardware vendors would be shouting about it, John Carmack would be tweeting about it, and Johan Andersson would be blogging about it.

 

The fact that the people who define pretty basic stuff such as how APIs and hardware actually work, and the people who have a track record of actually using this stuff in the field for real programs that real people use, are saying nothing of the sort is evidence enough. It's nonsense.


I remember a time when using a non-standard resolution required the CPU to scale the image to fit the monitor. But this was many years ago. If I'm not mistaken, most LCDs today have built-in hardware to scale the incoming image to the monitor's native resolution.


There are too many different resolutions available for this to be entirely true these days. Maybe in the past, when there were only a few available resolutions, this might sometimes have been the case.

 

These days there must be at least a dozen or even two dozen common screen sizes available for desktops and laptops, so it seems to me that the best optimizations would be the ones that are resolution independent.

 

GPU manufacturers go out of their way to avoid this issue by abstracting away the notion of pixels.  This is why the fragment processor is called the fragment processor. 

The word 'fragment' is an abstraction of the word 'pixel'.   It implies that the pixel information is unknown before the GPU is plugged into a specific machine.

Fragment is used in place of pixel because 'fragment' is resolution and screen size agnostic.

 

A console may possibly have hardware and driver optimizations for 720p and 1080p, since these are common.

A desktop GPU will certainly make no assumptions about such things.

 

So far as game programmers go??? Holy merd! This would be like the days before GPU APIs existed, when people had to program for every possible graphics card.

Two-thirds of your software would be conditional switches and branches and code that isn't even being used on the current machine.

There must be at least 100 distinct GPUs by now, with dozens of sub-versions for many of them.

 

That sounds like a nightmare.  I'd rather pluck out all my eyebrow hairs with bolt-cutters than worry about something like this.  Then again, some people like to study what comes out of the rear-ends of animals.  There is something for everyone. 

 


If I'm not mistaken, most LCDs today have built-in hardware to scale the incoming image to the monitor's native resolution.

Most, but not all by any stretch (pun unintended).

 

I tend to buy those quad-def IPS monitors from S. Korea, and they explicitly do not include a scaler. The downside is that they won't work as a TV, but on the upside you cut down on lag, and desktop GPUs tend to do pretty well at scaling anyway.


Got a lot more responses than I initially thought I would get; the whole thing seemed especially weird to me when another part of his reasoning for my tests not showing anything was that "In 2d (only GDI layer) the resolution is less dependent on hardware specs". I really doubted that XNA used GDI for 2D.

 

Thanks everyone for the info; the part about TV latency sounds like it might have been what he had been referring to, and I'll keep that in mind for the future.


Got a lot more responses than I initially thought I would get; the whole thing seemed especially weird to me when another part of his reasoning for my tests not showing anything was that "In 2d (only GDI layer) the resolution is less dependent on hardware specs". I really doubted that XNA used GDI for 2D.

 

You can be quite certain he's talking nonsense then, as the only real difference between 2D and 3D (assuming the same API is used for both) is the projection matrix used.
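For example, in XNA the usual 2D and 3D setups differ only in which projection you build; a rough sketch, with viewport and camera values that are just placeholders:

```csharp
using Microsoft.Xna.Framework;

// Same pipeline either way; only the projection matrix changes.
static class Projections
{
    // Typical "2D" projection: one unit per pixel, origin at the top-left,
    // the kind of projection SpriteBatch effectively applies.
    public static Matrix Sprite2D(int viewportWidth, int viewportHeight)
    {
        return Matrix.CreateOrthographicOffCenter(
            0, viewportWidth,      // left, right
            viewportHeight, 0,     // bottom, top (Y grows downward)
            0, 1);                 // near, far
    }

    // Typical "3D" projection: perspective from a field of view.
    public static Matrix Perspective3D(float aspectRatio)
    {
        return Matrix.CreatePerspectiveFieldOfView(
            MathHelper.PiOver4,    // 45 degree vertical FOV
            aspectRatio,
            0.1f, 1000f);          // near and far planes
    }
}
```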


Thanks everyone for the info; the part about TV latency sounds like it might have been what he had been referring to, and I'll keep that in mind for the future.

There's no way he was referring to TV latency. Or if he was, he's even more clueless than he initially sounded, because 2D/3D rendering and the choice of API have nothing to do with TV latency.

 

Seriously, ask him what those "advanced graphics algorithms" are. Ask for names. Names of algorithms, names of researchers, names of books or other publications. If he "doesn't remember", ask him exactly what problems those algorithms solve. If you manage to get any answers, we'll look them up.


The other thing is that running a game in fullscreen (not the fake borderless window style, but actual fullscreen) gives a small performance boost, since the OS can make presentation optimizations that it can't make for a window.

This had to do with DirectX and GDI constantly conflicting with each other (since GDI had no idea about DirectX at all). Basically, in windowed mode DirectDraw and Direct3D were required to copy the back buffer to the screen no matter what, while in fullscreen they could just swap buffers, since GDI wouldn't be drawing anything. That changed in Vista, and since then the performance difference is gone (the change is also why Direct3D 10 can finally keep its resources around when switching out of fullscreen, something earlier versions couldn't).


While I haven't been to college, I do know that those in academia do not like to be challenged

There are individuals who do not like to be challenged. Someone like that teaching higher education is terrible at their job, and in the minority. That kind of attitude runs counter to the whole purpose of higher education, not just in principle but in practice. If it's prevalent at some institution, then that institution is bad.

 

I'm excited whenever a student challenges me. It shows they are engaged and actually trying to digest what I'm telling them. If they turn out to be mistaken, they are giving me a window to their thinking and enabling me to correct whatever they got wrong. If they turn out to be right, then they are doing something far more valuable, helping me plug the holes in my teaching and helping all of the other students.

 

The only times I've encountered any negativity from my own teachers for questioning something has been when those questions revealed that I hadn't done the prep work I was supposed to have done, and asking the questions was essentially wasting everyone's time. Some have standing offers to buy students coffee/candy/whatever if the student manages to find errors in the course materials. The last professor I corrected ended up giving me a job.

I apologize. I am not known for very eloquent speech. I should have written that as "...some in academia...". I know that not everyone is like that, but I've heard and read so many horror stories. I did not mean any offense.

I still believe and stand behind my statement that this professor is stuck in the past. There was a time when he would have been correct, but that time ended about a decade ago. To teach this as current fact is puzzling at best.

I'm hesitant to bring it up since I'm a freshman (from what I've heard/seen he has a "what I say is final in the classroom" sort of thing going on, and I can't imagine that being corrected by a freshman would end well) and the topic was actually brought up months ago.

 

There's a general consensus that what he proposed didn't make sense, and I'll take that as the answer; I'll correct those who quote him on the subject and will continue to bring questionable ideas to you guys for clarification.



I'm hesitant to bring it up since I'm a freshman (from what I've heard/seen he has a "what I say is final in the classroom" sort of thing going on, and I can't imagine that being corrected by a freshman would end well)

Good call.

 

If you plan to become one of those students who argues with the professor, better be damn sure you are always right :)


Some hardware is optimized for the default output resolutions. Sometimes it's the memory layout (tile size); sometimes drivers have some weird checks to work around limitations of the API (e.g. some resources are limited on the GPU and you only want to use them for frame buffers, but how do you know something is going to be a framebuffer if the API does not pass a flag? You check the resolution).

 

Some of those 'algorithms' might be aligned to 'tiles' as well, e.g. deferred shading hardware might work on 32x32 tiles. As a simple example, your display might be 640x480, which is 20x15 tiles. Now you change the resolution slightly (for whatever reason) to 641x470: you've effectively reduced the pixel count, yet you've increased the tile count to 21x15, and it might be slower.
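Here's that arithmetic worked out, assuming a hypothetical GPU with a 32x32 tile size, where partial edge tiles are rounded up to whole tiles:

```csharp
using System;

// Tile counts for a hypothetical GPU that shades in 32x32 tiles: partial
// tiles at the edges still cost a whole tile.
static class TileMath
{
    static int Tiles(int pixels, int tileSize)
    {
        return (pixels + tileSize - 1) / tileSize;  // ceiling division
    }

    static void Main()
    {
        const int tile = 32;

        // 640x480 = 307200 pixels -> 20x15 = 300 tiles.
        Console.WriteLine("640x480 -> {0}x{1} tiles",
            Tiles(640, tile), Tiles(480, tile));

        // 641x470 = 301270 pixels (fewer!) -> 21x15 = 315 tiles (more!).
        Console.WriteLine("641x470 -> {0}x{1} tiles",
            Tiles(641, tile), Tiles(470, tile));
    }
}
```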


Looking back, I now remember how he claimed that DirectX would drop back to software-based rendering and stop using the GPU if you didn't use standard resolutions. At this point I don't know if he was just joking or really misinformed, but I'm now much more wary of his statements.



At this point I don't know if he was just joking or really misinformed

More likely, he's just a decade or so out of date. The world of graphics rendering was a whole different ballgame in the '90s.

