"Standard" Resolution Performance


I got a lot more responses than I initially expected. The whole thing seemed especially strange to me because another part of his reasoning for why my tests didn't show anything was that "In 2d (only GDI layer) the resolution is less dependent on hardware specs". I really doubted that XNA used GDI for 2D.

You can be quite certain he's talking nonsense then, as the only real difference between 2D and 3D (assuming the same API is used for both) is the projection matrix used.
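To make that concrete, here's a minimal sketch (mine, not from the thread, and assuming GLM for the matrix math): the same pipeline runs either way, and only the projection matrix you feed it differs.

// Minimal sketch: the rest of the pipeline is identical for "2D" and "3D".
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

// "2D": an orthographic projection that maps pixel coordinates to the screen.
const glm::mat4 projection2D =
    glm::ortho(0.0f, 1280.0f, 720.0f, 0.0f, -1.0f, 1.0f);

// "3D": a perspective projection with a field of view and a depth range.
const glm::mat4 projection3D =
    glm::perspective(glm::radians(60.0f), 1280.0f / 720.0f, 0.1f, 100.0f);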

Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.


Thanks everyone for the info; the part about TV latency sounds like it might be what he was referring to, and I'll keep that in mind for the future.

There's no way he was referring to TV latency. Or if he was, he's even more clueless than he initially sounded, because 2D/3D rendering and the choice of API have nothing to do with TV latency.

Seriously, ask him what those "advanced graphics algorithms" are. Ask for names. Names of algorithms, names of researchers, names of books or other publications. If he "doesn't remember", ask him exactly what problems those algorithms solve. If you manage to get any answers, we'll look them up.


The other thing is that running a game in fullscreen (not the fake borderless-window style, but actual exclusive fullscreen) gives a small performance boost, since the OS can take an optimized presentation path.

This had to do with DirectX and GDI constantly conflicting with each other (GDI had no idea DirectX even existed). Basically, in windowed mode DirectDraw and Direct3D had to copy the back buffer to the screen no matter what, while in fullscreen they could simply swap buffers, since GDI wouldn't be drawing anything. That changed in Vista, and since then the performance difference is gone (that change is also why Direct3D 10 can finally keep its resources around when switching out of fullscreen, something earlier versions couldn't do).
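For what it's worth, this is roughly how that split surfaced at the API level in the Direct3D 9 era. A sketch for illustration only, with made-up values:

#include <d3d9.h>

// Illustrative only: fill out present parameters for exclusive fullscreen.
D3DPRESENT_PARAMETERS MakePresentParams(HWND window, bool fullscreen)
{
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed             = fullscreen ? FALSE : TRUE;
    pp.SwapEffect           = D3DSWAPEFFECT_DISCARD; // fullscreen: driver may flip buffers
    pp.BackBufferWidth      = 1920;
    pp.BackBufferHeight     = 1080;
    pp.BackBufferFormat     = D3DFMT_X8R8G8B8;
    pp.hDeviceWindow        = window;
    pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE; // vsync
    return pp;
}

// In windowed mode (Windowed = TRUE) the pre-Vista runtime always had to blit the
// back buffer into the GDI-managed desktop; in exclusive fullscreen it could flip.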

Don't pay much attention to "the hedgehog" in my nick, it's just because "Sik" was already taken =/ By the way, Sik is pronounced like seek, not like sick.

Seriously, ask him what those "advanced graphics algorithms" are. Ask for names. Names of algorithms, names of researchers, names of books or other publications. If he "doesn't remember", ask him exactly what problems those algorithms solve. If you manage to get any answers, we'll look them up.


I would disagree. While I haven't been to college, I do know that those in academia do not like to be challenged and a professor can make your life a living hell. He is clearly stuck in the past on this issue. This is a time where you smile and nod. You gain nothing from challenging him and stand to lose a great deal.

While I haven't been to college, I do know that those in academia do not like to be challenged

There are individuals who do not like to be challenged. Someone like that teaching higher education is terrible at their job, and in the minority. That kind of attitude runs counter to the whole purpose of higher education, not just in principle but in practice. If it's prevalent at some institution, then that institution is bad.

I'm excited whenever a student challenges me. It shows they are engaged and actually trying to digest what I'm telling them. If they turn out to be mistaken, they are giving me a window to their thinking and enabling me to correct whatever they got wrong. If they turn out to be right, then they are doing something far more valuable, helping me plug the holes in my teaching and helping all of the other students.

The only times I've encountered any negativity from my own teachers for questioning something has been when those questions revealed that I hadn't done the prep work I was supposed to have done, and asking the questions was essentially wasting everyone's time. Some have standing offers to buy students coffee/candy/whatever if the student manages to find errors in the course materials. The last professor I corrected ended up giving me a job.

I apologize. I am not known for very eloquent speech. I should have written that as "...some in academia...". I know that not everyone is like that, but I've heard and read so many horror stories. I did not mean any offense.

I still believe and stand behind my statement that this professor is stuck in the past. There was a time when he would have been correct, but that time ended about a decade ago. Teaching it as current fact is puzzling at best.

I'm hesitant to bring it up since I'm a freshman (from what I've heard/seen he has a "what I say is final in the classroom" sort of thing going on, and I can't imagine that being corrected by a freshman would end well) and the topic was actually brought up months ago.

There's a general consensus that what he proposed didn't make sense, and I'll take that as the answer; I'll correct those who quote him on the subject and will continue to bring questionable ideas to you guys for clarification.


I'm hesitant to bring it up since I'm a freshman (from what I've heard/seen he has a "what I say is final in the classroom" sort of thing going on, and I can't imagine that being corrected by a freshman would end well)

Good call.

If you plan to become one of those students who argues with the professor, better be damn sure you are always right :)

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

Some hardware is optimized for the default output resolutions. Sometimes it's the memory layout (tile size); sometimes drivers have weird checks to work around limitations of the API (e.g. some resources on the GPU are limited and you only want to use them for frame buffers, but how do you know something is going to be a framebuffer if the API doesn't pass a flag? You check the resolution).

Some of those "algorithms" might also be aligned to "tiles". For example, deferred shading hardware might work on 32x32 tiles: a 640x480 display is 20x15 tiles, but change the resolution slightly (for whatever reason) to 641x470 and you've effectively reduced the pixel count, yet increased the tile count to 21x15, so it might actually be slower.
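A quick sketch of that arithmetic, using the hypothetical 32x32 tile size and resolutions from the example above:

#include <cstdio>

// Tiles needed to cover one dimension, rounding up at the edge.
static int TileCount(int pixels, int tileSize)
{
    return (pixels + tileSize - 1) / tileSize;
}

int main()
{
    const int tile = 32;
    // 640x480 = 307200 pixels -> 20x15 = 300 tiles.
    std::printf("640x480 -> %dx%d tiles\n", TileCount(640, tile), TileCount(480, tile));
    // 641x470 = 301270 pixels (fewer!) -> 21x15 = 315 tiles, so potentially slower.
    std::printf("641x470 -> %dx%d tiles\n", TileCount(641, tile), TileCount(470, tile));
    return 0;
}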

Looking back, I now remember him claiming that DirectX would drop back to software rendering and stop using the GPU if you didn't use standard resolutions. At this point I don't know whether he was joking or genuinely misinformed, but I'm now much more wary of his statements.
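For the record, there's no resolution-triggered software fallback like that. If you ever want to confirm which adapter you're actually running on, you can enumerate it explicitly. Here's a sketch using DXGI (my assumption, not anything from the course): a software device such as WARP, the "Microsoft Basic Render Driver", shows up by name rather than being silently swapped in.

#include <dxgi.h>
#include <cstdio>
// Link against dxgi.lib.

// List the adapters Direct3D would create a device on. A software adapter
// appears here explicitly; the runtime does not silently switch to it because
// of the resolution you picked.
int main()
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC desc = {};
        adapter->GetDesc(&desc);
        std::wprintf(L"Adapter %u: %s\n", i, desc.Description);
        adapter->Release();
    }
    factory->Release();
    return 0;
}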

