Changing graphics settings at runtime

Started by
6 comments, last by tanzanite7 10 years, 1 month ago

I recently watched the latest update video of Overgrowth, see here:

Why is it that most of the new, modern game engines out there still have to restart the game before new graphics settings can take effect? Yes, the D3D device is lost when changing settings, but there's no need to also unload all resources, is there?

I'm tinkering around with Ogre3D at the moment, and I don't see an easy way to change any graphics settings without having to unload and re-load everything.

I'm aware of the fact that some resources will have to be reloaded and/or regenerated, but I still think this should be possible.

What are your experiences on this topic? Do you think it's a necessary feature? How would/do you go about implementing this feature?

"I would try to find halo source code by bungie best fps engine ever created, u see why call of duty loses speed due to its detail." -- GettingNifty


I suspect code inertia is probably a big factor.

like, people could implement it, but that would require them actually going and doing so.

it is sort of like asking why most games don't include in-game video recording (forcing the user to resort to 3rd party screen-capture software), ...

but, given it can be done, why doesn't everybody do it already?

probably because it would require a lot of people to go and write the code to do so.

likewise, it is easier to have the user restart the game, and changing settings isn't a major gameplay feature, ..., so this is how it goes.

actually, it is like with many limitations in computing:

why does some arbitrary limitation exist?

typically because the required code hasn't been written, or nobody has bothered to use the code in question (or do something similar), or the code once existed but was later dropped, forgotten, or stopped working.

usually, features gain popularity and spread little by little, typically by being sufficiently "cool" or "must have" to overcome the levels of code inertia involved.

can't really answer for the specifics of doing this in Ogre3D though.

This is the death by a thousand cuts for a game engine.

Every feature implemented takes time away from implementing others. You need to decide which ones are most important, and sometimes that means you need to chuck someone else's pet feature.

Every feature implemented has to integrate with others. They have to co-exist. Peacefully. If implementing feature A means that you're going to get headaches in the future with features B, C and D, then sometimes that means feature A needs to die.

Changing graphics settings at runtime is something that I'd consider being firmly in the "nice to have" bracket. It's nice to have for sure, but it shouldn't take priority over other stuff and can be easily chucked if it causes trouble.

The reality is that most people don't change their graphics settings all the time. In the console space they typically don't even need to do it at all.

In the PC space nowadays it means asking "can I max it out on my hardware?", putting everything on the highest setting, then, when it runs slowly, bitching, moaning and accusing the developers of not testing, not optimizing, or not caring about their preferred 3D card vendor.

The more rational users will spend half an hour or so finding settings that work well on their machine, and that's the last time they ever touch those options.

So for the most part it's actually a feature that doesn't really need to exist.

Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.

I read and understand your arguments. Saying "it's not meant to be" seems awfully closed-minded, but it's the truth. The graphics APIs really didn't want it to be. Take OpenGL, for instance:


#include <GLFW/glfw3.h>

glfwInit();
GLFWwindow* window = glfwCreateWindow(1024, 768, "MyWindow", NULL, NULL);
glfwMakeContextCurrent(window);

You'd have to re-create the OpenGL context to change the resolution. The same goes for D3D: you lose the device when you change settings.

I guess what I'm saying is, I'm just amazed that this hasn't been solved yet.

Ogre3D is working on a feature to restore the underlying device/context to allow changing graphics settings at runtime. I look forward to trying that feature out when they finally release 2.0, but it still seems hackish to me.

"I would try to find halo source code by bungie best fps engine ever created, u see why call of duty loses speed due to its detail." -- GettingNifty


That is definitely not the case with modern D3D. In D3D9 you needed to recreate resources (except those in the managed pool), but in D3D9Ex and especially D3D10/D3D11 all you need is a few API calls.

I'm not sure about exclusive fullscreen mode in OpenGL, but if we're talking about windowed/borderless resolution then you can easily change it without destroying the context.
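
For illustration, here's a minimal D3D11 sketch of what those "few API calls" might look like (the function name and the way the render-target view is passed in are my own, and real code would check the HRESULTs): the swap chain is resized in place and only the back-buffer view is recreated, while the device and every other resource stay alive.


#include <d3d11.h>

// Sketch only: resize the swap chain without recreating the device or reloading resources.
void OnResolutionChanged(ID3D11DeviceContext* context, IDXGISwapChain* swapChain,
                         ID3D11RenderTargetView** rtv, UINT width, UINT height)
{
    // Unbind and release anything that still references the old back buffer.
    context->OMSetRenderTargets(0, nullptr, nullptr);
    if (*rtv) { (*rtv)->Release(); *rtv = nullptr; }

    // Resize the existing buffers in place; keep the current buffer count and format.
    swapChain->ResizeBuffers(0, width, height, DXGI_FORMAT_UNKNOWN, 0);

    // Recreate the render-target view from the new back buffer.
    ID3D11Texture2D* backBuffer = nullptr;
    swapChain->GetBuffer(0, __uuidof(ID3D11Texture2D), (void**)&backBuffer);
    ID3D11Device* device = nullptr;
    context->GetDevice(&device);
    device->CreateRenderTargetView(backBuffer, nullptr, rtv);
    backBuffer->Release();
    device->Release();
}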


yeah.

side note is also that GLFW builds on OpenGL, but it is not itself OpenGL.

different restrictions and limitations may apply depending on whether one goes through an API like this or uses lower-level / OS-level APIs (such as WGL or GLX). it may be necessary to investigate what the API provides, and if it really isn't usable, to side-step it where appropriate.

dunno about other engines, but fullscreen mode can be done (on Windows) basically by invoking an OS API call to change the resolution, changing the current window style (disabling the title bar and borders, ...), and, if needed, stretching the window to the new screen resolution.

if no in-game resolution change is involved, then this is basically just switching the screen resolution (if needed) and positioning the window to cover the whole screen.

(in my case done mostly via WGL and GDI calls... not looked into it, but similar may apply to GLX+X11).
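
for what it's worth, a rough Win32 sketch of that idea (hwnd and the target mode are assumed to come from elsewhere, error handling omitted): change the display mode, strip the window decorations, and stretch the window across the screen; no GL or D3D objects are touched.


#include <windows.h>

// rough sketch: enter "fullscreen" by switching the display mode and restyling the window
void EnterFullscreen(HWND hwnd, int width, int height)
{
    DEVMODE mode = {};
    mode.dmSize       = sizeof(mode);
    mode.dmPelsWidth  = width;
    mode.dmPelsHeight = height;
    mode.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT;
    ChangeDisplaySettings(&mode, CDS_FULLSCREEN);              // change the screen resolution

    SetWindowLongPtr(hwnd, GWL_STYLE, WS_POPUP | WS_VISIBLE);  // drop the title bar and borders
    SetWindowPos(hwnd, HWND_TOP, 0, 0, width, height,
                 SWP_FRAMECHANGED | SWP_SHOWWINDOW);           // cover the whole screen
}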

it seems like one can do a bigger or variable-size window by first creating it at the maximum size (and/or the full-screen/desktop resolution), and then resizing the window down to the target size (if needed). if there is a better way, I am not really aware of it (trying to increase the window past the size it was originally created at seems to result in a not-correctly-updated / garbage area around its edges).

when the window is first created bigger and then downsized, I suspect it uses bigger-resolution buffers or something.

an advantage I have found of changing the game resolution for fullscreen (vs changing the screen resolution to match the window) is that, if the full-screen resolution is the same as the desktop resolution, it doesn't shove all the other windows off into a corner (or onto my secondary monitor), and it can also match the native resolution of an LCD flat panel (otherwise there tend to be big black borders around the edges of the screen and other issues).

IOW, for example:

create the window initially at, say, 1680x1050 (assuming that is the full-screen resolution; on other monitors it may be 1920x1080 or whatever else);

set up GL context and similar at this point;

if windowed, resize the window to 1440x900 or 1280x800 or similar, and center on-screen;

or, if full-screen, switch to fullscreen mode and position window in the screen-corner.

it may also make sense to create any internal FBOs and similar at the maximum resolution, such as to avoid needing to destroy/recreate them (if entering/leaving fullscreen).

a lot of the rest is mostly things inside the engine, like whatever is passed to glViewport, ...

for example, the viewport will use the current resolution, rather than the max resolution.

so, yeah, no destruction/recreation of the context (or window) is needed here.
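
a small GL sketch of the "allocate big, render small" idea (maxW/maxH and curW/curH are assumed to be tracked elsewhere, and this needs GL 3.0 / ARB_framebuffer_object): the FBO attachment is created once at the maximum size, and only the viewport follows the current resolution.


// create the color attachment once, at the maximum resolution
GLuint colorTex;
glGenTextures(1, &colorTex);
glBindTexture(GL_TEXTURE_2D, colorTex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, maxW, maxH, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);

GLuint fbo;
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, colorTex, 0);

// after a settings change (and each frame), only render into the current resolution
glViewport(0, 0, curW, curH);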

much beyond this, dunno...


Why is it that most of the new, modern game engines out there still have to restart the game before new graphics settings can take effect?

It is one of the most cost-efficient ways to do it. A clean 'reset' has many benefits concerning stability and performance. Besides, changing the visual settings is usually only done a few times and has no gameplay impact; that is, it is not really part of the game experience.

You need to consider that the visual settings do not only apply to the graphics API. E.g. dynamic texture and/or model creation that depends on low/high settings, dynamic lights vs. static lightmaps, lots of effects which need some kind of pre-processing or other data sets, some fancy shader recompilation framework, etc., might all influence the 'reset' behavior and stability.

I read and understand your arguments. Saying "it's not meant to be" seems awfully closed-minded, but it's the truth. The graphics APIs really didn't want it to be. Take OpenGL, for instance:

#include <GLFW/glfw3.h>

glfwInit();
GLFWwindow* window = glfwCreateWindow(1024, 768, "MyWindow", NULL, NULL);
glfwMakeContextCurrent(window);

You'd have to re-create the OpenGL context to change the resolution. The same goes for D3D: you lose the device when you change settings.

I guess what I'm saying is, I'm just amazed that this hasn't been solved yet.

GLFW is not OGL, and window management and what-not has nothing to do with OGL either (WGL is for that side).

Short story:
* Each window has a device context => encapsulated stuff for GDI
* We want a sensible pixelformat (RGBA8, doublebuffered) for it that has hardware acceleration.
* We get a rendering context for it.
* Windows/GDI is there to composite all that stuff on screen.

At no point does anything we care about care what the screen resolution is (*)(**).
Just change the resolution and reposition/resize your window, adding/removing borders for windowed mode if the user wanted that too. No need to recreate the OpenGL context (why would you? want Windows' software GL 1.1 implementation instead?).


(*) Windows/GDI will have a bit more work to do internally when your framebuffer is RGBA8 but the screen is in some silly format (like a 256-color palette). But that is not our concern (it mattered slightly in the ancient days, when using an equally silly framebuffer was a reasonable compromise).
(**) The OpenGL API specification does not mention screen resolution changes => nothing is allowed to be lost because of a screen resolution change. This fact is even mentioned by some extension specs, e.g. http://www.opengl.org/registry/specs/ARB/vertex_buffer_object.txt ("Do buffer objects survive screen resolution changes, etc.?"). However, WGL is not OGL, so for example p-buffers (***), if you happen to use them, might be lost.
(***) do not confuse with pixelbuffer, p-buffer -> https://www.opengl.org/wiki/P-buffer
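
For reference, a minimal sketch of the "short story" steps above via raw WGL/GDI (hwnd assumed to exist; picking a properly accelerated format and error checking are left out); nothing here depends on the current screen resolution.


#include <windows.h>

HDC hdc = GetDC(hwnd);                      // device context of the window (the GDI side)

PIXELFORMATDESCRIPTOR pfd = {};             // ask for a sensible, doublebuffered RGBA format
pfd.nSize      = sizeof(pfd);
pfd.nVersion   = 1;
pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
pfd.iPixelType = PFD_TYPE_RGBA;
pfd.cColorBits = 32;
SetPixelFormat(hdc, ChoosePixelFormat(hdc, &pfd), &pfd);

HGLRC hglrc = wglCreateContext(hdc);        // the rendering context
wglMakeCurrent(hdc, hglrc);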

edit: Oh, forgot to comment on the last line: it has never been a problem to begin with ... on the OGL side of the fence at least. On the D3D side, afaik, things were considerably less rosy.

This topic is closed to new replies.
