What display mode do you play PC games in?


Yeah, superVGA, your CRT did not show a "crisp" image, it showed a blurred image; it's in fact the LCD which is crisp, and that's why you see the aliasing :)
So now you turn your game down to low detail so you can blur it, and simulate the blurry CRT image you are used to :P

Umm no. Just no.

CRTs have higher fidelity than LCDs in most applications (this may not apply at extremely high resolutions, as most CRTs and the VGA port weren't designed for those).

The reason aliasing is more noticeable on LCDs is that CRTs have a different arrangement in their RGB pixel layout. It's even more obvious if your LCD is bigger than your CRT... (which is very common now)

When comparing screens, resolution alone isn't enough. Concepts such as visual acuity, vernier acuity, antialiasing, ghosting, in-monitor post-processing, refresh rate and input lag come into play.

I still prefer LCDs, but it's not for their aliasing/antialiasing/blurriness/sharpness (they come in bigger models, occupy less space, consume less energy, produce less heat in the office, and don't have that eye-destroying flicker caused by low refresh rates. I personally can notice the flickering if the refresh rate is below 85 Hz on a CRT, but some people don't notice it at all at any rate).

You have a lot more temporal and spatial bleeding of colors, which makes everything look smoother and rounder, and lines more straight, but I'm not sure that means "higher fidelity", and it's not the same thing as anti-aliasing, even though the effect on straight lines is similar.
It hides error, but it does so in a way more like blurring than proper anti-aliasing.

Visual and vernier acuity are properties of your eyes (in combination with our brains), not of your monitor, though you're right that these concepts are important and make measurements beyond pure resolution relevant when judging image quality.
Sure, the arrangement of RGB cells in an LCD will produce artefacts that are not present on a CRT, which increases the blocky feeling, but I'd still say an LCD is more "crisp", and also that this is part of the problem.

The thing I reacted to most, though, was reducing the resolution of your LCD as a solution and calling the resulting image "crisper".
You can't get a crisper image from undersampling...
Since he reduced it to probably the same resolution as his old CRT (unless he had a really expensive one), I felt "simulating a CRT" was not that far off.

Proper antialiasing (however implemented) will calculate your image at a higher resolution, to be able to accurately represent features smaller than a pixel in the pixel grid.
This truly increases fidelity, especially in moving images.
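To make that concrete, here is a minimal C++ sketch of the supersampling idea (the Shade() "scene" is a made-up diagonal edge, purely for illustration, not anything from an actual renderer): compute the image at a multiple of the target resolution, then box-average each block down to one output pixel.

```cpp
// Minimal supersampling (SSAA-style) sketch: compute the image at 'factor'
// times the target resolution, then average each factor x factor block down
// to one output pixel. Sub-pixel features end up as intermediate shades
// instead of hard stair-steps.
#include <cstdio>
#include <vector>

struct Color { float r, g, b; };

// Toy "scene" standing in for a real renderer: white above a diagonal edge,
// black below it. The edge's sub-pixel coverage is what gets averaged out.
static Color Shade(int x, int y)
{
    return (y * 2 < x) ? Color{1.0f, 1.0f, 1.0f} : Color{0.0f, 0.0f, 0.0f};
}

static std::vector<Color> SupersampleDownscale(int outW, int outH, int factor)
{
    std::vector<Color> out(outW * outH);
    for (int y = 0; y < outH; ++y)
        for (int x = 0; x < outW; ++x)
        {
            Color sum = {0.0f, 0.0f, 0.0f};
            for (int sy = 0; sy < factor; ++sy)       // average the block of
                for (int sx = 0; sx < factor; ++sx)   // high-resolution samples
                {
                    Color c = Shade(x * factor + sx, y * factor + sy);
                    sum.r += c.r; sum.g += c.g; sum.b += c.b;
                }
            float n = float(factor * factor);
            out[y * outW + x] = {sum.r / n, sum.g / n, sum.b / n};
        }
    return out;
}

int main()
{
    const int w = 8, h = 8;
    std::vector<Color> img = SupersampleDownscale(w, h, 4);  // 4x4 supersampling
    for (int x = 0; x < w; ++x)                    // one row: note the grey
        std::printf("%.2f ", img[3 * w + x].r);    // edge pixels instead of 0/1
    std::printf("\n");
    return 0;
}
```

The printed row contains fractional edge values (0.25, 0.75) where a plain 1x render could only produce hard 0/1 stair-steps.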
The thing I reacted to most, though, was reducing the resolution of your LCD as a solution and calling the resulting image "crisper".
You can't get a crisper image from undersampling...
Since he reduced it to probably the same resolution as his old CRT (unless he had a really expensive one), I felt "simulating a CRT" was not that far off.

Well, it depends on our definition of "crisp". True image sharpness can be described as the combination of high acutance and high resolution.

However, a low-resolution image could be experienced as "crisper" if it has high acutance (which is quite possible if the LCD monitor applies a sharpening pass in its post-processing).

This site has an excellent explanation of the subject. It also reminded me that, while not really observable on PC CRT screens, analog signals are subject to noise, and noise can sometimes cause an image to be perceived as "sharp".

In summary, it is possible that he experienced a "crisper" image by lowering the resolution on his LCD, causing the monitor's post-processing to kick in. However, if we define crisp as high resolution plus high acutance, he only experienced an optical illusion, not a truly crisp image.
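For what it's worth, here is a tiny 1-D sketch of what such a sharpening pass does: an unsharp mask, which is one common way to raise acutance (I have no idea what any particular monitor actually uses, so treat this only as an illustration of the effect).

```cpp
// 1-D unsharp-mask sketch: subtract a blurred copy from the signal and add the
// difference back, which exaggerates local contrast at edges (acutance)
// without adding any real resolution. Only an illustration of the kind of
// sharpening a monitor's post-processing might apply.
#include <algorithm>
#include <cstdio>
#include <vector>

static std::vector<float> UnsharpMask(const std::vector<float>& in, float amount)
{
    std::vector<float> out(in.size());
    for (size_t i = 0; i < in.size(); ++i)
    {
        // 3-tap box blur as the low-pass ("unsharp") version of the signal.
        float left    = in[i > 0 ? i - 1 : i];
        float right   = in[i + 1 < in.size() ? i + 1 : i];
        float blurred = (left + in[i] + right) / 3.0f;
        // Add the high-frequency difference back, scaled by 'amount'.
        out[i] = std::clamp(in[i] + amount * (in[i] - blurred), 0.0f, 1.0f);
    }
    return out;
}

int main()
{
    // A soft edge from 0.2 to 0.8 over a few pixels.
    std::vector<float> edge = {0.2f, 0.2f, 0.4f, 0.6f, 0.8f, 0.8f};
    for (float v : UnsharpMask(edge, 1.5f))
        std::printf("%.2f ", v);   // prints 0.20 0.10 0.40 0.60 0.90 0.80
    std::printf("\n");
    return 0;
}
```

The values next to the edge get pushed apart (0.10 / 0.90), so the transition looks harder, but no detail that wasn't already in the signal is recovered.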

Let's not forget, after all, that antialiasing (as in SSAA or MSAA) is an optical illusion too: it resamples a higher-frequency signal into a lower-frequency one.

The thing I reacted to most, though, was reducing the resolution of your LCD as a solution and calling the resulting image "crisper"
Yeah, I neither said nor implied that, though.
I said that I thought the CRT displayed crisper images than the LCD in cases other than the LCD's native resolution. (I mentioned this because I originally had fill-rate issues, and I could gain performance and get rid of jagged edges at once with the mentioned approach - this defeats the image crispness I mentioned; those were intended as separate comments.)

Windowed mode [fixed] (fake fullscreen) - only because of my dual-screen setup, so I can easily click on a window on my other screen.

Fullscreen, but I also make sure that it's running in my desktop resolution. If my desktop is running at 1440x900 (native), then I'll make sure the game is also running at that resolution, even if I may have to downgrade the graphics quality.

This makes switching to desktop way faster. I noticed games typically have to reload textures if the resolution changes.

But, the last real game I played on PC was Legend of Grimrock.

Just saw this thread now. Real fullscreen all the way, especially being able to set the video mode. Implement it properly, though! A lot of the bad reputation real fullscreen has comes from badly programmed games completely breaking things =/

If that's not supported, a fullscreen window at the desktop resolution, i.e. the LCD's native resolution. Nothing else makes sense, really. Non-native resolutions are soooooo damaging to my eyes (and why wouldn't you want to use the pixels that you paid for!).

It makes sense if you have weak video hardware and can't upgrade for whatever reason (e.g. not enough money). For example, it may be good enough to handle the desktop by itself at the native resolution, but for games you may want to use a lower resolution for the sake of performance. Those who insist that letting games change the resolution is bad always tend to overlook this =P

(mind you, if you have a high resolution monitor I suppose it could make sense to drop to a divisor of the native resolution, e.g. 2048×1536 vs 1024×768 - pixels shouldn't bleed if you do this)
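To put numbers on that, here is a quick sketch of the "integer divisor of the native resolution" idea, using the 2048×1536 example above (the limit of four divisors is arbitrary):

```cpp
// Quick sketch: list the resolutions you get by dividing a native mode by an
// integer. At these, every game pixel maps to an exact NxN block of physical
// pixels, so nothing should bleed when the monitor scales the image back up.
#include <cstdio>

int main()
{
    const int nativeW = 2048, nativeH = 1536;   // example native mode from above
    for (int divisor = 1; divisor <= 4; ++divisor)
    {
        if (nativeW % divisor == 0 && nativeH % divisor == 0)
            std::printf("%dx%d (1 game pixel = %dx%d physical pixels)\n",
                        nativeW / divisor, nativeH / divisor, divisor, divisor);
    }
    return 0;
}
```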

Does anyone know if there is any performance or latency difference in fake vs real fullscreen?
I could guess you might have a more direct path to the screen in fullscreen, but that might be irrelevant today.

There used to be issues in the early DirectX days (due to DirectX interfering with GDI) but not anymore. And fullscreen OpenGL on Windows was always windowed technically (it was a borderless window and the resolution was changed manually).

The main advantage of real fullscreen is the fact it can change the video mode, giving the game more control over the display. The main disadvantage is that several games like to screw up fullscreen support so sometimes you get stuff like things not working properly after alt+tab or other monitors blacking out =/
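As a concrete illustration of the two flavours, here is a sketch using SDL2 (my choice purely because it makes the difference a one-flag change; nobody in this thread said they use SDL):

```cpp
// Sketch of "real" vs "fake" fullscreen with SDL2. SDL_WINDOW_FULLSCREEN may
// change the video mode to the requested size; SDL_WINDOW_FULLSCREEN_DESKTOP
// ignores the size and gives a borderless window at the desktop resolution.
#include <SDL.h>

SDL_Window* CreateGameWindow(bool realFullscreen)
{
    Uint32 flags = realFullscreen
        ? SDL_WINDOW_FULLSCREEN          // exclusive: switches the display mode
        : SDL_WINDOW_FULLSCREEN_DESKTOP; // borderless at the desktop resolution

    return SDL_CreateWindow("Game",
                            SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                            1024, 768, flags);
}

int main(int, char**)
{
    if (SDL_Init(SDL_INIT_VIDEO) != 0)
        return 1;

    SDL_Window* window = CreateGameWindow(/*realFullscreen=*/false);

    SDL_Delay(2000);                     // keep it up briefly so you can see it
    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}
```

With the "real" flag the 1024×768 request actually changes the video mode; with the "desktop" flag that size is ignored and the window simply covers the screen at its current native resolution.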

This makes switching to desktop way faster. I noticed games typically have to reload textures if the resolution changes.

This issue mainly has to do with DirectX, I believe (see what I said about it interfering with GDI above), since OpenGL requires textures to be preserved. Last I checked, Direct3D 10 was supposed to fix that too. This is also why many games used to close if you did alt+tab: they had to reload the textures, and in many cases programmers chose to quit the program instead of handling that properly.

I could be wrong on what I said about the APIs, so don't take what I said for granted, but if I recall correctly this is from where several of the issues come.
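For reference, "handling that properly" looks roughly like this in Direct3D 9 terms. This is only a sketch; ReleaseDefaultPoolResources() and RecreateDefaultPoolResources() are hypothetical stand-ins for whatever a given game keeps in D3DPOOL_DEFAULT.

```cpp
// Sketch of the usual Direct3D 9 lost-device handling on alt+tab, instead of
// quitting: resources in D3DPOOL_DEFAULT must be released before Reset() and
// recreated afterwards; D3DPOOL_MANAGED textures are restored by D3D itself.
#include <d3d9.h>

// Hypothetical hooks for the game's own D3DPOOL_DEFAULT resources
// (render targets, dynamic vertex buffers, ...). Empty stubs here.
void ReleaseDefaultPoolResources() {}
void RecreateDefaultPoolResources() {}

// Call once per frame before rendering. Returns true when it is safe to draw.
bool HandleLostDevice(IDirect3DDevice9* device, D3DPRESENT_PARAMETERS* pp)
{
    HRESULT hr = device->TestCooperativeLevel();
    if (hr == D3D_OK)
        return true;                    // device is fine, render normally

    if (hr == D3DERR_DEVICELOST)
        return false;                   // still lost (alt+tabbed away): skip rendering

    if (hr == D3DERR_DEVICENOTRESET)
    {
        ReleaseDefaultPoolResources();  // free D3DPOOL_DEFAULT resources first
        if (SUCCEEDED(device->Reset(pp)))
            RecreateDefaultPoolResources();
    }
    return false;                       // try again next frame
}
```

The game keeps pumping messages and calls this every frame; when it returns false it simply skips rendering that frame instead of bailing out.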


I like my games fullscreen because I get distracted otherwise. But I am also apprehensive of tabbing out, because while many games handle it flawlessly, robustly and with no delay (assuming I run the game at native resolution), an equal number of games just completely fail: they outright crash, or are impossible to bring back up, requiring me to kill them, or even sometimes lose their fullscreen attribute, forcing me to open up the game's graphics settings and re-enable it (yes, Crysis!).


Am I crazy, or does this thread keep getting bumped when no one has actually said anything new? (Does voting in the poll bump it?)


Yeah, I think votes bump it =/


