
What display mode do you play PC games in?


20 replies to this topic

Poll: Display Mode (65 members have cast votes)

What display mode do you play games in?

  1. Windowed Mode (window with border): 3 votes (4.62%)
  2. Windowed Mode [Fixed] (fake fullscreen): 16 votes (24.62%)
  3. Fullscreen: 46 votes (70.77%)


#1 Memories are Better   Prime Members   -  Reputation: 769

Posted 23 December 2012 - 08:16 AM

I forgot the technical details of what happens in Fullscreen, but it's the mode that forces a minimize if you alt-tab out of the game, as opposed to Windowed Mode [Fixed], where the game lingers in the background.

 

Also, would you care if a game only had Windowed Mode (borders for resizing) and Windowed Mode [Fixed] (fake fullscreen) options?
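For reference, the difference between the two windowed options above is mostly a matter of window style and size rather than anything exotic. Below is a minimal Win32 sketch (an illustration only, not taken from any particular game) of turning an existing game window into the "fake fullscreen" variant: a borderless window covering the monitor, with the desktop video mode left untouched.

#include <windows.h>

// Turn an existing window into a borderless "fake fullscreen" window that
// covers the monitor it currently sits on. The desktop video mode is untouched.
void EnterFakeFullscreen(HWND hwnd)
{
    // Drop the caption and resize borders.
    SetWindowLongPtr(hwnd, GWL_STYLE, WS_POPUP | WS_VISIBLE);

    // Size the window to the monitor it currently occupies.
    MONITORINFO mi = { sizeof(mi) };
    GetMonitorInfo(MonitorFromWindow(hwnd, MONITOR_DEFAULTTONEAREST), &mi);
    SetWindowPos(hwnd, HWND_TOP,
                 mi.rcMonitor.left, mi.rcMonitor.top,
                 mi.rcMonitor.right - mi.rcMonitor.left,
                 mi.rcMonitor.bottom - mi.rcMonitor.top,
                 SWP_FRAMECHANGED | SWP_SHOWWINDOW);
}

Going back to a plain bordered window is the reverse: restore WS_OVERLAPPEDWINDOW with SetWindowLongPtr and apply a normal size with another SetWindowPos call that includes SWP_FRAMECHANGED.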




#2 Waterlimon   Crossbones+   -  Reputation: 2368

Posted 23 December 2012 - 08:38 AM

I would like a mode where it shows the taskbar but no borders.

I don't like games that go into some hacky fullscreen where I can't exit to the desktop without closing the game. And I don't really like waiting 20 seconds every time I want to switch...
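The "taskbar visible but no borders" variant is essentially the same borderless window sized to the monitor's work area instead of the full monitor rectangle. A rough Win32 sketch, assuming the window lives on the primary monitor (for other monitors, GetMonitorInfo's rcWork would be used instead):

#include <windows.h>

// Borderless window that covers only the desktop work area, so the
// taskbar stays visible and clickable.
void EnterBorderlessWorkArea(HWND hwnd)
{
    RECT work;                                    // work area of the primary monitor
    SystemParametersInfo(SPI_GETWORKAREA, 0, &work, 0);

    SetWindowLongPtr(hwnd, GWL_STYLE, WS_POPUP | WS_VISIBLE);
    SetWindowPos(hwnd, HWND_TOP,
                 work.left, work.top,
                 work.right - work.left, work.bottom - work.top,
                 SWP_FRAMECHANGED | SWP_SHOWWINDOW);
}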

Waterlimon (imagine this is handwritten please)


#3 slicer4ever   Crossbones+   -  Reputation: 3211

Posted 23 December 2012 - 11:00 AM

I prefer faked fullscreen if possible, mostly because games that do real fullscreen tend, for whatever reason, to have issues with tabbing out and in (I'm looking at you, Skyrim).


Check out https://www.facebook.com/LiquidGames for some great games made by me on the Playstation Mobile market.

#4 Luckless   Crossbones+   -  Reputation: 1669

Posted 23 December 2012 - 11:09 AM

'Real' full screen doesn't seem to be the actual issue, but rather how the transition is handled in the game. I play more than a few games that appear to be 'real' full screen and are very snappy for transitions. However, most are rather horrid.

The biggest thing I demand from a game is that it can fill my primary monitor and trap the mouse in its window. There are more than a few games out there that will black out the secondary monitor, or let the mouse escape if you drag it off the edge onto the secondary monitor (which REALLY sucks for scrolling around in strategy games).
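Trapping the mouse is usually just a matter of clipping the cursor to the window's client rectangle and re-applying the clip whenever the window moves, resizes, or regains focus. A hedged Win32 sketch, assuming hwnd is the game window:

#include <windows.h>

// Confine the cursor to the window's client area. Call this again on
// WM_ACTIVATE, WM_MOVE and WM_SIZE, and call ClipCursor(NULL) to release
// the mouse when the window loses focus.
void TrapMouse(HWND hwnd)
{
    RECT rc;
    GetClientRect(hwnd, &rc);                     // client rect, client coordinates

    POINT tl = { rc.left,  rc.top    };
    POINT br = { rc.right, rc.bottom };
    ClientToScreen(hwnd, &tl);                    // convert to screen coordinates
    ClientToScreen(hwnd, &br);

    RECT clip = { tl.x, tl.y, br.x, br.y };
    ClipCursor(&clip);                            // cursor can no longer leave the window
}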


I just wish that more games took advantage of multiple monitors in a graceful way.
Old Username: Talroth
If your signature on a web forum takes up more space than your average post, then you are doing things wrong.

#5 samoth   Crossbones+   -  Reputation: 4516

Posted 23 December 2012 - 11:25 AM

Window with border, since it allows me to switch windows, say to the IM program or the browser (yes, you guessed right, I'm too old for ego-shooters; not enough twitch left in me).

 

If that's not supported, fullscreen window, which is the same as the desktop resolution, i.e. the LCD's native resolution. Nothing else makes sense, really. Non-native resolutions are soooooo damaging to my eyes (and why wouldn't you want to use the pixels that you paid for!).



#6 thePyro_13   Members   -  Reputation: 629

Posted 24 December 2012 - 01:30 AM

Fake fullscreen is my preferred mode. I fall back to real fullscreen if performance is pushing it, or sometimes go windowed for slower-paced or casual games (so that I can read or watch YouTube while I wait for stuff to happen).

I think it's perfectly fine to cut support for real fullscreen, so long as you're still supporting multiple resolutions and fake fullscreen/windowed mode.

Edited by thePyro_13, 24 December 2012 - 01:30 AM.


#7 SuperVGA   Members   -  Reputation: 1118

Posted 24 December 2012 - 03:24 AM

I used to have a rather slow system. When I changed to an LCD monitor I was at first bothered by the apparent aliasing at resolutions where my CRT had shown a crisp image (yeah, I know why, but this is still how it felt). I found that if I played at a lower resolution with the same aspect ratio, the cell-bleeding artifacts produced something a bit like AA, so I run at 1280x720 instead of 1920x1080, even though my current system can handle it. I rarely have alt-tab problems; Morrowind does it without any issues. :D

#8 Olof Hedman   Crossbones+   -  Reputation: 2657

Posted 24 December 2012 - 05:15 AM

Fake fullscreen is nice. I don't use the desktop for anything anyhow; I like it clean :)

Does anyone know if there is any performance or latency difference between fake and real fullscreen? I'd guess you might have a more direct path to the screen in real fullscreen, but that might be irrelevant today.

Yeah, SuperVGA, your CRT did not show a "crisp" image, it showed a blurred image; it's in fact the LCD which is crisp, and that's why you see the aliasing :)
So now you turn your game down to low detail so you can blur it, and simulate the blurry CRT image you are used to :P
Real antialiasing will calculate _more_ detail than you have pixels for. Running at a low resolution is nothing like AA; it only produces an image with less detail and more blur...

Sorry for the rant ;)

Right now I'm a bit sad that Civ 5 is crashing my system at 2560x1440, so I have to run it at 1920x1200... it looks horribly blurred :P

Edited by Olof Hedman, 24 December 2012 - 05:24 AM.


#9 SuperVGA   Members   -  Reputation: 1118

Posted 24 December 2012 - 06:24 AM

Quoting Olof Hedman:
> Yeah, SuperVGA, your CRT did not show a "crisp" image, it showed a blurred image; it's in fact the LCD which is crisp, and that's why you see the aliasing :)
> So now you turn your game down to low detail so you can blur it, and simulate the blurry CRT image you are used to :P
> Real antialiasing will calculate _more_ detail than you have pixels for. Running at a low resolution is nothing like AA; it only produces an image with less detail and more blur...
> Sorry for the rant ;)

Well, besides the fact that I'd say a CRT produces a sharper image in every case other than the LCD's native resolution, it's not like I disagree. I even stated that the artifacts could double as cheap AA to remove jaggies. But sure, it's not AA, and it does produce blurring artifacts on an LCD.

I think the high resolutions we use nowadays are sort of overrated, though.

#10 ryan20fun   Members   -  Reputation: 661

Posted 24 December 2012 - 07:08 AM

The only time I like "Windowed Mode" is when a game does not play nice with my LCD's native resolution (1366x768).

If I try to use 1360x768, I get a black border at the top or bottom of the screen after using "Auto Adjust".


Never say Never, Because Never comes too soon. - ryan20fun

Disclaimer: Each post of mine is intended as an attempt at helping and/or bringing some meaningful insight to the topic at hand. Due to my nature, my good intentions will not always be plainly visible. I apologise in advance and assure you I mean no harm and do not intend to insult anyone.

#11 Matias Goldberg   Crossbones+   -  Reputation: 3007

Posted 25 December 2012 - 01:11 AM

Quoting Olof Hedman:
> Yeah, SuperVGA, your CRT did not show a "crisp" image, it showed a blurred image; it's in fact the LCD which is crisp, and that's why you see the aliasing :)
> So now you turn your game down to low detail so you can blur it, and simulate the blurry CRT image you are used to :P

Umm, no. Just no.

CRTs have higher fidelity than LCDs in most applications (this may not apply at extremely high resolutions, since most CRTs and the VGA port weren't designed for that).

The reason the aliasing is more noticeable on LCDs is that CRTs have a different arrangement in their RGB pixel layout. It's even more obvious if your LCD is bigger than your CRT... (which is very common now).

When comparing screens, resolution alone isn't enough. Concepts such as visual acuity, vernier acuity, antialiasing, ghosting, in-monitor post-processing, refresh rate and input lag come into play.

I still prefer LCDs, but not for their aliasing/antialiasing/blurriness/sharpness (they come in bigger models, occupy less space, consume less energy, produce less heat in the office, and don't have that eye-destroying flicker caused by the refresh rate. I personally notice the flickering if the refresh rate is below 85 Hz on a CRT, but some people don't notice it at all at any rate).



#12 Olof Hedman   Crossbones+   -  Reputation: 2657

Posted 25 December 2012 - 06:24 AM

You have a lot more temporal and spatial bleeding of colors, which makes everything look smoother and rounder and lines more straight, but I'm not sure that means "higher fidelity", and it's not the same thing as antialiasing, even though the effect on straight lines is similar. It hides error, but it does so in a way closer to blurring than to proper antialiasing.

Visual and vernier acuity are properties of your eyes (in combination with our brains), not of your monitor, though you're right that these concepts matter and that they make more measurements than pure resolution important when judging image quality. Sure, the arrangement of RGB cells in an LCD will produce artefacts that are not present on a CRT, which increases the blocky feeling, but I'd still say an LCD is more "crisp", and also that this is part of the problem.

The thing I reacted to most, though, was reducing the resolution of your LCD as a solution and calling the resulting image "crisper": you can't get a crisper image from undersampling... Since he reduced it to probably the same resolution as his old CRT (unless he had a really expensive one), I felt "simulating a CRT" was not that far off.

Proper antialiasing (however it is implemented) will calculate your image at a higher resolution to accurately represent features smaller than a pixel in the pixel grid. This truly increases fidelity, especially in moving images.

Edited by Olof Hedman, 25 December 2012 - 07:50 AM.


#13 Matias Goldberg   Crossbones+   -  Reputation: 3007

Posted 25 December 2012 - 12:25 PM

Quoting Olof Hedman:
> The thing I reacted to most, though, was reducing the resolution of your LCD as a solution and calling the resulting image "crisper": you can't get a crisper image from undersampling... Since he reduced it to probably the same resolution as his old CRT (unless he had a really expensive one), I felt "simulating a CRT" was not that far off.

Well, it depends on our definition of "crisp". True image sharpness can be described as an image with high acutance and high resolution.

However, a low-resolution image can be experienced as "crisper" if it has high acutance (it's quite possible that the LCD monitor runs a sharpening pass in its own post-processing).

This site has an excellent explanation of the subject. That site reminded me that, while not really observable on PC CRT screens, analog signals are subject to noise, and noise can sometimes cause an image to be perceived as "sharp".

In summary, it is possible that he experienced a "crisper" image by lowering the resolution on his LCD, causing the monitor's post-processing to kick in. However, if we define crisp as high resolution plus high acutance, then he only experienced an optical illusion, not a truly crisp image.

Let's not forget, after all, that antialiasing (as in SSAA or MSAA) is an optical illusion too: it samples a higher-frequency signal down into a lower one.
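As a toy illustration of that last point: ordered-grid supersampling (SSAA) renders the scene at a multiple of the display resolution and then averages each block of samples down to one displayed pixel, i.e. it resamples a higher-frequency signal into a lower one. A minimal, hypothetical sketch for a grayscale image and a 2x2 sample grid:

#include <cstddef>
#include <vector>

// Resolve a 2x2-supersampled grayscale image to display resolution by
// box-filtering each 2x2 block of samples into one output pixel.
std::vector<float> ResolveSSAA2x2(const std::vector<float>& hi,
                                  std::size_t outWidth, std::size_t outHeight)
{
    const std::size_t hiWidth = outWidth * 2;     // supersampled image is twice as wide
    std::vector<float> out(outWidth * outHeight);

    for (std::size_t y = 0; y < outHeight; ++y)
        for (std::size_t x = 0; x < outWidth; ++x)
        {
            const std::size_t sx = x * 2, sy = y * 2;
            const float sum = hi[sy * hiWidth + sx]
                            + hi[sy * hiWidth + sx + 1]
                            + hi[(sy + 1) * hiWidth + sx]
                            + hi[(sy + 1) * hiWidth + sx + 1];
            out[y * outWidth + x] = sum * 0.25f;  // average of the four samples
        }
    return out;
}

Simply rendering at a lower resolution, by contrast, throws samples away instead of combining extra ones, which is the distinction being made in the posts above.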



#14 SuperVGA   Members   -  Reputation: 1118

Posted 26 December 2012 - 07:06 AM

Quoting Olof Hedman:
> The thing I reacted to most, though, was reducing the resolution of your LCD as a solution and calling the resulting image "crisper".

Yeah, I neither said nor implied that, though. I said that I thought the CRT displayed crisper images than the LCD in every case besides the LCD's native resolution. (I mentioned this because I originally had fill-rate issues, and with the approach above I could gain performance and get rid of jagged edges at once. That defeats the image crispness I mentioned; those were intended as separate comments.)

#15 Xanather   Members   -  Reputation: 703

Posted 26 December 2012 - 11:22 AM

Windowed Mode [Fixed] (fake fullscreen), only because I have dual screens, so I can easily click on a window on my other screen.



#16 alnite   Crossbones+   -  Reputation: 2055

Posted 26 December 2012 - 02:41 PM

Fullscreen, but I also make sure that it's running at my desktop resolution. If my desktop is running at 1440x900 (native), then I'll make sure the game is also running at that resolution, even if I have to downgrade the graphics quality.

This makes switching to the desktop way faster. I've noticed that games typically have to reload textures if the resolution changes.
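Defaulting a game to whatever mode the desktop is already using is straightforward to query on Windows; a small sketch, purely as an illustration:

#include <windows.h>

// Query the current desktop display mode so the game can default to it
// and avoid a mode switch altogether.
bool GetDesktopMode(int* width, int* height, int* refreshHz)
{
    DEVMODE dm = {};
    dm.dmSize = sizeof(dm);
    if (!EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm))
        return false;

    *width     = (int)dm.dmPelsWidth;
    *height    = (int)dm.dmPelsHeight;
    *refreshHz = (int)dm.dmDisplayFrequency;
    return true;
}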

 

But, the last real game I played on PC was Legend of Grimrock.


Edited by alnite, 26 December 2012 - 02:41 PM.


#17 Sik_the_hedgehog   Crossbones+   -  Reputation: 1492

Posted 27 December 2012 - 06:04 AM

Just saw this thread now. Real fullscreen all the way, especially being able to set the video mode. Implement it properly, though! A lot of the bad reputation real fullscreen has comes from badly programmed games completely breaking things =/

 

 

Quoting samoth:
> If that's not supported, fullscreen window, which is the same as the desktop resolution, i.e. the LCD's native resolution. Nothing else makes sense, really. Non-native resolutions are soooooo damaging to my eyes (and why wouldn't you want to use the pixels that you paid for!).

 

It makes sense if you have weak video hardware and can't upgrade for whatever reason (e.g. not enough money). For example, the hardware may be good enough to handle the desktop by itself at the native resolution, but for games you may want to use a lower resolution for the sake of performance. Those who insist that letting games change the resolution is bad always tend to overlook this =P

 

(mind you, if you have a high resolution monitor I suppose it could make sense to drop to a divisor of the native resolution, e.g. 2048×1536 vs 1024×768 - pixels shouldn't bleed if you do this)

 

Quoting Olof Hedman:
> Does anyone know if there is any performance or latency difference between fake and real fullscreen? I'd guess you might have a more direct path to the screen in real fullscreen, but that might be irrelevant today.

 

There used to be issues in the early DirectX days (due to DirectX interfering with GDI), but not anymore. And fullscreen OpenGL on Windows was always technically windowed (it was a borderless window, and the resolution was changed manually).
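That "borderless window plus manual resolution change" approach boils down to ChangeDisplaySettings. A hedged sketch of just the mode switch (error handling trimmed, not taken from any particular engine):

#include <windows.h>

// Switch the display to width x height for a "real fullscreen" style setup.
// CDS_FULLSCREEN marks the change as temporary, so the desktop mode is
// restored automatically when the program exits.
bool SetVideoMode(int width, int height)
{
    DEVMODE dm = {};
    dm.dmSize       = sizeof(dm);
    dm.dmPelsWidth  = (DWORD)width;
    dm.dmPelsHeight = (DWORD)height;
    dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT;

    return ChangeDisplaySettings(&dm, CDS_FULLSCREEN) == DISP_CHANGE_SUCCESSFUL;
}

void RestoreDesktopMode()
{
    ChangeDisplaySettings(NULL, 0);               // back to the desktop/registry mode
}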

 

The main advantage of real fullscreen is that it can change the video mode, giving the game more control over the display. The main disadvantage is that several games like to screw up fullscreen support, so sometimes you get things not working properly after alt+tab, or other monitors blacking out =/

 

Quoting alnite:
> This makes switching to the desktop way faster. I've noticed that games typically have to reload textures if the resolution changes.

 

This issue mainly has to do with DirectX, I believe (see what I said about it interfering with GDI above), since OpenGL requires textures to be preserved. Last I checked, Direct3D 10 was supposed to fix that too. This is also why many games used to close when you alt+tab: they had to reload the textures, and in many cases the programmers chose to quit the program instead of handling that properly.

 

I could be wrong about the APIs, so don't take what I said for granted, but if I recall correctly this is where several of the issues come from.
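For what it's worth, the usual Direct3D 9 pattern for surviving alt+tab out of exclusive fullscreen (rather than quitting) looks roughly like the sketch below. The two helpers that release and recreate D3DPOOL_DEFAULT resources are hypothetical, application-owned functions, and the whole thing is only an outline of the idea:

#include <d3d9.h>

// Hypothetical application-owned helpers that manage D3DPOOL_DEFAULT objects
// (render targets, dynamic buffers, ...), which do not survive a Reset.
void ReleaseDefaultPoolResources();
void RecreateDefaultPoolResources();

// Typical per-frame check for a lost device (e.g. after alt+tab out of
// exclusive fullscreen). Returns true when it is safe to render this frame.
bool HandleLostDevice(IDirect3DDevice9* device, D3DPRESENT_PARAMETERS* pp)
{
    const HRESULT hr = device->TestCooperativeLevel();

    if (hr == D3DERR_DEVICELOST)                  // still lost: skip this frame
        return false;

    if (hr == D3DERR_DEVICENOTRESET)              // ready to be reset
    {
        ReleaseDefaultPoolResources();
        if (FAILED(device->Reset(pp)))
            return false;
        RecreateDefaultPoolResources();
        return true;
    }
    return SUCCEEDED(hr);                         // D3D_OK: render normally
}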


Don't pay much attention to "the hedgehog" in my nick, it's just because "Sik" was already taken =/ By the way, Sik is pronounced like seek, not like sick.

#18 Bacterius   Crossbones+   -  Reputation: 8177

Posted 27 December 2012 - 10:44 AM

I like my games fullscreen because I get distracted otherwise. But I'm also apprehensive about tabbing out: while many games handle it flawlessly, with no delay (assuming I run the game at native resolution) and robustly, an equal number just completely fail. They outright crash, or are impossible to bring back up so I have to kill them, or sometimes even lose their fullscreen attribute, forcing me to open the game's graphics settings and re-enable it (yes, Crysis!).


The slowsort algorithm is a perfect illustration of the multiply and surrender paradigm, which is perhaps the single most important paradigm in the development of reluctant algorithms. The basic multiply and surrender strategy consists in replacing the problem at hand by two or more subproblems, each slightly simpler than the original, and continue multiplying subproblems and subsubproblems recursively in this fashion as long as possible. At some point the subproblems will all become so simple that their solution can no longer be postponed, and we will have to surrender. Experience shows that, in most cases, by the time this point is reached the total work will be substantially higher than what could have been wasted by a more direct approach.

 

- Pessimal Algorithms and Simplexity Analysis


#19 slicer4ever   Crossbones+   -  Reputation: 3211

Posted 30 December 2012 - 05:04 PM

Am I crazy, or does this thread keep getting bumped when no one has actually said anything new? (Does voting in the poll bump it?)


Check out https://www.facebook.com/LiquidGames for some great games made by me on the Playstation Mobile market.

#20 Sik_the_hedgehog   Crossbones+   -  Reputation: 1492

Posted 31 December 2012 - 10:40 PM

Yeah, I think votes bump it =/


Don't pay much attention to "the hedgehog" in my nick, it's just because "Sik" was already taken =/ By the way, Sik is pronounced like seek, not like sick.



