Destroying Quality to improve Quality
Guess I'm not entirely sure if this is a design issue or more of a technical one, but...
If a screen renders with perfect clarity, it actually tends to make polygons more visible. Visual noise, which I'd consider to include things like bloom and motion blur, makes them less noticeable. Some games (such as Penumbra) even use static on the screen to give a messier view. It sounds undesirable, but this simple effect can actually improve graphical realism quite a bit.
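To make the idea concrete, here's a rough toy sketch (my own example, not from Penumbra or any particular engine) of overlaying random static on a grayscale frame. The function name and parameters are just ones I made up for illustration:

```python
import random

def add_film_grain(frame, strength=0.15, seed=None):
    """Blend random per-pixel noise into a grayscale frame.

    frame: 2-D list of floats in [0, 1]
    strength: how strong the static is (0 = none)
    seed: optional, for reproducible grain
    """
    rng = random.Random(seed)
    out = []
    for row in frame:
        out.append([
            # shift each pixel by noise in [-strength/2, +strength/2], clamped
            min(1.0, max(0.0, p + strength * (rng.random() - 0.5)))
            for p in row
        ])
    return out
```

In a real game you'd do this in a fragment shader rather than on the CPU, but the principle is the same: the grain breaks up the perfectly clean edges that give polygons away.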
So, I guess the question here is: what if we took a design approach to making graphics look more realistic by actually lowering the overall quality of the frame? Think something like a crystal-clear AVI transcoded down to Windows Media Player quality.
It may depend on the game, but I think it could work well for horror-based games that have dark graphics.
I had a similar thought watching the end of Terminator the other day - surely I wasn't convinced by that Jason and the Argonauts-style Terminator skeleton animation all those years ago, was I? Not exactly - the TV I watched it on back then was much poorer quality than the crisp one I have now, and it makes a hell of a difference.
I wouldn't say "destroying quality to increase quality", I'd say "decreasing visual clarity to increase believability". I know what you mean, though. Another way to do this is to use a depth of field effect, or some very subtle motion blurring (depending on the type of game). It also has to do with the art direction - do you want everything to look clean, or do you want rust, peeling paint, and grime?
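A cheap way to fake the depth of field effect mentioned above is to blend a pre-blurred copy of the frame with the sharp one, weighted by how far each pixel's depth is from the focal plane. This is just a sketch of that idea in Python with made-up parameter names (`focal_depth`, `focus_range`); real engines do this per-fragment on the GPU:

```python
def apply_depth_of_field(sharp, blurred, depth, focal_depth, focus_range=0.2):
    """Blend sharp and pre-blurred grayscale frames per pixel.

    sharp, blurred, depth: same-sized 2-D lists of floats
    focal_depth: depth value that is perfectly in focus
    focus_range: depth distance at which pixels become fully blurred
    """
    h, w = len(sharp), len(sharp[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # t = 0 means fully in focus, t = 1 means fully blurred
            t = min(1.0, abs(depth[y][x] - focal_depth) / focus_range)
            out[y][x] = (1.0 - t) * sharp[y][x] + t * blurred[y][x]
    return out
```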
A graphic's technical quality is nothing compared to style. That's why UT looks better than UT2004, and why Morrowind looks better than Oblivion.
But yes, slight blurring can be very effective, and graphically easier than AA.
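For reference, the slight blurring mentioned above can be as simple as a 3x3 box filter over the frame - here's a naive CPU version I put together to illustrate (a real implementation would be a shader pass or a separable filter):

```python
def box_blur(frame):
    """3x3 box blur on a grayscale image; edge pixels clamp to the border."""
    h, w = len(frame), len(frame[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total = 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    # clamp neighbour coordinates to stay inside the image
                    yy = min(h - 1, max(0, y + dy))
                    xx = min(w - 1, max(0, x + dx))
                    total += frame[yy][xx]
            out[y][x] = total / 9.0
    return out
```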
Pardon me if you already know this, but it seems like you might not. Anti-aliasing (the AA that Captain Griffen mentioned) blurs the edges of polygons to stop them from looking pixelated and make the lines look softer. There is also FSAA, full-screen anti-aliasing, which blurs the whole screen slightly. A lot of modern games and video cards support AA; you usually have to turn it on in the game's options.
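For what it's worth, the simplest form of FSAA (supersampling) amounts to rendering at a higher resolution and averaging blocks of pixels down to the display resolution. A minimal sketch of the downsampling step, assuming a grayscale image rendered at 2x:

```python
def downsample_2x(hi_res):
    """Average each 2x2 block of a high-res grayscale image.

    This is the filtering half of naive 2x supersampling: the scene
    is rendered at double resolution, then averaged down, which
    softens jagged polygon edges.
    """
    h, w = len(hi_res), len(hi_res[0])
    return [
        [(hi_res[y][x] + hi_res[y][x + 1]
          + hi_res[y + 1][x] + hi_res[y + 1][x + 1]) / 4.0
         for x in range(0, w, 2)]
        for y in range(0, h, 2)
    ]
```

Hardware MSAA is smarter than this (it only supersamples coverage at polygon edges), but the averaging idea is the same.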
A little blur can help, and I agree that severe blur might only work well in horror games and dark environments. Too much blur might give players headaches though. When looking at something blurry, the eyes have a tendency to try to constantly bring the image into focus, which causes eye strain if you look at it too long.
I'm fully aware of AA, but that's not what I'm talking about. AA makes lines smoother, but it never hides the fact that they're still polygons; in fact, it sometimes makes it worse.
What I'm talking about is much different than AA, and yes, it would work best for horror games but may have uses in other genres as well.