How to create an MSAA shader? (For deferred rendering)

1 comment, last by Vilem Otte 8 years, 11 months ago

I'm sure a lot of people have been asking how to create a pleasing anti-aliasing effect for deferred rendering without falling back on FXAA.

A lot of people unfortunately don't seem to like FXAA, and I hear there are alternatives.

One alternative I heard is that you can use MSAA with deferred rendering, after the geometry has already been drawn and calculated. But, how the hell is that supposed to work?

For one thing, I figured MSAA required the triangle edges in order to calculate the blur samples. If all I'm working with is a texture, how am I supposed to get the triangle edges? Am I supposed to re-calculate them somehow?

I also hear that screen space MSAA can prevent texture blurring, unlike FXAA. But.. I can't find any information on how to implement an MSAA shader, anywhere.

Can anyone point me to a tutorial or something? Or if there isn't one, can anyone help me figure out how to do this? If I can't get MSAA to work, I might as well fall back on FXAA, like a lot of people have, but I want to see if I can do this.



You have some misconceptions here:

MSAA takes a pixel and chops it into 2x2 sub-pixels (or more at higher quality settings). The pixel shader gets run ONCE for the pixel. If all of those 2x2 sub-pixels are inside a triangle, the same color value is written 4 times. If the sub-pixels fall on a triangle edge, the pixel shader value is written only to the sub-pixels the triangle actually covers.

At the end of the day, the buffer is downsampled to ONE pixel by averaging the four sub-pixel values. So: there is no such thing as "screen space MSAA". There is also no blurring; that only happens with FXAA, which has no sub-pixels and so can only smear spots of high contrast.

So when you are talking about deferred, you are talking about making a texture that supports MSAA (meaning it is actually much bigger, because it has to hold all these sub-pixels) and then calling a downsample function at the end. I'm not sure of every detail, but I'm assuming you just call a "gl" command to resolve the MSAA buffer to an actual texture.
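For what it's worth, in core OpenGL that "gl command" is typically a framebuffer blit: blitting from a multisampled FBO to a single-sampled one performs the resolve. A hedged sketch, assuming `msaaFbo`, `resolveFbo`, `width`, and `height` already exist (those names are placeholders):

```c
/* Resolve the multisampled FBO into a regular, single-sampled FBO.
 * The blit averages the samples as part of the copy. */
glBindFramebuffer(GL_READ_FRAMEBUFFER, msaaFbo);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, resolveFbo);
glBlitFramebuffer(0, 0, width, height,   /* source rectangle */
                  0, 0, width, height,   /* dest rectangle   */
                  GL_COLOR_BUFFER_BIT, GL_NEAREST);
```

Note that for the blit to act as a resolve, both framebuffers need matching dimensions and compatible formats.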

Also, this means you are writing more data to your G-buffer for deferred, so it could slow things down significantly. You could try using MSAA only on your color buffer and not any of the other ones.

NBA2K, Madden, Maneater, Killing Floor, Sims http://www.pawlowskipinball.com/pinballeternal

Actually, in my game engine I've implemented full MSAA for the deferred renderer. In OpenGL a simple approach works as follows:

  • Render your scene into G-Buffer, storing everything using multisampled textures with n samples
  • During the shading phase, perform the shading per sample and then resolve

Your G-Buffer shader will still look the same; you just write the output into a texture created with glTexImage2DMultisample. You might also want to set multisampled renderbuffer storage (via glRenderbufferStorageMultisample) for your depth renderbuffer. This modification is fairly simple.
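A sketch of that setup, creating one multisampled color attachment plus a matching depth renderbuffer with 4 samples (the variable names and the GL_RGBA16F format choice are just for illustration; this assumes an FBO is already bound):

```c
GLuint colorTex, depthRb;
const int samples = 4;

/* Multisampled color attachment for the G-Buffer. */
glGenTextures(1, &colorTex);
glBindTexture(GL_TEXTURE_2D_MULTISAMPLE, colorTex);
glTexImage2DMultisample(GL_TEXTURE_2D_MULTISAMPLE, samples,
                        GL_RGBA16F, width, height, GL_TRUE);

/* Multisampled depth renderbuffer; sample count must match. */
glGenRenderbuffers(1, &depthRb);
glBindRenderbuffer(GL_RENDERBUFFER, depthRb);
glRenderbufferStorageMultisample(GL_RENDERBUFFER, samples,
                                 GL_DEPTH_COMPONENT24, width, height);

/* Attach both to the currently bound framebuffer. */
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D_MULTISAMPLE, colorTex, 0);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                          GL_RENDERBUFFER, depthRb);
```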

In the shading phase you pass in the multisampled texture(s); they are declared as sampler2DMS instead of sampler2D, and you have to read samples explicitly using texelFetch, where you also pass which sample you want to read.
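For example, a shading-pass fragment shader reading a multisampled G-Buffer attachment might look like this sketch (uniform names are made up; for brevity it just averages the fetched samples, whereas full per-sample shading would light each sample before averaging):

```glsl
#version 330 core

uniform sampler2DMS gAlbedo;   // multisampled G-Buffer attachment
uniform int numSamples;        // e.g. 4

out vec4 fragColor;

void main()
{
    // sampler2DMS is fetched with integer pixel coordinates, no filtering.
    ivec2 coord = ivec2(gl_FragCoord.xy);
    vec4 sum = vec4(0.0);
    for (int i = 0; i < numSamples; ++i)
        sum += texelFetch(gAlbedo, coord, i);  // third argument = sample index
    fragColor = sum / float(numSamples);
}
```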

This should help you start with the basics.

Just a note about one serious problem: when to resolve? The problem is that if you render a multisampled G-Buffer and resolve during shading, then tone mapping (or basically any other post-processing effect) can ruin the anti-aliasing. The solution is quite simple: render the multisampled G-Buffer, shade per sample writing into a multisampled buffer, apply each post-processing effect on a multisampled buffer producing a multisampled buffer (including tone mapping at the end), and only then resolve. This may sound like a problem, as it needs a lot more computing power and memory.

A lot of game engines, though, resolve during shading, apply post-processing on the already-resolved buffer, and then use the FXAA hack to smooth out any sharp edges that appear (because it is a lot faster, and they think gamers won't notice). I personally don't like FXAA; in my opinion it blurs the whole image and degrades the final image quality. High-quality MSAA (with lots of samples) really does look better.

My current blog on programming, linux and stuff - http://gameprogrammerdiary.blogspot.com

