How are you storing your normals? Do you store them in XYZ form, or do you use some form of packing?
I got this kind of artifact when I used a spherical mapping to pack the normals; I got similar artifacts at the poles.
Noise in deferred renderer
Oh, I store the normal in an RGBA8 format, storing only the x and y components (it's a view-space normal; the z is reconstructed manually).
Anyway, here's the DEMO.
Please inspect it and submit your report here... thanks, I really need help. I'm sure I can get some here.
Well, incidentally, that seems to be my friend's graphics card. He's using the 'new' card from ATI; maybe ATI's drivers are buggy, or maybe it's an error in my shader (I'm not sure).
There was some discussion about view-space normals, storing only the X and Y components and reconstructing Z. This does not work every time: the sign of Z can also be negative (with a perspective projection, visible surfaces at grazing angles can have view-space normals pointing slightly away from the camera).
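To see why the reconstruction loses the sign, here is a small numeric sketch in Python (an illustration outside the shader; the helper name is mine). It packs only x and y and reconstructs z as the positive root:

```python
import math

def reconstruct_z(x, y):
    # Reconstruct z from a unit normal's x and y, assuming z >= 0.
    return math.sqrt(max(0.0, 1.0 - x * x - y * y))

# A valid view-space normal whose z is negative (grazing-angle surface).
n = (0.6, 0.0, -0.8)
z = reconstruct_z(n[0], n[1])
# The magnitude is recovered, but the sign is not:
print(z)  # 0.8, not -0.8
```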
And 8 bits is very poor precision for normals; you should really go to an RG16F (two 16-bit half-float channels) format.
On this page you will find many techniques for storing your normals:
http://aras-p.info/texts/CompactNormalStorage.html
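For reference, one of the techniques described on that page (the "spheremap transform") keeps the sign of Z and can be sketched like this in Python; this is an illustration of the math, not the thread's shader code, and the function names are mine:

```python
import math

def encode(n):
    # Map a unit normal to two values in [0, 1] (spheremap transform).
    # Breaks down only at exactly n = (0, 0, -1), where p is zero.
    x, y, z = n
    p = math.sqrt(z * 8.0 + 8.0)
    return (x / p + 0.5, y / p + 0.5)

def decode(enc):
    # Recover the full normal, including the sign of z.
    fx, fy = enc[0] * 4.0 - 2.0, enc[1] * 4.0 - 2.0
    f = fx * fx + fy * fy
    g = math.sqrt(1.0 - f / 4.0)
    return (fx * g, fy * g, 1.0 - f / 2.0)

n = (0.267261, 0.534522, 0.801784)  # normalize((1, 2, 3))
print(decode(encode(n)))            # round-trips to ~n
```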
Same issue here on a Radeon HD 5470, Radeon HD 6800, Radeon HD 2900 XT, GeForce 8600 GT and some GeForce GTX (I don't know which, though)... Yes, I have that many computers running at home (and no one is sponsoring me :( ... I have to run the business myself :P).
Okay, I tried to fix your shaders and fixed them... two ways are possible (the one 0xffffffff described and the one I described):
First Way (pass 3-channel normal to g-buffer):
In your source GBuffer.frag
//Scale And Bias Normal
vNewNormal.xyz = normalize(vNewNormal.xyz);
vNormal.xyz = (vNewNormal.xyz * 0.5) + 0.5;
vNormal.w = 1.0; //Unused....
And in your source PointLight.frag
vec3 ExtractNormal(in vec4 vColor)
{
    vec3 outnormal = (vColor.xyz * 2.0) - 1.0;
    return normalize(outnormal);
}
You had this code uncommented, so you probably used it (this way it will work) and probably know the background behind it. By the way, you must normalize the result.
Second Way (reconstruct normals using dirty sqrt):
First, let me describe how sqrt works. On the CPU it is often based on Heron's method of finding a square root, which is (source: Wikipedia):
1. Begin with an arbitrary positive starting value x0 (the closer to the root, the better).
2. Let x(n+1) be the average of x(n) and S / x(n) (using the arithmetic mean to approximate the geometric mean).
3. Repeat step 2 until the desired accuracy is achieved.
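The steps above can be sketched as a small Python function (a sketch of the textbook iteration, not of any GPU's actual hardware unit):

```python
def heron_sqrt(s, iterations=20):
    # Heron's method: start with a positive guess and repeatedly
    # average x with s / x until the value settles on sqrt(s).
    x = s if s > 1.0 else 1.0  # arbitrary positive starting value
    for _ in range(iterations):
        x = 0.5 * (x + s / x)
    return x

print(heron_sqrt(2.0))  # ~1.41421356...
```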
This is quite a good approximation, although GPUs do it somewhat differently (with worse precision, much worse). If you also account for float imprecision on GPUs, you'll find that it can return NaNs (Not-a-Number values).
So what is causing the problem?
This:
outnormal.z = sqrt(1.0 - (outnormal.x*outnormal.x) - (outnormal.y*outnormal.y));
Count with me; let me show you how this can happen:
// What if outnormal.x*outnormal.x + outnormal.y*outnormal.y > 1.0
// (e.g. 1.00010001000100010001)?
// According to the float representation it can be (due to its imprecision).
// Then:
outnormal.z = sqrt(1.0 - 1.00010001000100010001);
outnormal.z = sqrt(-0.00010001000100010001);
outnormal.z = 0xffffffff; // or, if you wish, outnormal.z = NaN;
// And thus you get noise wherever this can happen
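The 8-bit quantization itself is enough to push the stored x and y outside the unit circle, which is exactly when the sqrt argument goes negative. A quick numeric check in Python (the helper name is mine; where GLSL's sqrt would give NaN, Python's math.sqrt raises instead):

```python
import math

def quantize8(v):
    # Store v in [-1, 1] as an 8-bit channel value and decode it again.
    return round((v * 0.5 + 0.5) * 255.0) / 255.0 * 2.0 - 1.0

# A perfectly valid unit normal on the equator...
x = y = 1.0 / math.sqrt(2.0)
qx, qy = quantize8(x), quantize8(y)
s = qx * qx + qy * qy
print(s)                # ~1.00764 -- greater than 1 after quantization
print(1.0 - s < 0.0)    # True: sqrt of this is NaN on the GPU
```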
So how to fix it?
It is simple: just take the absolute value of the whole sqrt argument, (1.0 - (outnormal.x*outnormal.x) - (outnormal.y*outnormal.y)), so it can never be negative, like this:
vec3 ExtractNormal(in vec4 vColor)
{
    vec3 outnormal;
    outnormal.xy = (vColor.xy * 2.0) - 1.0;
    outnormal.z = sqrt(abs(1.0 - (outnormal.x*outnormal.x) - (outnormal.y*outnormal.y)));
    return normalize(outnormal);
}
Bam... and the noise is gone! There is also a solution using clamp (or the min/max functions), which is faster (and probably more stable).
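The clamp variant can be illustrated numerically in Python (a sketch of the same math as the GLSL fix; the function name is mine):

```python
import math

def extract_z_clamped(x, y):
    # Clamp the sqrt argument to zero so float error can never drive
    # it negative (the GLSL equivalent: sqrt(max(0.0, 1.0 - x*x - y*y))).
    return math.sqrt(max(0.0, 1.0 - x * x - y * y))

# Values slightly outside the unit circle (e.g. from 8-bit rounding)
# now yield z = 0 instead of NaN:
print(extract_z_clamped(0.7098, 0.7098))  # 0.0
```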
Also note that when you store normals in R8G8 (8-bit values) you lose precision... use a GL_RGB10_A2 internal format instead of GL_RGBA8 ;) in glTexImage2D (while generating the texture for the framebuffer), like this:
glGenTextures(1, &mrt_buffer[1]);
glBindTexture(GL_TEXTURE_2D, mrt_buffer[1]);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB10_A2, (int)wndWidth, (int)wndHeight, 0, GL_RGBA, GL_FLOAT, 0);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
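The precision gain is easy to quantify: an n-bit channel covering [-1, 1] has a worst-case step of 2 / (2^n - 1). A quick comparison (plain arithmetic, not OpenGL):

```python
# Quantization step of a [-1, 1] value per channel bit depth.
step8  = 2.0 / (2**8  - 1)   # GL_RGBA8 channel:    ~0.00784
step10 = 2.0 / (2**10 - 1)   # GL_RGB10_A2 channel: ~0.00196
print(step8, step10, step8 / step10)  # 10 bits is ~4x finer
```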
Hi Vilem, seriously, did you fix it? I mean, DID YOU FIX IT? Thank God you're there to help, but I just can't believe it; please confirm. Did you really fix it, and is the noise gone? I can't test the app now because none of the PCs here support shaders (damn old PCs). You saw the noise?
After the first and second fixes (using clamp, not abs, as the clamp version is better):
here - http://www.mediafire.com/?qk6sxq5u6wyu4q3
and here - http://www.mediafire.com/?dp3ccvx3grtrcin
Uploading for others, if they wish to test whether it really runs okay.
Thanks for your time, I'll rate you up. I can't test right now, but I will in the next few days; it's nice to get help from someone who knows better. Phew... by the way, can I take it that it ran fine on (all of) your PCs? It's a relief that you know how to kill a bug in the code.
Such effort deserves support. Running on a GeForce 8500 GT: the original code produces artifacts; with Vilem Otte's solutions they vanish. Good work.
Off topic: This minimalistic mesh is cute (in its ugliness).