How do you double-check normal map output? (screenshot)

Started by
9 comments, last by Nairou 12 years, 7 months ago
I'm currently trying to learn how to do deferred shading, and in the process I'm trying to make sure my normals are being rendered properly to my FBO texture. I'm using gDEbugger to examine the contents of the FBO textures, and the normal texture is a GL_RGBA16F.

The problem is, I don't know if I am properly transforming my normal map to view space in preparation for deferred lighting. The transformed MRT image appears correct, in terms of the positions of the color variations, but the overall coloring of the result is different from what I've seen in tutorials and other examples. In the examples I've seen, the resulting output remains as vibrant as the original normal map image. That isn't the case for me, but I'm not sure what I'm doing wrong.

Here is a comparison image. The left one is a plain render of the normal map texture to the cube. The one on the right has the normal map transformed into view space.

normal_comparison.jpg


Gradient positions look right in the transformed image, but the overall coloring seems weird. I don't know how well gDEbugger visualizes RGBA16F textures, or how to check what the actual values are (all gDEbugger shows me are 0-255 color channel values).

Here are the (minimal) shaders I'm using currently, if it helps:

Vertex shader:



#version 150

uniform mat4 worldproj;
uniform mat4 world;
in vec4 position;
in vec3 normal;
in vec3 tangent;
in vec3 bitangent;
in vec2 texture;

out vec3 v2f_normal;
out vec3 v2f_tangent;
out vec3 v2f_bitangent;
out vec2 v2f_texture;

void main()
{
    gl_Position = worldproj * position;
    mat3 worldrot = mat3(world);
    v2f_normal = worldrot * normal;
    v2f_tangent = worldrot * tangent;
    v2f_bitangent = worldrot * bitangent;
    v2f_texture = texture;
}


Fragment shader:


#version 150

in vec3 v2f_normal;
in vec3 v2f_tangent;
in vec3 v2f_bitangent;
in vec2 v2f_texture;
out vec4 outColor;
out vec4 outNormal;
out vec4 outPosition;
uniform sampler2D texture0;
uniform sampler2D texture1;

void main()
{
    outColor = texture2D( texture0, v2f_texture );
    outColor.a = 1;
    vec3 normal = vec3( texture2D( texture1, v2f_texture ) ) - 0.5;
    mat3 t;
    t[0] = v2f_tangent;
    t[1] = -v2f_bitangent;
    t[2] = v2f_normal;
    normal = normalize( normal * t );
    outNormal = vec4(normal * 0.5 + 0.5, 1.0);
    outPosition = texture2D( texture1, v2f_texture );
}
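One detail worth checking in this shader, independent of the remap question: in GLSL, `t[0] = v2f_tangent` assigns the first *column* of the matrix, and `normal * t` is the row-vector product, equivalent to `transpose(t) * normal`. For an orthonormal TBN that is the inverse of `t * normal`, so the two orders transform in opposite directions (into vs. out of tangent space). A small Python sketch, using a hypothetical rotated basis rather than the post's data, shows the two products differ:

```python
# Reproduce GLSL's vector/matrix products on the CPU.
# In GLSL, t[0] = ... assigns the first *column* of the matrix, and
# `v * M` treats v as a row vector: (v * M)[i] = dot(v, column i of M).

def vec_mat(v, cols):
    """GLSL v * M, where cols is a list of column vectors."""
    return [sum(v[j] * c[j] for j in range(3)) for c in cols]

def mat_vec(cols, v):
    """GLSL M * v: a linear combination of the columns."""
    return [sum(cols[i][j] * v[i] for i in range(3)) for j in range(3)]

# Hypothetical orthonormal basis (not taken from the post).
tangent   = [0.0, 0.0, 1.0]
bitangent = [1.0, 0.0, 0.0]
normal    = [0.0, 1.0, 0.0]
t = [tangent, [-b for b in bitangent], normal]  # columns, as in the shader

n_ts = [0.3, 0.4, 0.866]  # a sample tangent-space normal
print(vec_mat(n_ts, t))   # [0.866, -0.3, 0.4]  dot products: into the basis
print(mat_vec(t, n_ts))   # [-0.4, 0.866, 0.3]  column combination: out of it
```

With this column layout, `t * normal` is the usual tangent-to-world direction; `normal * t` is its inverse for an orthonormal basis.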
I guess this isn't a very common question? :)

How about this: Is there a way for me to view the values being sampled from a GL_RGBA16F texture? I'm assuming the values are not clamped to 0..1, but I'd like to see what the range actually is, so I know whether my math for converting them to -1..1 is right.

To render your normal map, just map -1..1 to 0..1 (color = normal * 0.5 + 0.5). Then start with a plain sphere without any normal map applied. This way you can "read" the normal map, i.e.
red = (1,0,0) = x-axis
green = (0,1,0) = y-axis
blue = (0,0,1) = z-axis
Just from briefly looking over your shader, when you get the normal from the normal map texture you should do value * 2 - 1 to bring it into the range [-1, 1] before you do the tangent space transformation.
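The difference between the two decodes is just range, which is easy to check in isolation (plain Python, only to illustrate the arithmetic):

```python
# A normal map stores each component in [0, 1]. Two candidate decodes:
stored = [0.0, 0.5, 1.0]  # sample stored channel values

sub_half = [v - 0.5 for v in stored]    # [-0.5, 0.0, 0.5]
mul_two  = [v * 2 - 1 for v in stored]  # [-1.0, 0.0, 1.0]

# Only v * 2 - 1 recovers the full [-1, 1] range; v - 0.5 halves every
# component. Since (v - 0.5) is exactly half of (v * 2 - 1), a single
# vector normalizes to the same direction either way, but any math done
# before the normalize sees the wrong magnitudes.
```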

I originally thought that as well, but I must be doing something wrong, as that gives even weirder results.

Changing the fragment shader to this:



#version 150

in vec3 v2f_normal;
in vec3 v2f_tangent;
in vec3 v2f_bitangent;
in vec2 v2f_texture;
out vec4 outColor;
out vec4 outNormal;
out vec4 outPosition;
uniform sampler2D texture0;
uniform sampler2D texture1;

void main()
{
    outColor = texture2D( texture0, v2f_texture );
    outColor.a = 1;
    vec3 normal = vec3( texture2D( texture1, v2f_texture ) ) * 2 - 1;
    mat3 t;
    t[0] = v2f_tangent;
    t[1] = -v2f_bitangent;
    t[2] = v2f_normal;
    normal = normalize( normal * t );
    outNormal = vec4(normal, 1.0);
    outPosition = texture2D( texture1, v2f_texture );
}


Results in this:

normal2.jpg
Try doing as Ashaman73 says; render a 'plain' normal map on a sphere, map the normals into [0,1] and visualize the result.
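That suggestion can be sanity-checked even without rendering: on a unit sphere the surface normal equals the position, so you can encode a few hand-picked normals and see which colors to expect. A minimal Python sketch (hypothetical sample points, assuming +z toward the viewer):

```python
def encode(n):
    """Map a unit normal's components from [-1, 1] into displayable [0, 1]."""
    return [c * 0.5 + 0.5 for c in n]

# On a unit sphere, the surface normal at a point equals the point itself.
right  = [1.0, 0.0, 0.0]  # point on the +x side of the sphere
top    = [0.0, 1.0, 0.0]  # point on the +y side
toward = [0.0, 0.0, 1.0]  # point facing the viewer

print(encode(right))   # [1.0, 0.5, 0.5] -> reddish
print(encode(top))     # [0.5, 1.0, 0.5] -> greenish
print(encode(toward))  # [0.5, 0.5, 1.0] -> bluish
```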

This looks good and correct.
As already said, red = x-axis; the right cube's surface normals seem to point in the correct direction.
Green = y-axis; this seems correct too when looking at the upper half of the spheres.
Finally, blue = z-axis, pointing into the screen. This also seems correct; in a deferred shader not all z values are negative, there are always some that end up positive.
The black areas are just negative values, which the video card clamps to 0; to display them correctly, use a mapping from -1..1 to 0..1.
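That clamping is easy to demonstrate numerically: displaying a signed normal directly zeroes out the negative components, while the 0.5 * n + 0.5 remap keeps them visible. A small Python sketch (sample normal values are assumed, not taken from the screenshots):

```python
def clamp01(v):
    """What the display does to out-of-range values."""
    return [min(max(c, 0.0), 1.0) for c in v]

def remap(v):
    """Map [-1, 1] into [0, 1] before display."""
    return [c * 0.5 + 0.5 for c in v]

n = [-0.6, 0.8, -0.4]  # a sample view-space normal with negative parts

print(clamp01(n))  # [0.0, 0.8, 0.0]  negative x and z show as black
print(remap(n))    # [0.2, 0.9, 0.3]  all components survive
```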

I only just noticed last night that the color differences were due to the blue (z-axis) pointing into the screen rather than outward. I thought normals were supposed to point outward from the surface. Is it different for deferred shading? Why do deferred shading tutorials always show the rendered normals as very blue, with the z-axis pointing towards the screen, just as in regular normal mapping? I've yet to find a tutorial that covers this.

I'm also curious why, with the second version of the fragment shader (remapping 0..1 to -1..1 before the transformation), the colors don't blend as well as in the original version (remapping after the transformation). Green appears very strongly in the upper half of the sphere, then cuts off in the middle, whereas the original screenshot shows the green much more blended, like the original normal map texture. I would have assumed this means the second version is wrong, since the sphere should curve smoothly. Perhaps my math is still bad somewhere?

It is just a question of coordinate space. XYZ -> RGB, where positive Z is toward the screen surface. I, for one, use that coordinate space since it seems logical.

Cheers!

The orientation of the z component of the normals will depend on the handedness of the coordinate system you are using; it looks like your view space is left-handed, since +x goes to the right, +y goes up, and +z goes into the screen (hence you see very little blue).
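The handedness can be checked with a cross product: with +x to the right and +y up, cross(x, y) gives the +z axis of a right-handed basis, pointing out of the screen. A quick Python check (axis conventions are assumed, not taken from the thread):

```python
def cross(a, b):
    """Cross product of two 3D vectors."""
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

x_right = [1.0, 0.0, 0.0]  # +x to the right
y_up    = [0.0, 1.0, 0.0]  # +y up

print(cross(x_right, y_up))  # [0.0, 0.0, 1.0]: right-handed +z, out of screen
# If the +z axis of your view space instead points *into* the screen,
# the basis is left-handed, which matches seeing very little blue in
# the view-space normal render.
```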

As for the colours not "blending as well" - it's because you're not remapping into [0,1] to visualize the normals.

