# OpenGL recalculating normals in geometry shader

## Recommended Posts

I've been messing around with displacement maps and other types of vertex movement in the vertex shader. The problem I keep running into is that once the vertices are moved, the normals are no longer correct. So I figured it wouldn't be too difficult to recalculate the normals in the geometry shader, since there you have access to all the vertices that make up a face. Here's the geometry shader I created to do this. It takes GL_TRIANGLES as input, and gl_Position isn't multiplied by gl_ModelViewProjectionMatrix until it's output by the geometry shader.
```glsl
#version 120
#extension GL_EXT_geometry_shader4 : enable

varying vec3 eyeSpaceNormal; // output normal for the fragment shader

void main( void )
{
    // calculate one normal for this face
    eyeSpaceNormal = gl_NormalMatrix * normalize(cross(gl_PositionIn[1].xyz - gl_PositionIn[0].xyz,
                                                       gl_PositionIn[2].xyz - gl_PositionIn[0].xyz));

    for (int i = 0; i < gl_VerticesIn; i++)
    {
        gl_Position = gl_ModelViewProjectionMatrix * gl_PositionIn[i]; // multiply position by the MVP matrix
        EmitVertex();
    }
    EndPrimitive();
}
```


This works fine, but it only calculates the normal once per face, so you get a faceted look. I thought of two possible solutions:

a) Use GL_TRIANGLES_ADJACENCY as the input to the geometry shader, which would give you access to the adjacent vertices. You could then calculate the normals of the adjacent faces and average them together to get smooth normals. I haven't been able to do this; beyond the OpenGL specs there is almost no info on geometry shaders available on the web. It also seems like a lot of work, especially since my models use tangent and bitangent information which I would also need to recalculate.

b) Rotate the original normals (which are already smoothed) using the new face normal that was calculated, and rotate the tangent and bitangent vectors the same way. But honestly, I studied graphic design, so my math skills are not up to par, and I'm not even sure this would work. Here's what I've come up with so far, but I'm getting weird results:
```glsl
#version 120
#extension GL_EXT_geometry_shader4 : enable

varying in vec3 normalIn[];  // original smoothed normals
varying vec3 eyeSpaceNormal; // fixed normals for the fragment shader

void main( void )
{
    vec3 p1 = (gl_PositionIn[1] - gl_PositionIn[0]).xyz;
    vec3 p2 = (gl_PositionIn[2] - gl_PositionIn[0]).xyz;
    vec3 X = normalize(p1);
    vec3 Y = normalize(cross(p1, p2));
    vec3 Z = cross(X, Y);
    mat3 rot = gl_NormalMatrix * mat3(X, Y, Z); // rotation matrix

    for (int i = 0; i < gl_VerticesIn; i++)
    {
        gl_Position = gl_ModelViewProjectionMatrix * gl_PositionIn[i];
        eyeSpaceNormal = normalIn[i] * rot; // multiply original normal by rotation matrix
        EmitVertex();
    }
    EndPrimitive();
}
```


Part of the problem seems to be that every other polygon's normal is flipped, probably because of the winding. I also think that multiplying the rot matrix by gl_NormalMatrix might not be working, but I really have no clue at this point. Any help would be awesome. [Edited by - sgsrules on November 9, 2009 5:30:34 PM]
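For reference, idea (b) boils down to building the rotation that takes the original face normal onto the deformed face normal, then applying that same rotation to each smoothed normal (and to the tangent and bitangent). A minimal Rodrigues-style sketch, assuming the undeformed positions are also passed into the geometry shader (origPosIn is a made-up varying, not something from my code):

```glsl
// Rotates v by the rotation taking unit vector 'from' onto unit vector 'to'
// (Rodrigues' rotation formula). 'from' and 'to' must be normalized.
vec3 rotateOnto(vec3 from, vec3 to, vec3 v)
{
    vec3 axis = cross(from, to);
    float s = length(axis);  // sin(angle)
    float c = dot(from, to); // cos(angle)
    if (s < 1e-6) return c > 0.0 ? v : -v; // (near-)parallel: degenerate case
    axis /= s;
    return v * c + cross(axis, v) * s + axis * dot(axis, v) * (1.0 - c);
}

// inside main(), once per face:
// vec3 oldN = normalize(cross(origPosIn[1].xyz - origPosIn[0].xyz,
//                             origPosIn[2].xyz - origPosIn[0].xyz));
// vec3 newN = normalize(cross(gl_PositionIn[1].xyz - gl_PositionIn[0].xyz,
//                             gl_PositionIn[2].xyz - gl_PositionIn[0].xyz));
// then per vertex i:
// eyeSpaceNormal = gl_NormalMatrix * rotateOnto(oldN, newN, normalIn[i]);
```

The same rotateOnto call would apply to the tangent and bitangent, which is the attraction of this approach: one rotation per face, reused for the whole TBN.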

##### Share on other sites
Your first solution is okay, although you're computing one normal per primitive when you should be computing one normal per vertex.
So you should compute your normal inside the for loop, not outside it.

I'm not sure if you can write it like this (also, with this solution you need to make sure that your model contains JUST triangles and no other primitives), as I still don't have much experience with geometry shaders (but I'm learning them :P).
```glsl
#version 120
#extension GL_EXT_geometry_shader4 : enable

varying vec3 eyeSpaceNormal;

void main( void )
{
    eyeSpaceNormal = gl_NormalMatrix * normalize(cross(gl_PositionIn[1].xyz - gl_PositionIn[0].xyz, gl_PositionIn[2].xyz - gl_PositionIn[0].xyz));
    gl_Position = gl_ModelViewProjectionMatrix * gl_PositionIn[0];
    EmitVertex();

    eyeSpaceNormal = gl_NormalMatrix * normalize(cross(gl_PositionIn[2].xyz - gl_PositionIn[1].xyz, gl_PositionIn[0].xyz - gl_PositionIn[1].xyz));
    gl_Position = gl_ModelViewProjectionMatrix * gl_PositionIn[1];
    EmitVertex();

    eyeSpaceNormal = gl_NormalMatrix * normalize(cross(gl_PositionIn[0].xyz - gl_PositionIn[2].xyz, gl_PositionIn[1].xyz - gl_PositionIn[2].xyz));
    gl_Position = gl_ModelViewProjectionMatrix * gl_PositionIn[2];
    EmitVertex();

    EndPrimitive();
}
```

As I'm not on a PC that can debug geometry shaders, I can't test whether this really emits good vertices and distributes eyeSpaceNormal per vertex rather than per primitive.
This MIGHT work; I can't be sure that it will. Just give it a try.

##### Share on other sites
It's pointless to compute the normals inside the loop; you still get the same normal, since you're using the same three points of that face. I did it once outside the loop to speed things up.

##### Share on other sites
You could sample the displacement map around the current vertex to estimate the normal from it in the vertex shader.
Using the adjacency information would probably provide a reasonable approximation, as you say (you get four surrounding vertices for each vertex), though there's no guarantee there aren't additional triangles sharing a vertex apart from those you know about from the adjacency input.
Depending on how expensive you're willing to make the shader, you could combine the two methods, using adjacent vertices to estimate what area of the displacement map should be sampled.

(I'm not sure what GLSL version is necessary to do that, but I assume it should be available in the geometry shader).
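To illustrate the first suggestion, here's a sketch of estimating the normal in the vertex shader by central-differencing the displacement map. The sampler name, texel size, and displacement scale are assumptions for the example, and it assumes a roughly flat base patch (for a general mesh you'd rotate the estimated normal by the vertex's TBN first):

```glsl
#version 120

uniform sampler2D displacementMap; // assumed heightmap sampler
uniform float texelSize;           // 1.0 / heightmap resolution (assumption)
uniform float dispScale;           // displacement strength (assumption)

varying vec3 eyeSpaceNormal;

void main( void )
{
    vec2 uv = gl_MultiTexCoord0.st;

    // central differences of the height field around this vertex
    // (texture2DLod with lod 0 since vertex shaders have no implicit lod)
    float hL = texture2DLod(displacementMap, uv - vec2(texelSize, 0.0), 0.0).r;
    float hR = texture2DLod(displacementMap, uv + vec2(texelSize, 0.0), 0.0).r;
    float hD = texture2DLod(displacementMap, uv - vec2(0.0, texelSize), 0.0).r;
    float hU = texture2DLod(displacementMap, uv + vec2(0.0, texelSize), 0.0).r;

    // the height-field gradient gives a normal estimate
    vec3 n = normalize(vec3((hL - hR) * dispScale,
                            (hD - hU) * dispScale,
                            2.0 * texelSize));
    eyeSpaceNormal = gl_NormalMatrix * n;

    // displace the vertex along its original normal
    float h = texture2DLod(displacementMap, uv, 0.0).r;
    vec4 displaced = gl_Vertex + vec4(gl_Normal * h * dispScale, 0.0);
    gl_Position = gl_ModelViewProjectionMatrix * displaced;
}
```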

##### Share on other sites
I'm sure the adjacency information would work, but I'm trying to keep the computations down. Calculating all the extra normals and averaging them, then doing the same for the bitangents and tangents, would be pretty costly. Plus I still haven't been able to successfully compile anything using triangles with adjacency as input.

I guess what I mostly want to be able to do is rotate the existing TBN (tangent, bitangent, normal) using some sort of matrix, if that's possible. I've already got all the smoothed TBN stuff precalculated on the CPU. It'd be nice if I could adjust it using a rotation matrix once per polygon.

##### Share on other sites
Exactly why do you need this? Can't you combine the normal and displacement maps so that the changed normal is already present in the normal map?
In all the displacement mapping I've seen, the displacement is calculated from the normal map, to make the actual geometry match the normal map, which is already faking the deformation caused by the displacement.
I might be misunderstanding what you are trying to do. =)

##### Share on other sites
I'm not just using this for displacement maps but also on different types of object warps and deformations. I also use the TBN vectors for other types of effects, not just normal mapping.

##### Share on other sites
With an arbitrary deformation of vertices, where each vertex can be connected to an unknown number of other vertices, there's not really any easy way to do this in a shader. If you deform by a mathematical function, then you could use that to re-calculate the normals. Google gave me for example http://http.developer.nvidia.com/GPUGems/gpugems_ch42.html, and http://www.ozone3d.net/tutorials/mesh_deformer.php. If you instead deform by a displacement map, then sampling that to approximate a local function could work in the same way.
If your deformations allow more or less random displacements of different vertices, then you probably can't use a matrix, since the matrix would be different for each vertex, and calculating that matrix would require the same work as re-calculating the normals (adjacent surfaces etc).
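As an illustration of deforming by a mathematical function and re-deriving the normal analytically from it (the wave and its parameters are made up for the example, and it assumes a surface that is roughly a height field over the xz-plane):

```glsl
#version 120

uniform float time;      // assumed animation parameter
uniform float amplitude; // wave height (assumption)
uniform float frequency; // wave frequency (assumption)

varying vec3 eyeSpaceNormal;

void main( void )
{
    // displace along y by h(x) = amplitude * sin(frequency * x + time)
    vec4 p = gl_Vertex;
    p.y += amplitude * sin(frequency * p.x + time);

    // the analytic derivative dh/dx gives the exact new normal:
    // for a surface y = h(x), the normal is proportional to (-dh/dx, 1, 0)
    float dhdx = amplitude * frequency * cos(frequency * p.x + time);
    vec3 n = normalize(vec3(-dhdx, 1.0, 0.0));

    eyeSpaceNormal = gl_NormalMatrix * n;
    gl_Position = gl_ModelViewProjectionMatrix * p;
}
```

Since the normal comes from the deformation function itself, no adjacency information or averaging is needed; the same derivative trick extends to the tangent and bitangent.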

##### Share on other sites
Do it in 3 passes.
First, render to an RGBA32f FBO with 1 target. Here you calculate the vertex displacements. You can use transform feedback instead, too. Otherwise cast the data to DynVBO1. (data: struct{ vec4 position; }; )
Second pass: start rendering into another RGBA32f target that has at least NumVerts pixels. Clear to black and switch to additive blending. A geometry shader takes triangles as input, formed solely from DynVBO1. It calculates the triangle normal and generates 3 primitives: 3 one-pixel points. Aside from the gl_Position for those points (at coords that map the given vertex ID to its pixel on the FBO), a varying "vec4 normalAccum = vec4(nx, ny, nz, 1)" is sent straight to the fragment shader for gl_FragColor.
This all simply makes the gpu do the scatter-write for you.
Now, cast that RGBA32f target to DynVBO2 (data: struct{ vec4 accumNormal; }; )
accumNormal is not normalized, you can either normalize it in another pass, or simply have your next vtx-shaders that will use DynVBO1 and DynVBO2 normalize it.

Attach DynVBO1 for the vertex position, DynVBO2 for the vertex normal, and StaticVBO1 for the static data with 1-14 attribs. Inside the fragment shader, you can easily recompute the TBN out of the normal and dFdx/dFdy.

You can merge pass1 and pass2, if FBO-switching is more expensive than recalculating the vertex-displacements ~3 times.
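A sketch of what the pass-2 geometry shader described above might look like. The target dimensions, the vtxIdIn varying (carrying a per-vertex ID), and the ID-to-pixel mapping are all assumptions for the example; the point output type would be set with glProgramParameteri on the application side:

```glsl
#version 120
#extension GL_EXT_geometry_shader4 : enable
// input: triangles from DynVBO1; output: 3 one-pixel points whose positions
// map each vertex ID to a pixel of the additively-blended accumulation target

uniform float targetWidth;  // width of the NumVerts-pixel FBO (assumption)
uniform float targetHeight; // height of that FBO (assumption)

varying in float vtxIdIn[];   // per-vertex ID passed through the vertex shader
varying out vec4 normalAccum; // summed per vertex by additive blending

void main( void )
{
    // one (unnormalized) face normal for the whole triangle
    vec3 n = cross(gl_PositionIn[1].xyz - gl_PositionIn[0].xyz,
                   gl_PositionIn[2].xyz - gl_PositionIn[0].xyz);

    for (int i = 0; i < 3; i++)
    {
        // map the vertex ID to the NDC coords of its pixel in the target
        float id = vtxIdIn[i];
        float x = (mod(id, targetWidth) + 0.5) / targetWidth * 2.0 - 1.0;
        float y = (floor(id / targetWidth) + 0.5) / targetHeight * 2.0 - 1.0;
        gl_Position = vec4(x, y, 0.0, 1.0);
        normalAccum = vec4(n, 1.0);
        EmitVertex();
        EndPrimitive(); // each point is its own primitive
    }
}
```

The alpha channel accumulates the number of contributing faces, so after this pass each pixel holds the unnormalized sum of the face normals around that vertex.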
