Normal calculations

7 comments, last by sprite_hound 17 years, 1 month ago
I know this is a fairly well-covered topic generally, but I seem to be having problems with it. I'm (still) working on a .obj file loader, and have been trying to write a function to calculate normals. I have two versions of this, one that creates flat face normals for each vertex, and one that averages them for each vertex. Here they are:

// Creates flat face normals
void cObjModel::calcNormals() {
	normal.clear();
	for (vector<cObject>::iterator i = object.begin(); i != object.end(); ++i) {
		for (vector<cObjFace>::iterator j = i->face.begin(); j != i->face.end(); ++j) {
			// Get the verts.
			const cObjPoint3d &vert1 = vertex[j->vertex[0]];
			const cObjPoint3d &vert2 = vertex[j->vertex[1]];
			const cObjPoint3d &vert3 = vertex[j->vertex[2]];
			// Get 2 co-planar vectors
			cObjPoint3d v1 = vert1 - vert2;
			cObjPoint3d v2 = vert2 - vert3;
			// Get the cross product of those vectors
			cObjPoint3d vec3(v1.y*v2.z - v1.z*v2.y, v1.x*v2.z - v1.z*v2.x, v1.x*v2.y - v1.y*v2.x);
			// Normalise
			float length = sqrt( pow(vec3.x,2) + pow(vec3.y,2) + pow(vec3.z,2));
			if (length == 0.f) {length = 1.f;}
			vec3.x /= length;
			vec3.y /= length;
			vec3.z /= length;
			// Add to normals vec
			normal.push_back(vec3);
			// Clear the faces normals, create enough space and fill with the right info.
			j->normal.clear();
			fill_n(back_inserter(j->normal), j->vertex.size(), normal.size()-1);
			errorLog.print("Norm: %f, %f, %f\n", vec3.x, vec3.y, vec3.z);
		}
	}
}

// Creates averaged normals for each vertex.
void cObjModel::calcNormals() {
	normal.clear();
	fill_n(back_inserter(normal), vertex.size(), cObjPoint3d());
	for (vector<cObject>::iterator i = object.begin(); i != object.end(); ++i) {
		for (vector<cObjFace>::iterator j = i->face.begin(); j != i->face.end(); ++j) {
			// Get the verts.
			const cObjPoint3d &vert1 = vertex[j->vertex[0]];
			const cObjPoint3d &vert2 = vertex[j->vertex[1]];
			const cObjPoint3d &vert3 = vertex[j->vertex[2]];
			// Get 2 edge vectors
			cObjPoint3d v1 = vert1 - vert2;
			cObjPoint3d v2 = vert2 - vert3;
			// Get the cross product of those vectors
			cObjPoint3d vec3(v1.y*v2.z - v1.z*v2.y, v1.x*v2.z - v1.z*v2.x, v1.x*v2.y - v1.y*v2.x);
			for (vector<unsigned int>::iterator k = j->vertex.begin(); k != j->vertex.end(); ++k) {
				normal[*k] += vec3;
			}
			j->normal.clear();
			copy(j->vertex.begin(), j->vertex.end(), back_inserter(j->normal));
		}
	}
	for (vector<cObjPoint3d>::iterator i = normal.begin(); i != normal.end(); ++i) {
		// Normalise
		float length = sqrt( pow(i->x,2) + pow(i->y,2) + pow(i->z,2));
		if (length == 0.f) {length = 1.f;}
		i->x /= length;
		i->y /= length;
		i->z /= length;
	}
}

As far as I can tell both of these are working fine, and my model shows up as I'd expect it to. But when I rotate by 90 degrees on the x axis all I get is a black screen (the surfaces just darken as I rotate up to 90, then lighten again).

I think this might be because the models I'm trying to load don't have a consistent winding direction, so when I calculate my own normals some faces show up properly and others don't. Also, if I change the order of the two edge vectors, I can reverse which faces show up, which seems to support that conclusion. If that's the case, I really don't know how to get around it. I tried glLightModelf(GL_LIGHT_MODEL_TWO_SIDE, 1), which I think should work if I've simply managed to reverse the normals, but it doesn't.

So I don't really know what to try. Any help is much appreciated. :)
It's best to find a way to unify the winding order for the polygons; after all, CULL_FACE is a very useful and simple optimization that requires it. Since you're computing the normals yourself, you can probably fix them by checking the normal against the center of the object, or something similar. Other than that, the per-face version looks fine, at least in theory. I'm not sure about your per-vertex one:
for (vector<unsigned int>::iterator k = j->vertex.begin(); k != j->vertex.end(); ++k) {
	normal[*k] += vec3;
}


This seems rather funny to me: it looks like you're just adding the same normal three times and then normalising it, which would pretty much just give you the face normal again.
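For vertices shared between several faces the accumulation does more than re-derive the face normal: every face touching a vertex adds its (unnormalised) normal into the same slot, and normalising afterwards averages the directions. A minimal standalone sketch of the idea (type and function names are illustrative, not the thread's actual classes):

```cpp
#include <cmath>
#include <vector>

// Illustrative stand-ins, not the poster's actual classes.
struct Vec3 { float x, y, z; };

Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    if (len == 0.f) len = 1.f;          // avoid divide-by-zero, as in the thread
    return Vec3{v.x/len, v.y/len, v.z/len};
}

// Add one face's (unnormalised) normal into the slot of each of its vertices.
// Vertices shared between faces collect contributions from every face that
// touches them; normalising afterwards yields the averaged direction.
void accumulate(std::vector<Vec3>& vertexNormals,
                const unsigned int face[3], Vec3 faceNormal) {
    for (int k = 0; k < 3; ++k) {
        Vec3& n = vertexNormals[face[k]];
        n.x += faceNormal.x;
        n.y += faceNormal.y;
        n.z += faceNormal.z;
    }
}
```

With two faces sharing vertex 0 (face normals (0,0,1) and (0,1,0)), vertex 0 ends up with the blended direction (0, 0.707, 0.707), which is neither face's normal.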

As for the lighting thing, I suggest writing a simple 'draw normals' function and seeing for yourself whether your normals are correct. It's simple to do and you don't have to guess.

If they are good, it's probably some weird light rotation + object rotation problem.

Hopefully this helped some. Good Luck

I seek knowledge and to help others whom seek it
In fact, I have had this problem before. It may not be your code at all. Load the model up in a program (I recommend blender) and use it to see what the normals are. The model may have some reversed, so the winding comes out reversed in the file, and when you calculate from that you get some normals wrong because of it. You may not have noticed the problem if your program has a setting to draw double-sided, because with that on it wouldn't matter. I get this problem when I mirror half a model to speed up modelling: the normals get reversed and you have to recalculate them outside. This may not be the problem, but give it a whirl and see.


Thanks for the replies. :)

@kburkhart84: I tried it (blender), and the models seem fine. (I'm not sure, but I think blender ignores the normals on import anyway.)

@AlgorithmX2: I think I'll end up having to check the winding myself somehow, unless I have made a silly mistake somewhere, which is more than possible.

I think the per vertex calculation is right. What I do is create a global normal vector matching the one for the vertices, and add the calculated face normal to the normals at the positions corresponding to the vertices, so that when I normalise them later, they're effectively averaged. At least, that's what should be happening. :)

I did as you suggested in creating a function to draw the normals:
void cObjModel::drawNormals() {
	glDisable(GL_LIGHTING);
	glBegin(GL_LINES);
	glColor4f(0.8f, 0.8f, 1.f, 1.f);
	for (vector<cObject>::iterator i = object.begin(); i != object.end(); ++i) {
		for (vector<cObjFace>::iterator j = i->face.begin(); j != i->face.end(); ++j) {
			vector<unsigned int>::size_type k = 0;
			while (k < j->vertex.size()) {
				glVertex3f(vertex[j->vertex[k]].x,
					vertex[j->vertex[k]].y,
					vertex[j->vertex[k]].z);
				glVertex3f(vertex[j->vertex[k]].x + normal[j->normal[k]].x,
					vertex[j->vertex[k]].y + normal[j->normal[k]].y,
					vertex[j->vertex[k]].z + normal[j->normal[k]].z);
				++k;
			}
		}
	}
	glEnd();
	glEnable(GL_LIGHTING);
}


And here are the screens it produces with a standard cube (blended badly, I'll sort that later), using normals exported from blender rather than calculated by me:


Normals calculated per face:


Normals calculated per vertex:


This apparently shows that the y axis is inverted. I'll check through and see if I can change anything to solve that, but all I can guess at is that the winding isn't consistent.

Edit: I don't seem to be able to do anything to make the normals work. I can change it so the x normals appear wrong instead, but that seems to be it, so I'll look into finding the winding direction.

[Edited by - sprite_hound on March 8, 2007 12:40:37 PM]
I dunno if this will help you, but this is my vertex normal code from my obj loader:

void Mesh::GenerateVertexNormals() {
    assert(m_FaceNormals.size() == m_NumFaces);
    vector<NHuint> temp; // temp store for triangle indices
    temp.reserve(3);

    m_VertexNormals.resize(m_Vertices.size());
    for (int i = 0; i < m_NumFaces; ++i) {
        Vec3 n = m_FaceNormals[i]; // Get this face's normal
        GetTriangleIndices(i, temp);
        for (int j = 0; j < 3; ++j) { // Go through the 3 indices adding the normal for this face
            m_VertexNormals[temp[j]] += n;
        }
    }
    // Finally normalize
    for (vector<Vec3>::iterator it = m_VertexNormals.begin(); it != m_VertexNormals.end(); ++it) {
        (*it).Normalize();
    }
}
Member of the NeHe team.
Hmmm, maybe it's this...

cObjPoint3d v1 = vert1 - vert2;
cObjPoint3d v2 = vert2 - vert3;

shouldn't this be
cObjPoint3d v1 = vert1 - vert3;
cObjPoint3d v2 = vert2 - vert3;

so that both directions originate from the same reference?
Just an idea...

I seek knowledge and to help others whom seek it
Quote:Original post by Kazade
I dunno if this will help you, but this is my vertex normal code from my obj loader:

*** Source Snippet Removed ***


Thanks for the post. It makes me think I'm doing my per-vertex right. The problem is in the face normal calculation, which then carries through and makes the vertex averages wrong.

Quote:Original post by AlgorithmX2
Hmmm, maybe its this...

cObjPoint3d v1 = vert1 - vert2;
cObjPoint3d v2 = vert2 - vert3;

shouldn't this be
cObjPoint3d v1 = vert1 - vert3;
cObjPoint3d v2 = vert2 - vert3;

so that both directions originate from the same reference?
Just an idea...


Nope, unfortunately that just changes it from the y norms being flipped to the x instead. Thanks anyway.
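A side note on the edge-vector suggestion: with the conventional cross product the two choices are actually interchangeable, because a − b = (a − c) − (b − c) and the cross product is linear in each argument, so (a−b)×(b−c) = (a−c)×(b−c). A quick standalone numeric check (illustrative types, not the thread's classes):

```cpp
#include <cmath>

// Standalone check, not the thread's classes: with the conventional cross
// product, both edge choices give the same face normal, because
// (a-b) = (a-c) - (b-c) and (b-c) x (b-c) = 0.
struct V3 { double x, y, z; };

V3 sub(V3 a, V3 b) { return V3{a.x - b.x, a.y - b.y, a.z - b.z}; }

V3 cross(V3 a, V3 b) {
    return V3{a.y*b.z - a.z*b.y,
              a.z*b.x - a.x*b.z,   // textbook y component: z1*x2 - x1*z2
              a.x*b.y - a.y*b.x};
}
```

For any three points a, b, c, cross(sub(a,b), sub(b,c)) and cross(sub(a,c), sub(b,c)) come out identical, component for component.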

I've been wondering why my calls to glLightModelf(GL_LIGHT_MODEL_TWO_SIDE, GL_TRUE) weren't doing anything. It appears that for the backwards faces the diffuse lighting is ignored and only the ambient values are used (which actually makes sense, though it wasn't obvious from any docs). Since the default blender material exports with a black ambient, that explains why I couldn't see those faces.

So... it looks like I can work around the problem by just setting the ambient material property to the same as the diffuse property. Obviously this doesn't solve it, but it'll do for me for the moment.
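That workaround might look something like this (a sketch with assumed material values, not the actual code from this loader):

```cpp
// Assumed workaround sketch: make the ambient material match the diffuse so
// that back-facing polygons (which here only pick up ambient) stay visible.
GLfloat diffuse[] = { 0.8f, 0.8f, 0.8f, 1.0f };
glMaterialfv(GL_FRONT_AND_BACK, GL_DIFFUSE, diffuse);
glMaterialfv(GL_FRONT_AND_BACK, GL_AMBIENT, diffuse);
```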

I feel I might as well also post my thoughts on how to get normals facing the right way though:

Since any slightly more complex model could well be "hollow" (e.g. a bowl shape), it's not possible to just check whether the face normal points towards or away from the center.

So long as the model is a "closed" shape, however, it's possible to find out whether any given point is inside it or outside it. AFAIK this can be done either by using the winding value for the point (0 degrees for outside, 360 degrees for inside), or by shooting a ray in any direction and counting the number of faces it passes through: an odd count means the point is inside, an even count outside. (Assuming "farther away" is outside, which methinks it should be for any sane model.) Winding is probably the easiest of these options. (?)

Anyhoo, this means it shouldn't be too hard to take a point a very, very small distance from the center of a face in the direction of its normal, and work out whether that point is inside or outside the model. If it's inside, then the normal is the wrong way round and needs to be flipped.
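The ray-counting idea can be sketched with a standard Möller–Trumbore ray–triangle test. This is a standalone sketch, not the loader's classes, and rays that graze an edge or vertex would need extra care in real code:

```cpp
#include <array>
#include <cmath>
#include <vector>

// Illustrative types, not the thread's classes.
struct P3 { double x, y, z; };
P3 sub3(P3 a, P3 b) { return P3{a.x - b.x, a.y - b.y, a.z - b.z}; }
P3 cross3(P3 a, P3 b) {
    return P3{a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x};
}
double dot3(P3 a, P3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Moller-Trumbore: does the ray (orig, dir) hit the triangle at some t > 0?
bool rayHitsTri(P3 orig, P3 dir, P3 p0, P3 p1, P3 p2) {
    const double EPS = 1e-9;
    P3 e1 = sub3(p1, p0), e2 = sub3(p2, p0);
    P3 h = cross3(dir, e2);
    double det = dot3(e1, h);
    if (std::fabs(det) < EPS) return false;   // ray parallel to the triangle
    double f = 1.0 / det;
    P3 s = sub3(orig, p0);
    double u = f * dot3(s, h);
    if (u < 0.0 || u > 1.0) return false;
    P3 q = cross3(s, e1);
    double v = f * dot3(dir, q);
    if (v < 0.0 || u + v > 1.0) return false;
    return f * dot3(e2, q) > EPS;             // intersection in front of the origin
}

// Odd number of crossings along any ray => the point is inside the closed mesh.
bool insideMesh(P3 p, const std::vector<std::array<P3, 3>>& tris) {
    P3 dir{0.577, 0.717, 0.391};   // arbitrary direction, chosen to avoid grazing edges
    int hits = 0;
    for (std::size_t i = 0; i < tris.size(); ++i)
        if (rayHitsTri(p, dir, tris[i][0], tris[i][1], tris[i][2])) ++hits;
    return hits % 2 == 1;
}
```

To flip bad normals, offset a point slightly along each face normal from the face center and test it: if it lands inside, reverse that normal.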

Not sure I can be bothered to implement that right now, but it's good to know that it's possible. :)
Hmm, just curious, but why do you need to calculate your own normals? You've shown that you can simply use the ones exported from blender if you wish, so why generate new ones?

I seek knowledge and to help others whom seek it
Well, it was more of an exercise than anything else. It's also a fallback, since when I was exporting .obj models the blender default seemed to be not to export normal values, and I kept forgetting to toggle the right option.

It's also been pretty interesting. I now know how normals and vector cross products work, how smoothing can be achieved by averaging the normals at the vertices, and that it's possible to change the smoothing for a model based on a threshold angle between the face normals at a vertex.
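The threshold-angle smoothing mentioned above boils down to comparing adjacent face normals: only average two faces' contributions at a shared vertex when the angle between them is under the limit. A minimal predicate for that test (a hypothetical helper, not code from the thread):

```cpp
#include <cmath>

struct FN { float x, y, z; };

// Hypothetical helper, not from the thread: decide whether two *unit* face
// normals are within maxAngleDeg of each other. If so, their contributions
// can be averaged at the shared vertex; if not, the edge stays a hard crease.
bool shouldSmooth(FN n1, FN n2, float maxAngleDeg) {
    // For unit vectors the dot product is the cosine of the angle between them.
    float d = n1.x*n2.x + n1.y*n2.y + n1.z*n2.z;
    return d >= std::cos(maxAngleDeg * 3.14159265f / 180.f);
}
```

For example, the 90-degree edges of a cube stay sharp under an 89-degree threshold but get smoothed under a 91-degree one.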

So if I ever need to do anything more advanced I'll hopefully be on my way to understanding how. :)

Thanks once again for all the help.

This topic is closed to new replies.
