# How much can normal averaging do for you?


## Recommended Posts

Hi!

Quickie, chaps/chapesses: I'm rendering a low-poly humanoid of about 4000 faces. Great, fast, etc., but obviously there's the odd rather *sharp* edge here and there, and my art skills aren't up to much, so I won't be able to hide them easily with clever texturing.

So I looked up normal averaging. Looks good. Got an open source package that can supposedly do it, but no doubt I'll end up doing most of it myself :oP

So just how good can these things be? Can they rescue the nasty edges on a 4000-face model? I can't export any normal averaging from my 3D tool as I'm working at the brass-tacks level here: just four arrays containing vertex, index, normal and texture detail. That's it. I'm hoping a normal averager will just re-calc the normals in my normal array. Hopefully that'll be it.

Anyways! Thought I'd ask. Cheers ;o)

##### Share on other sites
Quote:
 Original post by adder_noir
 Got an open source package that can supposedly do it but no doubt I'll end up doing most of it myself :oP

 So just how good can these things be? Can they rescue the nasty edges on a 4000-face model? I can't export any normal averaging from my 3D tool as I'm working at the brass-tacks level here.

Now, you're talking about generating the per-vertex normals from just the vertex geometry data? This is something you usually want to avoid unless you have no better alternative, since regenerating the normals from scratch throws away information the modeller generated. Artists generally love to hate it whenever this is done (almost as much as they whine about DXT compression ;), and my recommendation is to avoid it.

Unless it takes you less than half an hour to try this out in code, I recommend just stuffing your data into a modelling package and inspecting it there visually.

For example in 3DSMax, smooth/hard edges can be controlled by what's called "smoothing groups" - the triangles of the model are categorized into different groups (artist-driven, manually tunable) and the adjacent triangles in the same group are taken to share a smooth edge, and adjacent triangles in different groups will share a hard edge.

Now, for generating these yourself, there's the naive method of generating normals as if every edge were a smooth edge. For low-poly models this does not work too well, since there are lots of sharp corners; whenever you have a sharp edge in the geometry, having a smooth edge in the normals will look slightly odd.

The next method is to use some heuristic, e.g. an angle threshold, to automatically generate smoothing groups and to distinguish hard edges from smooth edges. But since the distinction of a smooth or a hard edge is something that depends on the model and the artistic eye, and especially for low-poly models the edge angles are quite large, finding a suitable angle threshold can be difficult.
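The threshold test itself is tiny. Here's a sketch (the function name and the 30-degree default are just placeholders, and the inputs are assumed to be unit-length face normals):

```python
# A sketch of the angle-threshold heuristic: two adjacent faces share a
# smooth edge only if the angle between their unit face normals falls
# below the threshold.
import math

def is_smooth_edge(n1, n2, threshold_degrees=30.0):
    # dot product of two unit normals = cosine of the angle between them
    dot = n1[0] * n2[0] + n1[1] * n2[1] + n1[2] * n2[2]
    dot = max(-1.0, min(1.0, dot))  # clamp against floating-point drift
    return math.degrees(math.acos(dot)) < threshold_degrees
```

For a low-poly model, sweeping threshold_degrees up and down by hand is exactly the fiddly tuning described above.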

So, if you just have a few triangles whose normals are a bit wonky, I'd say the best approach is to load the model up in Blender and adjust those few normals manually until they look right.

##### Share on other sites
Well, that sounds good to me! So if I get this right: can I just use a built-in Blender tool to adjust the normals for me, and they will automatically export when I run my script that exports the mesh data (normals, indices, etc.)?

Basically meaning I don't have to do anything after the export - except accept there'll be a few dodgy edges ;o)

##### Share on other sites
Hmmm... not so easy. Blender won't export the smoothed mesh, even when setting the smooth flag to true for each face in the export script, and even after having saved the file beforehand.

So it looks like it's back to muggins here to deal with it. I've had a read through MWE and I reckon that's good enough for me. The only issue is that up to now I've only seen folks use this on a quad-style mesh; my humanoid model is all triangles, as that's how I've set up my code to run. Great and all, but it means the tutorials don't cover everything I need.

I'm beginning to suspect I can see what a vertex duplication list is for: it lets you note how many times the same vertex position is referenced in the index list, even though each index number is individual. That means you can, I guess, build a list of all the triangles associated with any particular vertex.

What I must ask is: if I simply go through all my normals and turn them into unit vectors, will that help? It would be a lot easier than opening up a whole new project to deal with smoothing. I suspect there's no way round it, though. I'll just have to do it.

I'm thinking of going through my original non-exclusive index list array and marking how often each number appears and where in the index array. Might be a good place to start. Probably best to start with a simple model too, like a cube or a chess piece. Anyways... here goes another huge journey ;oP

##### Share on other sites
In every project I've worked on, smoothing of normals has been done in the art tool (like blender), so the artists have complete control over which edges are hard ones and which ones are soft.

What kind of 'export script' are you using?

##### Share on other sites
A custom one. Probably not so hot, perhaps. I did try everything I could, though; even setting smooth to 1 did not help.

From what you say, it sounds as though even if you can post-process the normals, it might screw up the artwork = bad!

It seems I might want to look at writing or finding a better export script for Blender. What do you think? If not, I can always do it manually. I'm working on it right now - hopefully, if it comes to that, it won't wreck any artwork. I hope :oP

**Edit**

Interesting. I went into Python in Blender and ran an iterator over every face in the mesh setting smooth = 1. NO EFFECT!

Hmm... it is this exact operation that is replicated in my export script. Seems this is not enough to get Blender to smooth normals at all. I need to find out how it's really done so I can add that function to my script, or just find another way of doing it.

##### Share on other sites
Well, I've now written code that finds where a face index exists in the old index array, searches through to find all the faces it appears in, and then retrieves the normals from every one of those faces ready for averaging.

Maybe it's not so simple, though. Perhaps I can't just average a load of normals without knowing dimensionally where each vertex the indices point to sits in relation to the one being studied. No idea, to be honest. Almost fun to do though (not really), so I'll continue. Here's the code:

```c
int i = 0, j = 0, k = 0, l = 0;
int tempval = 0, tempval2 = 0;

for (i = 0; i < 1; i++) // start with the first number in the old array
{
    tempval = oldIndexArray[i]; // a temp variable to hold the number

    for (j = 0, k = 0, l = 0; j < INDEXARRAYSIZE; j++)
    {
        if (tempval == oldIndexArray[j]) // plain comparison - the loop condition already bounds j
        {
            tempval2 = j / 3; // face no. that '4' e.g. is in

            // go to that same face no. in the old normal array and copy the data across
            normalAverageTemp[l]     = normalArray[(tempval2 * 3)];
            normalAverageTemp[l + 1] = normalArray[(tempval2 * 3) + 1];
            normalAverageTemp[l + 2] = normalArray[(tempval2 * 3) + 2];
            l += 3;
            k = l;
        }
    }

    // PROPOSED FUNCTION CALL TO AVERAGE THE NORMALS GOES HERE - check!!
    // send k into the call so it knows how big an array to process ;o)
    average_normals(k);
}
```

That's it. It means I now have an array called normalAverageTemp for each pass through the face index array. This is a global array that the average_normals function works on. At the moment average_normals just displays stuff; I need the actual computation algorithm in there next *sigh*. The idea is that k is made equal to l and passed to average_normals so it knows how far into the array to go before stopping. I couldn't make the array dynamically sizeable, obviously, so it just has to be big enough to hold the maximum number of values likely to be in it.

But that obviously introduces the need to tell average_normals when to stop, hence it gets k passed to it. One last thing: the function will need to clear out the normalAverageTemp array for the next pass. Anyways, it's all teaching me code anyway, so... it's ok ;o)
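For what it's worth, the averaging step itself can be sketched like this (Python rather than the C above, purely for illustration; the name mirrors the one in the post, but the body is my own guess at what average_normals should do):

```python
# A hypothetical average_normals: sum the k floats collected in the
# scratch array (three per face normal), then normalize the sum into a
# single unit vector. scratch is a flat list: x0, y0, z0, x1, y1, z1, ...
def average_normals(scratch, k):
    sx = sum(scratch[i]     for i in range(0, k, 3))
    sy = sum(scratch[i + 1] for i in range(0, k, 3))
    sz = sum(scratch[i + 2] for i in range(0, k, 3))
    length = (sx * sx + sy * sy + sz * sz) ** 0.5
    if length == 0.0:
        return (0.0, 0.0, 0.0)  # opposing normals cancelled out entirely
    return (sx / length, sy / length, sz / length)
```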

Anyone with a better suggestion, please do step forward, as you'll save me a lot of work - which I'm trying to demonstrate ;o)

##### Share on other sites
Quote:
 Original post by adder_noir
 Hmmm... not so easy. Blender won't export the smoothed mesh even when setting the smooth flag to true for each face in the export script, and after having saved the file beforehand too.

Mind if I ask, can you post a screenshot of the problem? Does Blender show the model correctly? Can you post a screenshot of the proper appearance? I heartily recommend that you switch exporters or work to fix your exporter to get the proper data out.

Quote:
 Original post by adder_noir
 What I must ask is, if I simply go through all my normals and turn them into unit vectors, will that help?

Every modelling tool should by default produce unitary normals. Writing your own tool code that goes through your exported geometry and renormalizes all normals (and orthonormalizes your per-vertex tangent frames if you have any) is a nice sanity check, but shouldn't be required and shouldn't have an effect here - the normals should already be unitary.
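As a concrete illustration, a sanity-check pass of that kind could look like this (a sketch only; the names and tolerance are arbitrary, and normals are assumed to come in as a flat list of floats, three per normal):

```python
# Renormalize exported normals and count the ones that were not already
# unit length - useful as an exporter sanity check, not as a fix.
def renormalize(normals, tolerance=1e-3):
    fixed = []
    suspicious = 0  # normals whose length deviated noticeably from 1.0
    for i in range(0, len(normals), 3):
        x, y, z = normals[i:i + 3]
        length = (x * x + y * y + z * z) ** 0.5
        if abs(length - 1.0) > tolerance:
            suspicious += 1
        if length > 0.0:
            fixed.extend((x / length, y / length, z / length))
        else:
            fixed.extend((0.0, 0.0, 0.0))  # degenerate; nothing to recover
    return fixed, suspicious
```

If suspicious comes back non-zero, the exporter (not the normals) is the thing to go fix.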

If you really want to go the route, this can be a very interesting read: S. Jin, R. Lewis and D. West. A Comparison of Algorithms for Vertex Normal Computation.

I'm not sure about your code, and what you refer to as the "old index array", or "vertex duplication list". When you regenerate geometry normals without changing the existing hard/smooth edge distinctions, you do not need to re-create any index structures. For indexed geometry, the algorithm goes roughly like this:

```
input:
    indexArray:  a sequence of triplets of numbers indexing vertexArray,
                 defining a triangle list.
    vertexArray: the list of vertices to draw.

temporary work space:
    triNormals: the list of triangle normals.
    adjacent:   a list of lists giving adjacency information. adjacent[v] is
                a list that stores the indices of the triangles that v is a
                part of.

foreach(triangle t in indexArray)
{
    // compute the normal of the triangle t, taking direction from triangle
    // winding order
    triNormals[t] = cross product of two face edges;

    // mark which vertices this triangle is adjacent to. This step is so that
    // we can later answer queries like "given a vertex v, what are all the
    // triangles it is part of?"
    foreach(vertex v in t)
        adjacent[v].insert(t);
}

foreach(vertex v in vertexArray)
{
    v.normal = 0;
    foreach(triangle t in adjacent[v])
        v.normal += triNormals[t];
    normalize(v.normal);
}
```

Now, in the above we don't group together vertices which have the same position but different other attributes ("duplicated vertices"), which (depending on what kind of attributes you have) logically corresponds to preserving the hard/smooth edge distinction that was present in the original geometry. If you want to make all edges smooth, or to reconstruct the hard edge/smooth edge categorization by yourself, you should make adjacent a map of lists instead of a list of lists, where you use the vertex position as the key.
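To make the pseudo-code concrete, here is one possible self-contained Python sketch of the all-edges-smooth variant (plain tuples, no libraries; adjacent is keyed by vertex index, as in the pseudo-code):

```python
# Smooth vertex-normal generation for an indexed triangle list. vertices is
# a list of (x, y, z) tuples; index_array is a flat list of vertex indices,
# three per triangle.

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    length = (v[0] ** 2 + v[1] ** 2 + v[2] ** 2) ** 0.5
    if length == 0.0:
        return (0.0, 0.0, 0.0)  # degenerate; leave as zero
    return (v[0] / length, v[1] / length, v[2] / length)

def smooth_normals(vertices, index_array):
    tri_normals = []
    adjacent = [[] for _ in vertices]  # adjacent[v] -> triangles touching v

    for t in range(len(index_array) // 3):
        i0, i1, i2 = index_array[3 * t: 3 * t + 3]
        # face normal from the winding order. Left unnormalized on purpose:
        # that weights the average by triangle area, a common choice.
        n = cross(sub(vertices[i1], vertices[i0]),
                  sub(vertices[i2], vertices[i0]))
        tri_normals.append(n)
        for v in (i0, i1, i2):
            adjacent[v].append(t)

    normals = []
    for v in range(len(vertices)):
        acc = (0.0, 0.0, 0.0)
        for t in adjacent[v]:
            acc = (acc[0] + tri_normals[t][0],
                   acc[1] + tri_normals[t][1],
                   acc[2] + tri_normals[t][2])
        normals.append(normalize(acc))
    return normals
```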

Even though the code is simple, manually generating normals can be a sad and tedious process. It should only ever be done when you have full smoothing group information from the source object, when you procedurally generated the whole geometry in code (e.g. terrain), or when the model did not come with any normals to start with (ew!). I have to say again: can't really recommend this.

Here's a story why: (might go wonderfully off-topic in your case, but this is just to really explain why)

In the simplest of cases, imagine a cube. All of its edges are hard edges, meaning that if you take any corner, its vertex is specified three times in the buffer, once for each normal direction of the cube. Now, if you simply go and generate smooth-edged normals for the whole cube, each corner normal will end up pointing out of the cube in the diagonal direction, which is very much unintended.

Now, you're smart and add the edge-angle-threshold kind of check to generate your automatic smoothing group information, which removes the above problem with the algorithm. But soon you find that your code breaks when your artists use discontinuous UV mappings for some odd reason, and your vertex merge step just goes and fuses together everything that's close enough, losing the UV information.

So you quickly fix that issue and make discontinuous UVs (and other discontinuous vertex attributes) go into separate smoothing groups. But before the fix is even in, you realize that lots of automatic UV unwrappers do produce discontinuities, so you have to just duplicate the vertices but put them in the same smoothing group anyway. Fine.

Then you find that one of your artists just had to have the disco ball effect in the game (see the first page in this tutorial), and you realize you have no way of making the automatic edge-threshold detector pick that up; and since you now have the "discontinuous UV => not a smoothing group edge" feature in, the artist can't use discontinuous UVs to signal hard edges. So you're smart again and go "hey, let's make this an option in the exporter", and in the end you have a configurable threshold value in the exporter, or a boolean flag "[x] Hard edges on UV discontinuities", or, worse, both. But having a global option won't even let you combine both techniques, so you're limited there.

All of this manual tuning was supposed to make your pipeline more robust since, you know, automatically generating your content means more power and flexibility, right? But instead you now have two more need-to-remember flags, and next summer the new intern/junior artist comes to you complaining that the tools are broken and his character looks like it's from the 1990s or the SNES era. Good grief.

Instead, the best option is not to waste time on this (unless you have one of the cases explained above), and to spend your time implementing an exporter that properly reads the normal data as well. The normals are supposed to belong to the artists, and when something looks wrong in the game, it should be you, the programmer, going to the artists yelling "this looks wrong, fix it!" and not the other way around (a sign of a good tool is when you can always blame the artists ;).

Exporter stages like double-checking the unitarity of normals and consistent winding orders (and that normals match the winding order) are of course good for robustness. Also, make the tool yell at the artist whenever their data is degenerate in some way - that'll make them learn.

(No, seriously, I love our artists in house <3 ).

##### Share on other sites
Wow, that's a mother of a post. I'm only replying to let you know I've seen it and have the decency to say thank you! No doubt I'll have more questions once I've read through it all. Thanks so much. I can see you know your subject back to front and inside out, so I'm going to just do whatever you say, basically. I will probably finish this exercise purely for academic reasons.

Sorry for the bad explanation; I would really need to post a full diagnosis of what I'm doing for it to make sense. When I've read your post I'll reply properly. Thanks again ;o)

##### Share on other sites
Quote:
 Original post by clb
 (a sign of a good tool is when you can always blame the artists ;)

OT: Quote of the day =)

##### Share on other sites
Quote:
 Original post by adder_noir
 I will probably finish this exercise purely for academic reasons.

Doing this for learning is an excellent reason. If you don't feel time pressure to get this stuff done and to move on to other things, go for it!

I'm not saying the code you write will be useless the day after, in fact you'll find that when you need to write a geometry re-indexer or a vertex cache optimizer, you'll be able to reuse a lot of the constructs here. Adjacency information is required for a lot of geometry-related tasks as well, for example in navigation mesh pathfinding.

Quote:
Original post by teutonicus
Quote:
 Original post by clb
 (a sign of a good tool is when you can always blame the artists ;)

OT: Quote of the day =)

:D

##### Share on other sites
Hi!

Ok, thanks, I read that - very humorous in places!!!! Hahaha, you obviously have a lot of experience in the field ;oD

I'm still having hell getting Blender to export this stuff. It will show smooth in the program, but the script is powerless to do anything with it. Even the internal console in Blender doesn't shift it. It only responds to the setsmooth command in the environment menu, never directly from Python, and it also doesn't export what it shows in the environment window!!

Just one last question before I start using your code to get stuff happening here (good template, btw, thanks a bundle): how do I acquire the adjacency data? I'm au fait with the other stuff - I've written my own cross product functions before and such. Only the adjacency stuff bugs me. So far my export writes an indexed list (with indices duplicated many times over), a vertex list, and some crappy normals that make the model look blocky, like calculus going the wrong way.

Can I acquire everything I need from this? Sounds really good; this is what I'll be working on now, even if just for the hell of it! Thanks so much ;o)

##### Share on other sites
Quote:
 Original post by adder_noir
 Ok, thanks, I read that - very humorous in places!!!! Hahaha, you obviously have a lot of experience in the field ;oD

 I'm still having hell getting Blender to export this stuff. It will show smooth in the program, but the script is powerless to do anything with it. Even the internal console in Blender doesn't shift it. It only responds to the setsmooth command in the environment menu, never directly from Python, and it also doesn't export what it shows in the environment window!!

Thanks! If you can post a screenshot of the good and bad cases, I'm sure someone can see about the issue in more detail.

Quote:
 Original post by adder_noir
 Just one last question before I start using your code to get stuff happening here (good template, btw, thanks a bundle): how do I acquire the adjacency data? I'm au fait with the other stuff - I've written my own cross product functions before. Only the adjacency stuff bugs me. So far my export writes an indexed list (with indices duplicated many times over), a vertex list, and some crappy normals that make the model look blocky, like calculus going the wrong way.

 Can I acquire everything I need from this? Sounds really good; this is what I'll be working on now, even if just for the hell of it! Thanks so much ;o)

You're very welcome. Adjacency data comes in several flavors (people give different namings for it, but the conventions never seem to be the same), and it serves to be able to answer the following commonly asked questions:

1. Given a triangle, what are its neighboring triangles? (are there T-junctions for this triangle?)
2. Given a position of a vertex, which triangles have a corner at this vertex position?
3. Given a position of a vertex, which vertex positions are reachable from this vertex just by traversing a single edge in the mesh?
4. Given an edge (a pair of vertices), which triangles share their edges with this edge?
5. Given a vertex (by its index in the vertex buffer or by position), which other vertices (by their indices in the vertex buffer) lie at the same position?
6. etc, whatever you can think of.

You can think of this data as a lookup table to accelerate these kinds of queries. Technically you wouldn't need to precompute it - you could just derive the information on the fly by looping through the geometry - but generating the lookup table improves your performance by a factor of n (the number of triangles or vertices), or n² in some cases, i.e. it's the sane thing to do.

Now, in your case, you're doing queries of the form #2 and possibly #5. The first foreach loop in the pseudo-code in my above post shows how to compute #2. If you want to re-do the smooth/hard edge division, you'll need to recreate the per-vertex data and re-index the mesh, for which you'll need #5. This is often done by converting the indexed geometry to an unindexed triangle list, doing the computations, and then converting back to indexed geometry.

So, in effect, there's no real magic to generating adjacency data: you just loop through the primitives you're interested in (triangles, vertices) and store the interesting information for each of them in a lookup table. Then, when processing, you have that data in a neat table to help with the actual work.
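As a tiny sketch of query #2 with positions as keys (so duplicated vertices at the same position land in the same bucket - the map-of-lists idea from earlier; names are just illustrative):

```python
# Build a lookup table: vertex position -> indices of the triangles that
# touch that position. Positions are hashable (x, y, z) tuples, and
# index_array is a flat triangle list, three vertex indices per triangle.
from collections import defaultdict

def build_position_adjacency(vertices, index_array):
    adjacency = defaultdict(list)
    for t in range(len(index_array) // 3):
        for v in index_array[3 * t: 3 * t + 3]:
            adjacency[vertices[v]].append(t)
    return adjacency
```

Keying by exact position only merges bit-identical duplicates; real pipelines often snap or tolerance-compare positions first.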

##### Share on other sites
That's great! Thanks. I think, from what you write, I can acquire the different triangles a given vertex belongs to from the handy index list my export script puts out. I think I'm going to have to ditch the normals that come with it, though, and make my own from scratch. I've been trying some halfway-house method that didn't get much in the way of results, just some very odd behaviour.

Time to do it from the ground up, I think. I'm going to use your template and some well-planned code on a simple cube to try and get this to work. Seems all I have to do is acquire the normals with the cross product (I might be able to skip this bit using the normals output from Blender) and then average them.

I kind of tried this last run-through, so I'm not too sure how I'll fare better this time, but for damn sure I'm going to try it. I'll try and get a screenshot of the Blender 3D view and show it to you, but tbh there's not much to see - just a smoothed cube which renders solid in DirectX.

I've got something to do this morning; I'll be back onto this later today. Thanks ;o)

**Edit** Dumba** question time:

I don't have to do this on the fly, do I? I was under the impression I can just re-calc the fixed normals for the model once, and then when I load it, it will always look smooth. That's right, isn't it? The normals don't change at all at run-time, do they?

##### Share on other sites
Quote:
 Original post by adder_noir
 I don't have to do this on the fly, do I? I was under the impression I can just re-calc the fixed normals for the model and then load it and it will always look smooth. That's right, isn't it? The normals don't change at all at run-time, do they?

You're right. The normals in the input vertex data do not need to change unless you change the position data in the input vertex buffer. When you're rendering the object in the world, remember to transform the normals properly in the vertex shader (remember the inverse transpose if you use non-uniform scaling; otherwise, the usual object world transform matrix suffices for normals as well).
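A tiny numeric sketch of why the inverse transpose matters (hand-rolled 3x3 math, no libraries; the plane and the scale are made-up numbers): under a non-uniform scale, multiplying the normal by the model matrix skews it off the surface, while the inverse transpose keeps it perpendicular.

```python
# 3x3 matrices as tuples of rows; vectors as (x, y, z) tuples.
def transpose(m):
    return tuple(tuple(m[r][c] for r in range(3)) for c in range(3))

def mat_vec(m, v):
    return tuple(sum(m[r][c] * v[c] for c in range(3)) for r in range(3))

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Non-uniform scale: stretch x by 2. For a diagonal matrix, the inverse is
# just the reciprocal of each diagonal entry.
scale = ((2.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0))
inv_t = transpose(((0.5, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)))

normal = (1.0, 1.0, 0.0)    # normal of the plane x + y = 0
tangent = (1.0, -1.0, 0.0)  # a direction lying in that plane
scaled_tangent = mat_vec(scale, tangent)

wrong = mat_vec(scale, normal)  # skewed: no longer perpendicular to the surface
right = mat_vec(inv_t, normal)  # stays perpendicular to the scaled surface
```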

##### Share on other sites
http://img267.imageshack.us/img267/9581/horridpic.jpg

^^^ Have you ever seen anything so horrid? Nope, thought not :oD I'm probably gonna leave this one as I'm not getting on too well with it, but hey, it's been a great ride, and I wrote at least some code that did *something*. I'll either quest for a Blender fix or just live with blocky models and shape them better in Blender.

Thanks so much clb this was probably the most interesting thing I'll do for ages cheers ;o)

##### Share on other sites
Just one very last question: I assume (now, anyway) that every vertex normal here is unique? There are no vertex normals shared between vertices. For example, the results above come from trying to apply one averaged normal to all three normals in a face, thus causing utterly buggered results.

So, just to set me straight for my last bout of self-discovery: every vertex normal, even those within a given face, is different, right? There are no faces where the vertex normals are the same for all 3 verts that make up the face? Sorry to be such a pain in the a**, but, well, you got me interested ;o)

I won't ask anymore tho I promise - time for my own work or lack of it to commence ;oD

##### Share on other sites
I'm confused. Doesn't Blender actually store the normals for each model? Can't you just export them in your export script? Looking at the .obj or other bundled export scripts should help you find the right things to export.

Why not just hit the "smooth" button ("Links and Materials" tab under the editing panel (F9) in 2.49, left side panel in the 2.5 alpha)? Use it in object mode to smooth the whole model, or in edit mode with the verts/faces you specifically want smoothed selected, which will weld the relevant vertices. If you're trying to auto-smooth normals, Blender can do this too, with a set angle constraint.

Your model looks to be flat shaded in that picture, i.e. every vertex on a particular face has the same normal, and if a vertex is shared between two or more faces, the vertex will have a different normal for each face that shares it. It sounds like you want smooth vertex normals, where each vertex has only one normal, which is an average of the normals of each face to which the vertex is connected. Sorry if you already know some of this...

##### Share on other sites
Hi sprite.

That's exactly the problem. They show in Blender, but they won't export. The 3D view in Blender is also unresponsive to smoothing instructions given from the Python console. The only thing that makes it smooth is the Set Smooth menu button, but none of that normal data comes out with the model after export. Bugger :oP

##### Share on other sites
Weren't you using DirectX? You can just extend the vertex declaration to include a normal and call D3DXComputeNormals(), and it will do the right thing for you. I saw your screenshot; basically, that model has the wrong normals, and D3DXComputeNormals() will give you the right ones.

Since we have a normals expert on this thread, let me ask a question of my own: how do professionals handle the situation where a game level is composed of a small number of sub-meshes (each with correct, artist-generated normals), and these sub-meshes are then connected to each other at arbitrary translations and rotations to form, for instance, a maze?

For example, say there are 3 meshes: A long corridor, a T-Junction and a Room. These 3 meshes can be connected together via translations and rotations to form a very simple maze level.

How do most people handle the abnormal normals that will result on the polygons where the different pieces touch each other? Do most engines "weld" those vertices together? Doesn't this bust instancing?

##### Share on other sites
I think I'm going to have to bow out now and live with it. Blender is acting odd, and I'm too much of a noob to follow all this stuff, tbh. Thanks for the help though, everyone - much appreciated.

##### Share on other sites
Quote:
 Original post by adder_noir
 They will show in blender but they won't export.

Blender *will* export per-vertex normals! I assume you've written your own python exporter? You want to do something like this:
```python
normal1 = mesh.faces[this_face].v[0].no
normal2 = mesh.faces[this_face].v[1].no
normal3 = mesh.faces[this_face].v[2].no
```

##### Share on other sites
Hi,

Oh, I know - really, I'm a noob :o) Thanks for the tip, mate, I'll try it out ;o) Voted you up too - as I do everyone.

**Edit**

One thing mate, can I use that code for an iteration of kind:

```python
out.write('Normals:\n')
for face in mesh.faces:
    normal1 = face.v[0].no
    normal2 = face.v[1].no
    normal3 = face.v[2].no
    out.write('%f,%f,%f#\n' % (normal1.x, normal1.y, normal1.z))
    out.write('%f,%f,%f#\n' % (normal2.x, normal2.y, normal2.z))
    out.write('%f,%f,%f#\n' % (normal3.x, normal3.y, normal3.z))
```

This actually ran fine. Nice one, thanks. All I need now is to work out how to get Blender to export separate vertices per face, and I'll be away.

[Edited by - adder_noir on July 22, 2010 1:25:35 AM]

##### Share on other sites
Quote:
 Original post by adder_noir
 All I need now is to work out how to get Blender to export separate vertices per face and I'll be away.

You can get the vertex positions like so:
```python
p1 = mesh.faces[this_face].v[0].co
p2 = mesh.faces[this_face].v[1].co
p3 = mesh.faces[this_face].v[2].co
```

##### Share on other sites
Well, I really don't know what to say. Using a much rougher version of your very nice code above, I still managed to get this to work. I will for sure include your latest stuff in my export script. Thank you so much - the pic below would not have happened without the code you posted above. I don't know if there are words to describe how proud I feel about this. It's one of the greatest things I've ever done. Last night I was swearing off code for life, tearing up textbooks and so on. Now, finally, it's here. I have it, and thanks to you I have an even easier way to implement it now:

http://img19.imageshack.us/img19/6383/normalpiccopy.jpg

Thanks so much, this has truly made my day. Simple though it may look, bugger me if it isn't a normal-averaged cube, just as it looked in Blender. Thanks so much, mate ;)