Vertex morphing, normals


Kest    547
I'm trying to implement some static mesh morphing. A single mesh can have any number of morphs applied, where each morph has a 0-to-1 factor value. To start with, the morphs are meshes that have exactly the same vertex indices as the original mesh. My plan is to compare the morph mesh vertices to the original mesh vertices and record the difference for each vertex (offset = morph - original). That offset would be the only data needed to reposition the vertex. So to apply a morph, each vertex position is simply:

new = original;
new += morph1_offset * factor1;
new += morph2_offset * factor2;
etc..

I'm pretty sure this will all work well. But I'm getting slowed down by the vertex normals. They need to be adjusted to show the morphs as well, and I'm not sure how to go about adding them in. Are there any efficient methods to achieve this? It could just be a simple matter that I'm totally missing. I have the original meshes for the morphs, so I can pre-process any amount of data, any way I like.

One of my first ideas was simply to use a portion of the morph normals:

new_normal = original_normal;
new_normal = new_normal*(1.0 - factor1) + morph1_normal*factor1;
new_normal = new_normal*(1.0 - factor2) + morph2_normal*factor2;
etc..
new_normal.Normalize();

But I'm not sure about the results. Are there any accurate ways to quickly compute the normal in this manner? Any help is much appreciated!
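
For concreteness, here's a minimal sketch of the delta pre-processing and weighted application described above. The Vector3 type, operators, and names here are just illustrative helpers (not from any particular engine), so treat it as a sketch of the idea rather than a definitive implementation:

#include <vector>

// Minimal illustrative vector type.
struct Vector3 { float x, y, z; };

Vector3 operator-(const Vector3& a, const Vector3& b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
Vector3 operator+(const Vector3& a, const Vector3& b) { return { a.x + b.x, a.y + b.y, a.z + b.z }; }
Vector3 operator*(const Vector3& a, float s)          { return { a.x * s, a.y * s, a.z * s }; }

struct MorphTarget
{
    std::vector<Vector3> positionDeltas; // offset = morph - original, one per vertex
    float factor = 0.0f;                 // 0..1 weight
};

// Pre-process: record per-vertex offsets for one morph mesh.
void BuildDeltas(const std::vector<Vector3>& original,
                 const std::vector<Vector3>& morph,
                 MorphTarget& out)
{
    out.positionDeltas.resize(original.size());
    for (size_t i = 0; i < original.size(); ++i)
        out.positionDeltas[i] = morph[i] - original[i];
}

// Per update: new = original + sum(offset[n] * factor[n]).
void ApplyMorphs(const std::vector<Vector3>& original,
                 const std::vector<MorphTarget>& morphs,
                 std::vector<Vector3>& out)
{
    out = original;
    for (const MorphTarget& m : morphs)
        for (size_t i = 0; i < out.size(); ++i)
            out[i] = out[i] + m.positionDeltas[i] * m.factor;
}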

hellknows2008    164
There probably isn't a good real-time solution.

I have the feature you describe implemented in my engine. It's basically what 3ds Max calls the Morpher modifier; if you have access to 3ds Max, you can read the Morpher modifier source code. It stores position deltas but not normals. Normal calculation in 3ds Max depends on the smoothing group information, so unless you also have that smoothing group data, whatever method you use to calculate the normals may not give the result you want. My approach is the same as Horde3D's: normal deltas are also stored, and they are treated just like positions. My expected usage is that as few morphs as possible are active at any time and the deltas aren't very large (otherwise you should look at other solutions, like skinning or plain node transformation). In practice you usually can't notice the inconsistency in the normals.
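
As a rough sketch of that idea, assuming the same Vector3/MorphTarget layout sketched earlier in the thread, extended with a per-vertex normalDeltas array (normalDelta = morphNormal - originalNormal):

#include <cmath>

// Apply stored normal deltas exactly like position deltas, then renormalize.
// Assumes MorphTarget also carries std::vector<Vector3> normalDeltas.
void ApplyNormalDeltas(const std::vector<Vector3>& originalNormals,
                       const std::vector<MorphTarget>& morphs,
                       std::vector<Vector3>& outNormals)
{
    outNormals = originalNormals;
    for (const MorphTarget& m : morphs)
        for (size_t i = 0; i < outNormals.size(); ++i)
            outNormals[i] = outNormals[i] + m.normalDeltas[i] * m.factor;

    // Renormalize; with small deltas and few active morphs the error stays small.
    for (Vector3& n : outNormals)
    {
        float len = std::sqrt(n.x * n.x + n.y * n.y + n.z * n.z);
        if (len > 1e-6f)
        {
            n.x /= len; n.y /= len; n.z /= len;
        }
    }
}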

In closing, morphing plus skinning can be very powerful! In 3ds Max, you first add a Morpher modifier and then add a Skin modifier. However, my export doesn't deal with the T-pose/bind pose, so my extraction of each morph (channel bank in 3ds Max, IIRC) is a bit complicated. I've found, though, that ignoring the T-pose/bind pose produces exports that are very compatible.

Kest    547
Hey, thanks for your perspective on these things. I've also been messing around with the Morpher modifier in Max. Actually, that's what I've been using to manage my base meshes and morph meshes. But I still need to set each morph to 100% and export them, one at a time, as static meshes. I plan to do the conversion to morph targets when they're loaded into the game engine.

So what do you think would be the best approach to morphing the normals most accurately? For example, here are some concepts I have in mind:

new_normal = original_normal;

for each morph:
new_normal = new_normal*(1.0 - factor[n]) + morph_normal[n]*factor[n];

new_normal.Normalize();

..or..

new_normal = original_normal;

for each morph:
new_normal += morph_normal[n] * factor[n];

new_normal.Normalize();

..or..

new_normal = original_normal;

for each morph:
new_normal += (morph_normal[n] - new_normal) * factor[n];

new_normal.Normalize();

..or..

new_normal = original_normal;

for each morph:
new_normal += (morph_normal[n] - original_normal) * factor[n];

new_normal.Normalize();


I'm not all that confident in any of them, but I suppose I can test them all out. Still, any information or advice is appreciated.
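
In case it helps with testing, here are the four variants written out as small functions, reusing the hypothetical Vector3 type and operators from the earlier sketch plus a Normalize helper (purely illustrative, assuming a raw-array interface for brevity):

#include <cmath>

Vector3 Normalize(Vector3 v)
{
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return (len > 1e-6f) ? Vector3{ v.x / len, v.y / len, v.z / len } : v;
}

// 1) Repeated lerp toward each morph normal (result depends on morph order).
Vector3 BlendLerp(Vector3 n, const Vector3* morphNormals, const float* factors, int count)
{
    for (int i = 0; i < count; ++i)
        n = n * (1.0f - factors[i]) + morphNormals[i] * factors[i];
    return Normalize(n);
}

// 2) Weighted sum of the full morph normals added to the original.
Vector3 BlendAdd(Vector3 n, const Vector3* morphNormals, const float* factors, int count)
{
    for (int i = 0; i < count; ++i)
        n = n + morphNormals[i] * factors[i];
    return Normalize(n);
}

// 3) Move the running normal toward each morph normal (also order-dependent).
Vector3 BlendToward(Vector3 n, const Vector3* morphNormals, const float* factors, int count)
{
    for (int i = 0; i < count; ++i)
        n = n + (morphNormals[i] - n) * factors[i];
    return Normalize(n);
}

// 4) Accumulate deltas from the original normal (order-independent; equivalent
//    to storing per-morph normal deltas).
Vector3 BlendDeltas(Vector3 original, const Vector3* morphNormals, const float* factors, int count)
{
    Vector3 n = original;
    for (int i = 0; i < count; ++i)
        n = n + (morphNormals[i] - original) * factors[i];
    return Normalize(n);
}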

hellknows2008    164
I think my export process is the same as yours. Specifically, if a Morpher is detected, my exporter creates a morpher controller for the mesh node, and the data it stores are the base mesh, each morph mesh, and the weight animation. When exporting the base mesh, the exporter sets all channel weights to zero; exporting each morph channel works the same way as yours: set the weight of that channel to 100% and all others to zero, then subtract the base values to get the deltas.
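
In code terms, the extraction itself boils down to a per-vertex subtraction. A minimal sketch, reusing the Vector3/MorphTarget layout assumed earlier in the thread (with the normalDeltas field added); how the base and per-channel meshes are actually evaluated depends on your exporter and isn't shown here:

// base*   = mesh evaluated with all channel weights at zero
// channel* = mesh evaluated with one channel at 100% and all others at zero
MorphTarget ExtractChannelDeltas(const std::vector<Vector3>& basePositions,
                                 const std::vector<Vector3>& baseNormals,
                                 const std::vector<Vector3>& channelPositions,
                                 const std::vector<Vector3>& channelNormals)
{
    MorphTarget target;
    target.positionDeltas.resize(basePositions.size());
    target.normalDeltas.resize(baseNormals.size());

    for (size_t i = 0; i < basePositions.size(); ++i)
    {
        target.positionDeltas[i] = channelPositions[i] - basePositions[i];
        target.normalDeltas[i]   = channelNormals[i]   - baseNormals[i];
    }
    return target;
}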

You've come up with quite a few methods for calculating the normals. I also have no idea which one is better. Maybe I can try your methods in my engine, but I doubt I could judge which result is best.

As for the most correct approach, I think it's to convert them into morph targets, as you stated. I don't have such an implementation yet, but now that you've brought it up, I guess someday I'll revisit the problem, implement the conversion, and make another controller. :D

andyk71    122
Quote:
Original post by Kest
Are there any accurate ways to quickly compute the normal in this manner?

Hi,
I have exactly the same problem as you.
But I have already created a program that automatically generates a file with the difference data between the original and the morph object. It exports the index, x, y, z, nx, ny, nz, tu, tv.
If I then only use morphs that don't intersect, everything looks great. I calculate the normals (nx, ny, and nz) in the same way as the positions (x, y, z). That works.

But now I have the problem that when I want to intersect some morphs, I have to recalculate the normals, because the surrounding geometry has changed and so have the normals.
So, if you want, I can give you the VB code for the generator.
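
One possible way to do that recalculation (not necessarily what my VB generator does) is to rebuild the vertex normals from the morphed positions whenever the weights change, by accumulating cross-product face normals and renormalizing. A minimal sketch, assuming a triangle index list and the Vector3 helpers from earlier in the thread, and ignoring smoothing groups/hard edges:

// Rebuild vertex normals from morphed positions: accumulate per-triangle
// cross products (length proportional to area), then normalize per vertex.
void RecomputeNormals(const std::vector<Vector3>& positions,
                      const std::vector<unsigned>& indices,   // 3 per triangle
                      std::vector<Vector3>& outNormals)
{
    outNormals.assign(positions.size(), Vector3{ 0.0f, 0.0f, 0.0f });

    for (size_t t = 0; t + 2 < indices.size(); t += 3)
    {
        const Vector3& a = positions[indices[t + 0]];
        const Vector3& b = positions[indices[t + 1]];
        const Vector3& c = positions[indices[t + 2]];

        Vector3 e1 = b - a;
        Vector3 e2 = c - a;
        Vector3 faceNormal{ e1.y * e2.z - e1.z * e2.y,
                            e1.z * e2.x - e1.x * e2.z,
                            e1.x * e2.y - e1.y * e2.x };

        for (int k = 0; k < 3; ++k)
            outNormals[indices[t + k]] = outNormals[indices[t + k]] + faceNormal;
    }

    for (Vector3& n : outNormals)
        n = Normalize(n);
}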


