Questions about complicated rig in Blender

Started by Gian-Reto. 15 comments, last by Scouting Ninja 7 years, 9 months ago.

I have just finished rigging my first character. After all these years, only now did I get the courage to touch this topic, which sounded quite complicated to me.

So it was quite a revelation to find out that rigging is nowhere near as bad as I had heard, and Blender actually proved to be much better for rigging than I feared. So far everything has been pretty smooth sailing, if you disregard the fact that I spent a lot of time learning the whole rigging process (and going back in to fix some stupid newbie mistakes).

This is where I am now:

[attachment=32202:character_rig.png]

Disregard the many bones for pouches and stuff... I will try to animate them from script, and besides struggling with Python drivers in Blender, I think I should be set with Unity's C# (whether transforming that many bones will be performant enough, IDK yet).

Also disregard my failure to give the control bones a different shape. Ugly rig, I know. I will rectify it later.

My main problem at the moment is how to parent the rifle to the hands, and how to make the character aim with it.

Because this might be important: I have IK chains for both arms, upper and lower arm. The shoulder is not included in the IK chain (seems to be better to position it by hand), the hands are not included. Besides IK chains for the legs, there are no other IK chains (yet).

1) How to parent Rifle to hands:

I have successfully parented the rifle to the character's right hand. Positioning the rifle in the hand and bending the fingers correctly was a slight pain, so I hope to find a solution that will save me the exact parenting and hand bone positions.

Anyway. I had some success creating a short idle animation with the rifle parented to the right hand, and the left hand parented to the rifle.

I was able to shift the position of the rifle slightly without the hand positions going haywire (though I might have had to slightly adjust the roll, pitch or yaw of the left hand, as only the arm (the IK-enabled lower arm bone) was parented to the rifle).

Now, I had to go in and fix some errors on the hand rigs. To do that, of course I had to bring the character into a neutral position. And for that, of course I had to disable the parenting of the left arm to the rifle.

After re-enabling the parenting, the arm had lost the exact position I had set it to, and I had to redo all the positioning.

Is there a better way to handle two-handed objects like that? How do I parent it correctly, so it could be left parented to either hand while the other hand does something different (cycling a round, reloading, grabbing a grenade from the belt, whatever)? How do I allow the rifle to not be parented to any hand (for example when the character throws it away for close combat), while being able to return to the parenting afterwards?

Is it a better idea to separate the rifle into its own mesh object (which might be needed, for example, if it should receive a rigidbody and use physics when being thrown away)?
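
For what it's worth, the usual Blender answer to the "swap parents without losing the pose" problem is a Child Of constraint per hand whose influence you animate, pressing Set Inverse when attaching. Under the hood that is just preserving the object's world matrix across the parent switch; a rough, engine-agnostic sketch in plain Python (helper names are mine, not Blender API):

```python
# Sketch: keep an object's world transform when switching parents.
# Transforms are rigid 4x4 matrices stored as nested lists.

def mat_mul(a, b):
    # Standard 4x4 matrix product.
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def mat_inv_rigid(m):
    # Inverse of a rotation+translation matrix: transpose R, negate R^T * t.
    r = [[m[j][i] for j in range(3)] for i in range(3)]
    t = [-sum(r[i][k] * m[k][3] for k in range(3)) for i in range(3)]
    return [r[0] + [t[0]], r[1] + [t[1]], r[2] + [t[2]], [0.0, 0.0, 0.0, 1.0]]

def reparent_keep_world(local, old_parent_world, new_parent_world):
    # New local transform such that the world transform is unchanged.
    world = mat_mul(old_parent_world, local)
    return mat_mul(mat_inv_rigid(new_parent_world), world)
```

Dropping the rifle is the same operation with the identity matrix as the "new parent"; grabbing it again is the reverse. A Child Of constraint with Set Inverse does essentially this bookkeeping for you, so toggling its influence should not make the arm jump the way re-parenting did.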

2) How to make the character aim from script:

I would like the character to be adjusting his aim without having to turn the character for aiming. I am not fond of the old "floating characters" where the dev saved on animation frames by simply turning the character mesh without any kind of animation. Also, I think it would look more realistic if the character wouldn't move his leg just for a slight adjustment in aim, but would turn his torso instead.

How can I achieve this without breaking the rig for use without holding the rifle? I have thought about an additional IK chain involving rifle, arms and torso, but given how floppy the IK chains react, IDK.

Would it be best to create an IK chain for the torso, have a control bone aligned with the rifle muzzle, and use that control bone to aim the rifle (turning the torso with it)? Or should I rather use a look-at bone constraint?
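
Either approach works; a look-at (Track To) constraint is essentially solving for this rotation every frame. If you end up doing it from script instead, the math is just two atan2 calls (a sketch; the axis convention, Z up with +Y as the bone's forward axis, is my assumption):

```python
import math

def aim_angles(eye, target):
    # Yaw (about Z) and pitch (about X), in radians, that point a +Y
    # forward axis from `eye` at `target`.
    dx, dy, dz = (target[i] - eye[i] for i in range(3))
    yaw = math.atan2(-dx, dy)
    pitch = math.atan2(dz, math.hypot(dx, dy))
    return yaw, pitch
```

Splitting the resulting yaw across several spine bones usually looks more natural than rotating a single torso bone by the full amount.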

I guess I can just turn its influence down to zero when not in use... is there a way to "group" multiple constraints and IK chains so I can deactivate/activate them at the flip of a (C#) switch in the game engine?

Thanks for any help... I am kinda at the limit of what I can achieve just with tutorial vids and experimenting (should I have gone with a simpler rig first? Hell yes! But then, just because you know something is stupid doesn't mean you can dodge that bullet :))

Gian-Reto


Normally the gun is a separate object to lower the poly count of the skinned mesh; the gun is parented to a palm bone, so there is no need for the gun to have a bone.

The pouches can be added as physics bodies.

Pose your character in an aiming pose, then correct the rotation of the gun and parent it; it should now be ready to aim.

If you plan on using Unity you should capture the animation and delete IK bones before export.

When I moved over to Blender I got myself a copy of the "Blender foundation book"; I think it is a bit dated, however it should still apply to most things in Blender. The Blender Master Class is more of a general 3D book than one focused on Blender.

To aim at runtime you tweak the rotation of the torso bone; you should also have a rotation animation that can be used when the target moves more than 90 degrees.
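
The torso-twist-plus-turn-animation rule above can be sketched as a tiny bit of logic; the 45-degree comfort limit here is an assumed tuning value, not anything Blender or Unity prescribes:

```python
import math

TORSO_LIMIT = math.radians(45)  # assumed maximum comfortable torso twist

def split_aim(aim_yaw):
    # Returns (torso_yaw, needs_turn_animation): twist the torso up to the
    # limit, and past it request a turn-in-place animation for the legs.
    if abs(aim_yaw) <= TORSO_LIMIT:
        return aim_yaw, False
    return math.copysign(TORSO_LIMIT, aim_yaw), True
```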

Exporting the animations as NLA animations will allow blending in engines like Unity and Unreal when exported as .fbx.

I specifically left the gun as part of the skinned mesh because I wanted to keep the draw call count to a minimum. Checking the model again in Blender, I see that I would have to combine the meshes first in Blender anyway to achieve that goal... silly me :) ...

That begs the questions:

a) Can a part of a skinned mesh be hidden (I guess not, if it's one combined mesh with one renderer)?

b) Even if I don't need to hide it (for example if the gun is always visible, or will be hidden below the terrain if not), is it a good idea to separate a part of a combined skinned mesh and let the renderer bounds grow very large? Is there any other potential problem, besides a skinned mesh possibly being updated even if nothing other than the gun (which got left behind by the character, for example) is in sight?

c) Is all that worth saving some draw calls? (That is more a question I need to answer for myself, I guess :))

Pouches added as physics bodies: woah... wait a minute. Pouches as separate objects = additional draw calls, pouches moved by physics = additional physics calculations... isn't this really trying to make the CPU choke? Maybe I should add that I plan to let many of these characters run around in the scene (with proper LODs later on, of course, and hoping for some speedup from instancing support). So while I am maybe going a little overboard with details for this first LOD, I am looking for the most optimized way of doing things, to save some CPU cycles for other stuff.

My plan was to keep the pouches as parts of the skinned mesh, weighted to their own bones, and move the bones by script (as in simplified physics driven by my own formulas; given the simple physics these pouches need to follow, using the iterative approach of PhysX AND collision detection against the character's body seems WAY overkill, no matter how optimized PhysX is). Do you see a problem with this besides the additional bones used (some CPU and memory overhead, I guess)?
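
The "simplified physics driven by my own formulas" idea sounds workable; a pouch is close enough to a damped pendulum hanging off the belt. A minimal sketch of the per-frame update (constants are made-up tuning values):

```python
STIFFNESS = 30.0  # pulls the pouch back toward rest (gravity + strap)
DAMPING = 6.0     # bleeds off energy so the swing settles

def step_pouch(angle, velocity, pivot_accel, dt):
    # Semi-implicit Euler step for the pouch swing angle; `pivot_accel`
    # is the belt's horizontal acceleration, which kicks the pouch around.
    accel = -STIFFNESS * angle - DAMPING * velocity - pivot_accel
    velocity += accel * dt
    angle += velocity * dt
    return angle, velocity
```

Run this once per frame (or at a reduced rate), write the angle into the pouch bone's local rotation, and you get believable secondary motion without PhysX or collision detection. In Unity that write would have to happen in LateUpdate so the Animator doesn't overwrite it.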

Then, parenting the gun to the hands without a bone... I see how I can parent the mesh to ONE bone. But I want to make it easy for myself to shift the position of the gun WITHOUT having to readjust one of the hands every time I do that. Can an object without a bone be parented to two bones?

If I make the bone setup of the gun a little more complex, I could add some IK constraints to the hands that would take care of keeping the hands in place, right? Although I don't know if disabling and re-enabling the constraint would again mess up the positions and rotations...

Delete the IK bones: okay, now that you mention it, I started reading up on the topic... do I get this right, the constraints I set on the bones in Blender do NOT transfer to Unity? So I need to set up the constraints again in Unity (as scripts, or Unity IK controllers)? That was kinda never mentioned in all the tuts I read or watched. Given that the shapekeys do export, I kinda wrongly assumed constraints would too.

Or do you mean "delete the IK Bones" as in just "delete the IK HELPER bones" (like the bone that controls the IK, and the pole target)?

Regarding the NLA animation track: that is kinda new to me. I just created a new action, recorded the keyframe animation to that, then found that animation in my exported .fbx file in Unity and was able to use it in Mecanim without a problem. Why record all the animations to a single NLA track (it seems Unity only works with the default track), when you can create a new action for every animation clip that you can call directly in Unity (and, I guess, mix and match with Mecanim)? Is it that Blender does not export all the actions (so far I've only worked with one)? Or is it that you cannot blend two actions together if they are not taken from the same NLA track? (Again, I am kinda new to this and my assumptions might be WAY off :))

Thanks for helping me out...

Gian-Reto

A) You can move each vertex of the gun to the inside of the character, scaling it down so it is hidden inside.

B) Hiding the mesh inside the character will prevent the bounds from stretching, this prevents culling errors.

C) You will need to run tests, but remember that skinned meshes can't be batched.

Most games would just use the spine and hip bones to animate the pouches with the body; only a first-person shooter would need the pouches to move more realistically.

Using a single bone to represent a few pouches, say 3, then attaching physics to it will allow for simple animation that takes little effort and doesn't waste a lot of resources. A similar technique is used for ponytails and other hanging geometry.

You can also lower the update rate to save resources. In the end, using physics will always consume more resources; however, it will give good results faster than hand animation, and has a similar cost to scripting, depending on your physics engine.
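
"Lowering updates" in practice usually means ticking the pouch simulation at a fixed rate below the frame rate and accumulating leftover frame time; a generic sketch (names are illustrative):

```python
def make_throttled(update, interval):
    # Wrap `update(dt)` so it runs at most once per `interval` seconds,
    # accumulating frame time in between.
    state = {"acc": 0.0}
    def tick(frame_dt):
        state["acc"] += frame_dt
        while state["acc"] >= interval:
            update(interval)
            state["acc"] -= interval
    return tick
```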

There is no reason to let each pouch have its own bone and to animate it on its own.

When adjusting the aim you will have to move the hand, so the palm bone for both the gun and hand is a good choice.

No, an object can only be parented to a single point at a time.

Think of your palm bone as a socket: it is where the gun attaches. You can then rotate the gun after it is attached, and so move it without moving the hand.

You want to use IK as little as you can; it over-complicates things and costs more.

Just use a Track To constraint for the gun or palm.

What I meant was to clean up the IK helper bones; Unity imports the animation and only needs the deform bones.

NLA is only there to make life easy. When exporting as .fbx you can tell the exporter to separate animations by NLA strips. So if you have a walk loop on, say, frames 12-60 and a run loop on frames 68-84, you can break them up with the NLA editor and export them as "Walk" and "Run". They will each start at frame 1, which makes it easy to mix and blend them.

You could also have a .blend file for each animation and export it as a new animation each time.
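
The frame bookkeeping the NLA split does can be pictured as simple data: each strip keeps its own name and is rebased to start at frame 1 on export. A toy sketch (the frame ranges here are illustrative):

```python
# Each action/strip is a named frame range on the Blender timeline.
clips = {"Walk": (12, 60), "Run": (68, 84)}

def rebase(clips):
    # What per-strip export effectively does: every clip starts at frame 1.
    return {name: (1, 1 + end - start) for name, (start, end) in clips.items()}
```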

Thanks for your reply. Highly appreciated!

Ah, good idea about hiding the gun... didn't think about scaling. That is exactly what I will try to do.

Well, I kinda dug myself into a hole when I gave the character a coat that was longer than the waistline... so I had to use shapekeys to make it move anywhere near realistically. The problem then was that I wasn't really sure what kind of animations I wanted to support (stupid error number 2, I guess), so I wanted to be able to move the legs freely. Hence the pouches would start clipping into the coat if left weighted to the spine without modifications. That is why I added the bones.

I see where you are coming from with using the physics engine to move the pouches. I guess that would be a possibility. But I would rather not have the pouches as their own objects.

If I want to let the physics animate the pouches, WITHOUT adding a bone to it (MAYBE I could let the physics engine update the position of the bone actually, as the bone is just a game object in Unity... might work, right?), then I need to separate the pouches out into their own objects, with their own renderer, which would result in additional draw calls. That is what I am trying to avoid with my current setup.

I would forgo all that additional work if DX12 and its draw call batching were here already, but in DX11 this sadly is still a big problem if you are using many of these characters.

Good idea about the tracking constraint. Forgot about this even though I use it to control the head. I did some additional work on the rig, and it seems by parenting the gun to the right arm, and then "parenting" the left arm to the gun, it might work with additional constraints to control the left hand rotation. I currently use an IK constraint to link the left arm to the gun, but I will give the track constraint a try.

Hm, when you say "clean up the IK helper bones", do you also mean "remove the IK chain"? What happens to animation sequences if you remove bones they rely on (can/do they need to be "baked" somehow)?

And if you mean "remove IK altogether", you mean "remove the IK chains only needed for easy posing in Blender", right? Because for the locomotion controller and making sure the feet do not go through the terrain, I will most probably still need IK chains in the legs to make foot placement by script easier?

Can I edit the animations as their own actions in Blender and then export them as NLA strips on export? Or do I need to record everything into the main track of the NLA animation? I thought I could just create multiple actions and have Blender export all of them in the .fbx file, just like it does for shapekeys...

Well, I kinda dug myself into a hole when I gave the character a coat that was longer than the waistline... so I had to use shapekeys to make it move anywhere near realistically. The problem then was that I wasn't really sure what kind of animations I wanted to support (stupid error number 2, I guess), so I wanted to be able to move the legs freely. Hence the pouches would start clipping into the coat if left weighted to the spine without modifications. That is why I added the bones.

I think it is time for you to rebuild the model. I know it is hard to scrap something you worked hard on, but one thing you learn working in professional 3D modeling is to never get attached to your models. Scrapping your model and starting over is just another step in modeling.

Hm, when you say "clean up the IK helper bones", do you also mean "remove the IK chain"? What happens to animation sequences if you remove bones they rely on (can/do they need to be "baked" somehow)?

And if you mean "remove IK altogether", you mean "remove the IK chains only needed for easy posing in Blender", right? Because for the locomotion controller and making sure the feet do not go through the terrain, I will most probably still need IK chains in the legs to make foot placement by script easier?

You can remove the chain for a cleaner file; I found that it doesn't matter. I think Blender's .fbx exporter doesn't export the IK information; at least I never notice its influence, and I can't access it in engines with IK support.

You can first make all your animations and then divide them into NLA strips; think of NLA as a way of tagging the animations.

Where has the selective quote option gone?

I am happy to scrap the model if it really turns out to be a nightmare to animate / too resource-hungry ingame (all those shapekeys have to be driven, after all)... I am not sure I can make it better without an extensive redesign, which mostly means getting rid of the belt pouches. And the long-ish coat.

I will spend some more hours this week trying to make it work.

As for the IK chain: yeah, I guessed I'd have to re-establish the IK chains in Unity after import. Well, that will work SOMEHOW, I guess...

I think I will need to study NLA animations more in depth, as I haven't really played around with them yet.

Selective quotes are not the only thing that seems to have gone the way of the Dodo.

When you capture (bake) the animation for each bone it is more resource-intensive; however, you can then get rid of all IK when exporting.

You just won't have any IK options inside Unity unless you rebuild it.

So I got it to work over the weekend. Still not perfect; I had to go back and tweak some keyframes. I'm really starting to see the limits of keyframe animation now, and why MoCap is such a hot topic with pro game studios.

I managed to export a version with my running animation (still with some awkward rotations in some of the keyframes), was able to apply an older idle animation from another version of the .fbx, and made a very primitive CharacterController with it.

Now I am trying to export two animation clips in one FBX, and I am running into a big problem.

My Problems currently:

1) When exporting the FBX and importing into Unity, both animation clips (sometimes; the weird thing is that if I have my idle action selected in Blender instead of the running animation, only the idle is exported) have problems with one bone (the head bone) losing its connection to the parent bone for a frame, at the start of one clip and the end of the other. I guess it's because I have some bone modifiers on this bone which go haywire from time to time (really, bone modifiers have caused me more pain to date than they have saved me), and because internally Blender most probably creates a single animation track with both clips in sequence (correct me if I am wrong, just guessing here).

I guess I could try to just pad the animation with additional keyframes at the start and end and cut them away in Blender (gonna try tomorrow). Or I could see which bone modifier is the culprit and remove it (the Copy Rotation, for example, is not really needed).

EDIT: This didn't really solve the root problem, but by padding the animation actions with some additional keyframes at the start and end (10 frames which I cut away in Unity), I was able to work around it. Most probably a little wasteful, but fine for the time being. If there are ways I could fix this without additional keyframes, I would be grateful to hear about them.

2) Not really an export problem, but my setup for the left arm is that the lower arm copies the rotation of the hand at 50%, which makes positioning easier... but now, as I rotate the hand, the hand bone starts to flip-flop weirdly at some angles. Not 100% sure if it's the bone modifier, but I guess so.

3) Hand positioning on the rifle is still quite a lot of work.

I will try to find out how to export the animation clips without problems first... if anyone has useful best practices to share that would help me prevent these problems of one animation influencing another, I would be grateful.

In the future, I will give MoCap a try. Really, while leg positioning was quick and easy with good results, positioning the arms was just a clusterf*ck! I have a Kinect 2 lying around, and while I wasn't too fond of Brekel's MoCap software suite when I tried it, I might give it and iPi's software another go in the future.

Is there a way to record MoCap data only for a part of the body (torso, arms and head), while combining this MoCaped animation later with keyframed leg animation? If there is, how to do that in Blender? Any good tutorials for that?

I don't really have the space to run around like mad in my apartment, and I don't think the Kinect gives you enough room to do that anyway. It should work fine for torso rotations and arms, though.

@Scouting Ninja: well, I am currently fighting with much more basic problems than resource consumption, but I will most probably make sure to remove unneeded keyframe data from my rigs in the future. Speaking of that, though: I had to add keyframes because sometimes the interpolation would go haywire and follow weird curves between keyframes. Is there any way to influence the interpolation curves without adding keyframes?
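
To the interpolation question: in Blender you can change a keyframe's interpolation mode and easing per F-curve in the Graph Editor (Linear, Bezier, and so on) and drag the Bezier handles, all without adding keys. The visible difference between linear and eased in-betweens is just this math:

```python
def lerp(a, b, t):
    # Linear interpolation: constant speed between two keys.
    return a + (b - a) * t

def smoothstep(a, b, t):
    # Ease-in/ease-out, roughly what default auto Bezier handles give you:
    # slow out of the first key, slow into the second.
    s = t * t * (3.0 - 2.0 * t)
    return a + (b - a) * s
```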

I'll try to share an animated gif or video as soon as the animation is good enough to be shown off.

Thanks

Gian-Reto

Well, I was noticing the strangest thing this morning.

While trying to fix the last thing with my character, setting up the shapekey/procedural bone movement controller so that all the shapekeys and bones not animated by the FBX animation (like the pouches) would get animated by script in Unity, I noticed that the shapekeys were triggered on the characters that did NOT have the shapekey controller attached.

I then deactivated the controller on some of the characters that had it, and lo and behold, the shapekeys were still active on them.

Somewhere along my struggles exporting the character I must have either triggered an option, or maybe it was always active and I didn't notice because there was no big movement in my idle animation. Anyway, it seems the shapekey drivers I set up in Blender DO get exported to Unity SOMEHOW... either that, or the vertex deformation they trigger gets baked into the animation!

Now, this is both good and bad... good because for all the baked animations, I could run them without ANY additional script influencing the shapekeys, thus saving a considerable amount of performance.

Bad because I am unsure now if I can still trigger the shapekeys (or "untrigger" them) when I mix baked and procedural animations for example.

What happens if I have an animation with the head looking right baked in (triggering a shapekey that shifts the collar to the right), and I want to procedurally turn the head left (so not only do I want to trigger the opposite shapekey, I also want to make sure the shapekey triggered when the head is turned right is deactivated)?

So the question is: can I still control the shapekeys from Unity while playing a baked animation (first tests seem to say "no")? Is the animation exported with the vertex manipulation triggered by the shapekey baked in (and how do I check this)?

Can I still deactivate these "baked shapekeys" (given they are more than simple vertex information stored with the animation), so I can control the individual shapekeys myself in Unity?

Is Blender now exporting Drivers or not?

Just to make sure: yes, I see all the shapekeys being imported as blendshapes. When I query their values in script, I see the values are changing (though to WAY too high values sometimes... I didn't notice this with the drivers in Blender). When I try to set them myself in script, the new value seems to get ignored.

Of course I didn't try to "blend" my procedural values with the values set by the baked animation yet, as I wasn't aware the baked animation was controlling these values... How would I blend... this?
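
If the weights really are baked into the clip, one way out (an untested sketch, not something confirmed in this thread) is to let your script blend its own value against whatever the animation wrote, and apply the result after the animation pass; in Unity that means calling SetBlendShapeWeight from LateUpdate, since the Animator writes blendshape weights earlier in the frame. The blend itself is a plain crossfade:

```python
def blend_weight(baked, procedural, control):
    # Crossfade between the animation-driven weight and the script-driven one.
    # control = 0.0 -> fully baked, 1.0 -> fully procedural.
    control = min(1.0, max(0.0, control))
    return baked * (1.0 - control) + procedural * control
```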

I will try to fix all the drivers in Blender (somehow I abandoned them quickly, as I was fed up with the simple one-liner Python interface), and add new ones for the pouch bones. If that allows me to animate all parts directly in Blender, that is one thing less to worry about in Unity.

But of course I still need to find out how to later procedurally influence this, so if anyone has an idea, I would be grateful for answers.

EDIT:

I just remembered: I don't exactly NEED a driver for the pouch bones... I could animate them all by hand by just rotating the bone in each keyframe. A little more involved, but easier to control than a Python algorithm.

What happens if I then need to influence the bone rotation in Unity, for example when adjusting foot placement with an IK rig, is still questionable though. Something to test out, I guess.

