
Making your character come to life (Part II)

Posted by Matias Goldberg, 20 June 2011 · 1,060 views

This is part II, you can find part I here.



Motion extraction

As you may have read in the art guidelines, Distant Souls is animation driven. This means the actual behavior of the characters is determined by the mesh's animations. This gives a lot of power to artists, while taking advantage of the modeling package's tools.

An attack's (or combo's) length, how fast the character moves, whether it sidesteps 45° to the left or just 15°; everything is determined by the animation. Some of these settings can be tweaked through Lua scripts, such as walk speed and jump height, avoiding going back and forth between Blender and the game for such minor adjustments. The animation is automatically readjusted for these parameters, though extreme values won't look good.

This is called motion extraction. The implementation is actually pretty simple: it takes a main bone (i.e. a root bone) and sees how it is translated over time, extracts this information into what I call a velocity curve, and then subtracts this translation from the mesh's animation. When the animation is being played, the mesh is now always centered, and the velocity points from the curve are interpolated to find out at what speed the character should be moving.

Lua code is used to determine how motion is extracted. For example, it is possible to generate a velocity curve without recentering the original animation, or to extract motion only along the Z axis.
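The implementation can be sketched in plain Python (a hypothetical, single-axis version of the idea; the real system works on full bone transforms):

```python
# Illustrative sketch of motion extraction, not the engine's actual code.
# An animation is reduced to root-bone positions per keyframe
# (one axis here for simplicity).

def extract_motion(root_keys):
    """root_keys: sorted list of (time, position) pairs for the root bone.
    Returns (recentered_keys, velocity_curve)."""
    velocity_curve = []
    for (t0, x0), (t1, x1) in zip(root_keys, root_keys[1:]):
        # Average velocity over each keyframe interval.
        velocity_curve.append((t0, (x1 - x0) / (t1 - t0)))
    # Subtract the root's translation so the mesh stays centered in place.
    recentered = [(t, 0.0) for (t, _x) in root_keys]
    return recentered, velocity_curve

def sample_velocity(curve, t):
    """Interpolate the velocity curve to find how fast the character
    should be moving at time t."""
    if t <= curve[0][0]:
        return curve[0][1]
    for (t0, v0), (t1, v1) in zip(curve, curve[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)
            return v0 + f * (v1 - v0)
    return curve[-1][1]
```

The recentered keys drive the skeleton while the sampled velocity drives the character's physical representation each frame.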

The 3 main advantages of this system are:

  • Animators control movement. No more “sliding feet” while walking.
  • The animation now has a physical representation; it collides against things, and things (e.g. a wall) can prevent the character from walking through them.
  • Certain animations can be very complex, going forward, backwards, left, right; everything is done within the modeling application, no scripting or special tools required.
Attached Image



Procedural animation

Back to the character I am working on now: its animation uses a Python plugin I found on the Blender forums (noise constraint 1.1), which is awesome. It is applied to carefully selected bones for a more natural animation, since it keeps these bones always in motion. Human beings, even when we try hard to stand still, are always moving, perhaps shaking if we tense our muscles.

This plugin is always active, which means that by just not moving a single bone, I get a very natural looking idle animation. Plus, the hair's motion looks awesome.

I had to enhance the plugin, though, to make it loop-able. With the original script, even identical keyframes at the start and end would not match, which is terrible for walk, run and idle loops. The script now uses a technique suggested by a fellow GD.Net member: the noise is sampled along a circle lying in an imaginary plane, using time as the parameter and the animation's length to determine the circle's radius. I believe this is a much superior way of generating seamless noise than the traditional methods, though it has a few disadvantages.
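The circle trick can be sketched in plain Python (illustrative code of my own, with a stand-in noise function rather than Blender's): instead of sampling noise along a line as time advances, sample it along a circle, so the values at t = 0 and t = length land on the same point and the loop is seamless.

```python
import math

def fake_noise(x, y):
    # Stand-in for a real 2D noise function (e.g. Perlin); anything
    # smooth and deterministic works for the demonstration.
    return math.sin(1.7 * x + 0.3) * math.cos(2.3 * y - 1.1)

def looping_noise(t, length, radius=1.0):
    """Sample 2D noise along a circle: one full revolution every `length`
    time units, so the result repeats exactly every `length` and the
    animation loops without a seam."""
    angle = (t / length) * 2.0 * math.pi
    return fake_noise(radius * math.cos(angle), radius * math.sin(angle))
```

Deriving the radius from the animation's length keeps the apparent noise speed consistent across actions of different durations.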

The main drawback is that I have to save a text file inside Blender (and keep it updated, though it's automatically generated on demand) which keeps track of how long each animation lasts, since Blender doesn't provide any straightforward way to get an “Action”'s length; and finding it in real time every time the constraint is evaluated hurts performance badly. FPS went down from 60 to 5; by using this file as a cache, FPS went back to 60. cPickle is teh best thing evar! :)


For anyone interested, the script is attached to this post. I won't be explaining how to use it; figure that out yourself. I'll just warn you that you need to set two keyframes at different frames: one to tell Blender the constraint needs to be updated constantly, the other to avoid a division by zero because the time length is 0. If a division by zero happens, delete ActionsNoise.cfg from the text editor and hit refresh in the constraint options to re-enable the plugin.

Attached Image


Walk animation

As for the walk animation, a female walk must look sexier and softer than a male's. Watching fashion-model runways is a good reference. The hips swing left and right while part of the body balances in the opposite direction, and the leg on the side of the rising hip goes forward. The other leg should be (though not completely) aligned with the leg in front. A male walk, on the contrary, doesn't align the legs much; they move mostly side by side. One must be careful not to make the butt movement too exaggerated, otherwise it goes from looking sexy to being vulgar. There's a fine line, and it's sometimes difficult to tell which side of it your animation is on.

Attached Image Attached Image
Showing the walk animation in different stages


Dead eyes, dead character

Although the new mesh looked a lot better than the old one, when it was put into the game engine I couldn't stop feeling there was something huge missing. It was, for some reason, still as bad as the old one. What was wrong? This new model had more vertices, its bones were in constant motion, the topology allowed better rigging, the animations were more detailed, and this model had two bone weights per vertex while the other one had one.

And I discovered it by chance. What surprised me most is that it's a problem I've heard about and thought about a million times, and still I forgot and made the same mistake: the eyes.

That's where a viewer's focus goes. My thought was that if I kept the eyes still and then moved them slightly over time, it would mimic real-life behavior. I couldn't have been more wrong.

I happened to use an IK Target constraint on both eyes. Then everything magically changed:

  • Mistake N° 1: Even when we hold our eyes still, they're focusing on something, which means their lines of sight converge at some point. Our two eyes aren't parallel. In some cases it's less than a few degrees, but we still truly notice such subtlety.
  • Mistake N° 2: Our eyes aren't in constant motion because we look everywhere; they're in constant motion because we move all the time while trying to focus on one single spot.
I had tried using an IK target for the eyes in the old model, but it didn't look good because the eyes had a few problems (when rotating, the eye sphere went “through” the skin). The missing IK target led to mistake N° 1.

Furthermore the old model wasn't in constant motion, which led to mistake N° 2.

The only problem with this IK Target is that I have to be careful all the time to keep the target in place according to the animation; otherwise the eyes spin around, or the look in her eyes feels like that of a crazy murderer. Parenting the target to something like the head's bone could have solved it, but then we fall into mistake N° 2 again, as the eyes no longer try to focus on a single point.

Note we don't focus on the exact same spot all the time, so be sure to make sudden changes from time to time (e.g. shift left, then back right).
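The convergence effect itself is just per-eye look-at math. A minimal sketch with my own assumed coordinate convention (X right, Y forward, Z up; the actual rig uses Blender's IK Target constraint instead):

```python
import math

def look_at_angles(eye_pos, target):
    """Yaw/pitch (radians) that rotate an eye, resting along +Y,
    to face `target`."""
    dx = target[0] - eye_pos[0]
    dy = target[1] - eye_pos[1]
    dz = target[2] - eye_pos[2]
    yaw = math.atan2(dx, dy)                    # left/right rotation
    pitch = math.atan2(dz, math.hypot(dx, dy))  # up/down rotation
    return yaw, pitch

# Two eyes ~6 cm apart focusing on a point 1 m ahead: each eye turns
# slightly inward, so the eyes are never exactly parallel (mistake N° 1).
left = look_at_angles((-0.03, 0.0, 0.0), (0.0, 1.0, 0.0))
right = look_at_angles((0.03, 0.0, 0.0), (0.0, 1.0, 0.0))
```

The small, opposite yaw on each eye is exactly the convergence a still pair of eyes is missing.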


Attached Image
My nose itches!

Attached Image
This isn't an exorcism, right?


After the IK Target was added, the character just came to life. The difference was night and day.



Reusing, reusing, reusing

Given that Distant Souls is slightly ambitious, reusing assets is essential. This rig has been prepared for easy IK (the old rig had lots of problems when IK was on), and most animations will be reused across many characters from now on. That's what Naughty Dog did in Uncharted. I am being very careful not to make any mistakes while building this “template”. Having done one before helped me learn from a lot of mistakes. And I'm most likely still making some which I won't see or be able to fix until I'm more experienced.

Some animations will be tweaked or modified to avoid monotony and give the main characters their own personality, a few animations can't be reused for obvious reasons (e.g. male characters won't use the female walk animation), and the only animations that will be unique are attacks, combos, and spell castings.

Attached Image


New hair shader

I finally got some time to implement Scheuermann's hair rendering algorithm. IMHO, my version doesn't look as awesome as in the paper, but it's still a major improvement over the grey hair. Note the attached screenshots are a bit old; the hair has since been slightly improved by using better noise textures. The eyebrows are now darker and brownish (not shown in the pictures). I wanted to make them look like true eyebrows by using alpha-blended hair strands with this new shader, but it looked terrible. This brownish colour outlines the eyes better. I like how it ended up: not as photorealistic as I originally planned, but then I'm not after 100% photorealism (plus, I want some artistic freedom). I'll leave that to the big-budget titles that want it.
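Scheuermann's shader builds on the Kajiya-Kay lighting model, whose key trick is computing specular from the hair strand's tangent instead of a surface normal. A minimal sketch of that term in plain Python (illustrative only, not my actual shader code):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def kajiya_kay_spec(tangent, half_vec, exponent):
    """Kajiya-Kay anisotropic specular: highlight strength depends on
    the sine of the angle between the strand tangent T and the half
    vector H, i.e. sqrt(1 - dot(T, H)^2) ** exponent."""
    t_dot_h = dot(normalize(tangent), normalize(half_vec))
    sin_th = math.sqrt(max(0.0, 1.0 - t_dot_h * t_dot_h))
    return sin_th ** exponent
```

In the full technique the tangent is shifted along the normal per pixel using a noise texture to break the highlight up into primary and secondary bands, which is likely why better noise textures improved my result.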


Attached Image

Attached Image

Attached Image


Well, that's all for today; I'll leave you with more screenshots, and the Perlin noise script as I promised. Next time I'll be adding a new animation to draw the weapon when entering a fight. Completely unrelated: I'll be taking a very difficult exam this Friday (it lasts from 6 pm to 10:30 pm!!), so wish me luck; the better I perform, the more time I'll have to focus on the game!

Cheers
Dark Sylinc

Attached Image Attached Image


Attached Image


[source lang="python"]
#BPYCONSTRAINT
'''
    PyConstraint template, access this in the "add constraint" scripts submenu.
    Add docstring here
'''
import Blender
from Blender import Draw, Mathutils, Noise
import math
import cPickle
import string

'''
    This variable specifies the number of targets
    that this constraint can use
'''
NUM_TARGETS = 0

PI = 3.1415926535

# Action data is saved and then reloaded otherwise performance goes waaaaay down.
def calculateActionLength( action ):
    """Updates firstKeyFrame and lastKeyFrame considering the current IpoCurves."""
    firstKeyFrame = None
    lastKeyFrame = None
    ipoDict = action.getAllChannelIpos()
    if ipoDict is not None:
        # check all bone Ipos
        # ipoDict[boneName] = Blender.Ipo
        for ipo in ipoDict.values():
            if ipo is not None:
                # check all IpoCurves
                for ipoCurve in ipo.getCurves():
                    # check first and last keyframe
                    for bezTriple in ipoCurve.getPoints():
                        iFrame = bezTriple.getPoints()[0]
                        if ((iFrame < firstKeyFrame) or (firstKeyFrame is None)):
                            firstKeyFrame = iFrame
                        if ((iFrame > lastKeyFrame) or (lastKeyFrame is None)):
                            lastKeyFrame = iFrame
    if firstKeyFrame == None:
        firstKeyFrame = 1
    if lastKeyFrame == None:
        lastKeyFrame = 1
    return [firstKeyFrame, lastKeyFrame, lastKeyFrame - firstKeyFrame]

def saveActionsData():
    actionsDict = {}
    arm = Blender.Object.Get('Armature')
    act = arm.getAction()
    for act in Blender.Armature.NLA.GetActions().values():
        actionsDict[act.getName()] = calculateActionLength( act )

    # Save all actions data to a Blender text 'ActionsNoise.cfg'.
    textName = 'ActionsNoise.cfg'

    # Remove old configuration text
    if textName in [text.getName() for text in Blender.Text.Get()]:
        oldSettingsText = Blender.Text.Get( textName )
        Blender.Text.unlink( oldSettingsText )

    # Write new configuration text
    settingsText = Blender.Text.New( textName )
    settingsText.write('This file is automatically created.\nPlease don\'t edit this file directly.\n\n')
    try:
        # pickle
        settingsText.write( cPickle.dumps(actionsDict) )
    except (cPickle.PickleError):
        print 'Couldn\'t pickle actions\' data!'

def loadActionsData():
    # Load all actions from a previously saved Blender text 'ActionsNoise.cfg'.
    textName = 'ActionsNoise.cfg'
    actionsDict = {}
    if textName in [text.getName() for text in Blender.Text.Get()]:
        settingsText = Blender.Text.Get( textName )
        # Compose string from text and unpickle
        try:
            # unpickle
            actionsDict = cPickle.loads(string.join(settingsText.asLines()[2:], '\n'))
        except (cPickle.PickleError):
            print 'Couldn\'t unpickle actions data! Regenerate the file in the settings file!'
    return actionsDict

'''
    This function is called to evaluate the constraint
        obmatrix:       (Matrix) copy of owner's 'ownerspace' matrix
        targetmatrices: (List) list of copies of the 'targetspace' matrices of the targets (where applicable)
        idprop:         (IDProperties) wrapped data referring to this constraint instance's idproperties
'''
def doConstraint(obmatrix, targetmatrices, idprop):
    # Separate out the transformation components for easy access.
    obloc = obmatrix.translationPart()  # Translation
    obrot = obmatrix.toEuler()          # Rotation
    obsca = obmatrix.scalePart()        # Scale

    # Define user-settable parameters.
    # Must also be defined in getSettings().
    if not idprop.has_key('u_loc'): idprop['u_loc'] = 1
    if not idprop.has_key('u_rot'): idprop['u_rot'] = 0
    if not idprop.has_key('u_scale'): idprop['u_scale'] = 0
    if not idprop.has_key('u_locamount'): idprop['u_locamount'] = 1.0
    if not idprop.has_key('u_rotamount'): idprop['u_rotamount'] = 30.0
    if not idprop.has_key('u_scaleamount'): idprop['u_scaleamount'] = 1.0
    if not idprop.has_key('u_speed'): idprop['u_speed'] = 1.0
    if not idprop.has_key('u_seed'): idprop['u_seed'] = 1.0

    la = idprop['u_locamount']
    ra = idprop['u_rotamount']
    sa = idprop['u_scaleamount']
    noise_speed = idprop['u_speed'] * 0.001

    arm = Blender.Object.Get('Armature')
    act = arm.getAction()

    actionDict = loadActionsData()
    if not actionDict.has_key( act.getName() ):
        saveActionsData()
        actionDict = loadActionsData()
    myAct = actionDict[act.getName()]

    time = Blender.Get('curtime')
    radAngle = ( (time - myAct[0]) / myAct[2] ) * 2 * PI

    # Keep the noise always starting at (0,0) by displacing the circle.
    # Otherwise different speed parameters change the 'seed'.
    # Mul by myAct[2] (anim. length) to keep speed constant for different actions.
    x = (math.cos( radAngle ) - 1) * noise_speed * 0.8 * myAct[2]
    z = math.sin( radAngle ) * noise_speed * 0.8 * myAct[2]
    s = idprop['u_seed'] * 12.3456789

    noise_vec = Mathutils.Vector( x + s, x + s, z + s )
    rv = Noise.vTurbulence(noise_vec, 3, 0, Noise.NoiseTypes.NEWPERLIN)
    half_vec = Mathutils.Vector(0.5, 0.5, 0.5)
    noise_vec = noise_vec - half_vec

    # Do stuff here, changing obloc, obrot, and obsca.
    if idprop['u_loc'] == 1:
        obloc[0] += la*rv[0]
        obloc[1] += la*rv[1]
        obloc[2] += la*rv[2]
    if idprop['u_rot'] == 1:
        obrot[0] += ra*rv[0]
        obrot[1] += ra*rv[1]
        obrot[2] += ra*rv[2]
    if idprop['u_scale'] == 1:
        obsca[0] += sa*rv[0]
        obsca[1] += sa*rv[1]
        obsca[2] += sa*rv[2]

    # Convert back into matrices for loc, scale, rotation.
    mtxloc = Mathutils.TranslationMatrix( obloc )
    mtxrot = obrot.toMatrix().resize4x4()
    mtxsca = Mathutils.Matrix([obsca[0],0,0,0], [0,obsca[1],0,0], [0,0,obsca[2],0], [0,0,0,1])

    # Recombine the separate elements into a transform matrix.
    outputmatrix = mtxsca * mtxrot * mtxloc

    # Return the new matrix.
    return outputmatrix

'''
    This function manipulates the matrix of a target prior to sending it to doConstraint()
        target_object:               wrapped data, representing the target object
        subtarget_bone:              wrapped data, representing the subtarget pose-bone/vertex-group (where applicable)
        target_matrix:               (Matrix) the transformation matrix of the target
        id_properties_of_constraint: (IDProperties) wrapped idproperties
'''
def doTarget(target_object, subtarget_bone, target_matrix, id_properties_of_constraint):
    return target_matrix

'''
    This function draws a pupblock that lets the user set
    the values of custom settings the constraint defines.
    This function is called when the user presses the settings button.
        idprop: (IDProperties) wrapped data referring to this constraint instance's idproperties
'''
def getSettings(idprop):
    # Define user-settable parameters.
    # Must also be defined in doConstraint().
    if not idprop.has_key('u_loc'): idprop['u_loc'] = 1
    if not idprop.has_key('u_rot'): idprop['u_rot'] = 0
    if not idprop.has_key('u_scale'): idprop['u_scale'] = 0
    if not idprop.has_key('u_locamount'): idprop['u_locamount'] = 1.0
    if not idprop.has_key('u_rotamount'): idprop['u_rotamount'] = 30.0
    if not idprop.has_key('u_scaleamount'): idprop['u_scaleamount'] = 1.0
    if not idprop.has_key('u_speed'): idprop['u_speed'] = 1.0
    if not idprop.has_key('u_seed'): idprop['u_seed'] = 1.0

    # create temporary vars for interface
    uloc = Draw.Create(idprop['u_loc'])
    ulocamount = Draw.Create(idprop['u_locamount'])
    urot = Draw.Create(idprop['u_rot'])
    urotamount = Draw.Create(idprop['u_rotamount'])
    uscale = Draw.Create(idprop['u_scale'])
    uscaleamount = Draw.Create(idprop['u_scaleamount'])
    uspeed = Draw.Create(idprop['u_speed'])
    useed = Draw.Create(idprop['u_seed'])
    udataFile = Draw.Create(1)

    # define and draw pupblock
    block = []
    block.append(("Speed", uspeed, 0.0000001, 1000.0, "The speed of animation"))
    block.append(" ")
    block.append(("Location", uloc, "Randomly modify the object's location"))
    block.append(("Amount", ulocamount, 0.0000001, 1000.0, "The amount of location randomness"))
    block.append(" ")
    block.append(("Rotation", urot, "Randomly modify the object's rotation"))
    block.append(("Amount", urotamount, 0.0000001, 1000.0, "The amount of rotation randomness"))
    block.append(" ")
    block.append(("Scale", uscale, "Randomly modify the object's scale"))
    block.append(("Amount", uscaleamount, 0.0000001, 1000.0, "The amount of scale randomness"))
    block.append(" ")
    block.append(("Seed", useed, 1.0, 1000.0, "Use a different random value"))
    block.append(" ")
    block.append(("Generate File", udataFile, "Do this each time you've created new pose actions or modified their length."))

    retval = Draw.PupBlock("Noise Constraint", block)

    # update id-property values after user changes settings
    if (retval):
        idprop['u_loc'] = uloc.val
        idprop['u_locamount'] = ulocamount.val
        idprop['u_rot'] = urot.val
        idprop['u_rotamount'] = urotamount.val
        idprop['u_scale'] = uscale.val
        idprop['u_scaleamount'] = uscaleamount.val
        idprop['u_speed'] = uspeed.val
        idprop['u_seed'] = useed.val
        if udataFile:
            saveActionsData()
[/source]



