Animation techniques

5 comments, last by Mystagogue 21 years, 11 months ago
I need to have a number of sprites or UI icons move about the screen, or even react to the motion of other sprites and icons. I'm trying to strategize how I want to describe such motion in a 3D engine. One strategy I've heard of is to build a vector of points expressing the general pattern of motion, then use a time step to interpolate what position to take between any two points. I presume that the newly found position should be converted into a matrix transformation for rendering - rather than rewriting the vertex info for the sprite(s). What are the programmatically difficult pieces of the above approach I should watch out for? What other common approaches have I not listed? Is there a book or article that has a good discussion of such things?

I believe what you're looking for is "skeletal animation" or "bone animation", and possibly inverse kinematics.

A simpler (but in some ways more costly) approach is keyframe animation with interpolation.

Here is a site that has some good articles/source about skeletal animation and kinematics. Also try Google (as ever).

Cheers,

2D!Now
nope, he is looking at creating a pretty GUI with sprites and icons that move. you can't generalize a problem like this unless you feel like creating some complex AI routines to handle movement. instead you create different "plays" that the "actors" (sprites) will carry out depending on which button is pressed, where the mouse is, etc. you could have some AI-driven stuff, but it makes things much more complex.

you could use matrices, but it will most likely be faster to write the data to the vertices (DON'T READ FROM THEM) each frame. in this way you can batch-draw your sprites (assuming you can fit most of their frames on only 1 or 2 textures). this will be much faster than drawing them individually and doing matrix calculations.
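To illustrate the batching idea above, here is a minimal sketch of writing one quad per sprite into a shared vertex array each frame, so all sprites can go out in a single draw call. The `Vertex` and `Sprite` layouts and all names here are assumptions, not from any particular API:

```cpp
#include <vector>

// Hypothetical vertex layout: screen position plus texture coordinates.
struct Vertex { float x, y, u, v; };

// Hypothetical sprite: top-left corner, size, and a texture-atlas rectangle.
struct Sprite { float x, y, w, h; float u0, v0, u1, v1; };

// Write one quad (two triangles, six vertices) per sprite into a shared
// array each frame; the whole array can then be drawn in one batched call.
void FillSpriteBatch(const std::vector<Sprite>& sprites, std::vector<Vertex>& out)
{
    out.clear();
    for (const Sprite& s : sprites) {
        Vertex tl = { s.x,       s.y,       s.u0, s.v0 };
        Vertex tr = { s.x + s.w, s.y,       s.u1, s.v0 };
        Vertex bl = { s.x,       s.y + s.h, s.u0, s.v1 };
        Vertex br = { s.x + s.w, s.y + s.h, s.u1, s.v1 };
        // Triangle 1: tl, tr, bl.  Triangle 2: tr, br, bl.
        out.push_back(tl); out.push_back(tr); out.push_back(bl);
        out.push_back(tr); out.push_back(br); out.push_back(bl);
    }
}
```

In a real engine you would write into a locked dynamic vertex buffer rather than a `std::vector`, but the bookkeeping is the same.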
Yes, my interest is in simply moving sprites around - not kinematics or even AI. But whether my motion is based on writing new vertices for these sprites or on using matrices, the *real* heart of my question pertained to *generating* those vertices or matrices. That is, what are the most common *slick* ways people capture predetermined motion?

Let's say every time you click on a sprite, it wiggles for two seconds. I need to control the speed at which it wiggles so that it is constant on fast and slow machines, and I need to describe the wiggling itself. If the wiggling is not procedural - but is instead a "path" that the sprite follows - then presumably I need to read that pre-defined path from disk and play it back. But what does that path look like? Is it a series of points? If yes, what does typical code look like that interpolates between those points to produce the frame-by-frame sets of new vertices or new transformation matrices for the animation?

Indeed, it seems to me this is the kind of thing I could find an existing published example for that I could practically just paste into my own code. Assuredly it must be one of the most commonly solved problems in the gaming universe.

Allrighty. Well, one nifty way of handling sprite movement is by using Bezier curves. Works for me anyway!
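For the curious: a cubic Bezier curve is just a weighted blend of four control points. A minimal sketch (type and function names are mine, not from any library):

```cpp
struct Point { float x, y; };

// Evaluate a cubic Bezier curve at parameter t in [0, 1] using the
// standard Bernstein form: p0 and p3 are endpoints, p1 and p2 are the
// control points that pull the curve toward them.
Point Bezier3(Point p0, Point p1, Point p2, Point p3, float t)
{
    float u  = 1.0f - t;
    float b0 = u * u * u;
    float b1 = 3.0f * u * u * t;
    float b2 = 3.0f * u * t * t;
    float b3 = t * t * t;
    Point p;
    p.x = b0 * p0.x + b1 * p1.x + b2 * p2.x + b3 * p3.x;
    p.y = b0 * p0.y + b1 * p1.y + b2 * p2.y + b3 * p3.y;
    return p;
}
```

Feed it t = elapsedMS / totalMS and the sprite traverses the curve in a fixed wall-clock duration, independent of frame rate.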
that or you could get your hands dirty with an IDirect3DRMInterpolator and transferring the output positions....

yuk


bezier curves are great for it. niftiness.



die or be died...i think
Mystagogue, it is. you're just too lazy to do a search and are making things massively more complex than they are.

you have three options:
1. time-based animation, updated every frame. basically the old curPos = ptA + (deltaMS/targetMS)*(ptB-ptA), where ptA represents the point you just left, ptB the point you are heading to, deltaMS the time elapsed since leaving ptA, and targetMS how long the whole segment should take (so the speed is distance/targetMS). the rest is self-explanatory.
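Option 1 as a minimal sketch (the `Point` struct and function name are mine):

```cpp
struct Point { float x, y; };

// Interpolate between waypoints ptA and ptB based on elapsed time.
// deltaMS is the time since the sprite left ptA; targetMS is how long
// the whole segment should take, so the motion covers the same ground
// per second on fast and slow machines alike.
Point Interpolate(Point ptA, Point ptB, float deltaMS, float targetMS)
{
    float t = deltaMS / targetMS;
    if (t > 1.0f) t = 1.0f;   // clamp so we stop at the destination
    Point p;
    p.x = ptA.x + t * (ptB.x - ptA.x);
    p.y = ptA.y + t * (ptB.y - ptA.y);
    return p;
}
```

When t reaches 1, advance to the next pair of waypoints in the path and reset the clock.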

2. fixed time. you ignore interpolation and simply move things a certain distance based on how much time elapsed.

3. time-based with rendered-framerate independence. now you have a physics/logic section and a rendering section. you render every loop. physics/logic is called only when the delta time since the last update exceeds a threshold (i.e. the logic framerate). you continue to run the logic until you are no longer "behind schedule". i.e. if the last frame took 15ms to render and you run logic code every 30ms, you skip the logic code. the next frame takes 46ms so the total is now 61ms. you run the logic code once, and now you have 30ms left, so you run it again and now have 1ms left. the next frame takes 32ms to complete, so the total is 33ms, so you run logic and now have only 3ms left. the next frame takes 10ms, total is 13ms, skip logic; frame takes 12ms, total is 25ms, skip logic; frame takes 16ms, total is 37ms, run logic and you have 7ms left, etc.
during render you interpolate based on the last position and next position of the object (i.e. what the position will be the next time logic is run).
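The accumulator bookkeeping for option 3 can be sketched as follows, using the 30ms logic rate from the walkthrough above (names are mine, not from any engine):

```cpp
// How many fixed logic steps to run this frame, given how long the last
// frame took; 'accumulator' carries the leftover time between frames.
// Each frame: run the returned number of logic steps, then render with
// an interpolation factor of (*accumulator / logicMS), which blends the
// last logic position toward the next one.
int LogicStepsToRun(float frameMS, float* accumulator, float logicMS)
{
    *accumulator += frameMS;
    int steps = 0;
    while (*accumulator >= logicMS) {   // still "behind schedule"?
        *accumulator -= logicMS;
        ++steps;
    }
    return steps;
}
```

Running the post's numbers through it: a 15ms frame yields 0 steps (15ms banked), a 46ms frame yields 2 steps (1ms left), a 32ms frame yields 1 step (3ms left) - exactly the schedule described above.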

do some research and you will find all the information you require. don't expect a cut-and-paste answer; there is none. you decide if you want to store your wiggle path as points, a parametric equation, Bezier curves, etc.

heck
angle += someChangeInAngle;
x = x + dx;
y = sin(angle) * height;
creates a wiggle path.

This topic is closed to new replies.
