Synchronizing effects/sounds to animation

Posted by no hit wonder


This is an issue that I’ve always personally had trouble with. In strictly physics-based animation [like a tire bouncing down stairs], it’s easy to sync the sounds or effects up with the movement of an object. In other types of animation, for example a key-framed run animation, it’s not always so clear when a sound or effect should be executed. Playing a footstep sound at an approximated offset into the run animation isn’t always so easy, and syncing the footstep up with where the foot actually lands can be even worse. Certainly this is a stroke in favor of using straight physics-based animation, or ambiguous sounds [like snow rustling when walking through snow, instead of hearing distinct snow-crunching footsteps], but pure physics models are not practical for all kinds of games [they are computationally expensive and don’t scale to large actor counts]. So I was just wondering what methods the rest of you use to sync up effects or sounds with animation.

The method I’m currently working on implementing is a message-based system, with messages sewn into the animation files themselves [takes a lot of by-hand tinkering, but hopefully it’ll pay off]. I made a little model viewer that I can drop effect flags into to send messages out of the animation system. For example, a “step” message is sent each time a foot lands in a run, accompanied by a vector marking the origin of the message, two vectors marking the forward direction and the normal of the surface being stepped on [for applying decals like footprints to surfaces], and a time stamp. The idea seems promising, but complicated.

This seems like such a simple problem; surely others have come up against it. What solutions have you come up with?
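For reference, the “step” message I have in mind is roughly the shape sketched below [just an illustration; the names here are made up, not my exact code]:

struct Vec3 { float x, y, z; };

// One message emitted by the animation system when a flag is crossed [e.g. a "step"].
struct AnimMessage
{
    int   typeId;        // integer ID the "step" string gets mapped to at load time
    Vec3  origin;        // where the event happened [the foot position]
    Vec3  forward;       // forward direction of the actor at that moment
    Vec3  surfaceNormal; // normal of the surface being stepped on, for orienting footprint decals
    float timeStamp;     // time into the animation, in seconds
};

// One flag authored into the animation: when playback crosses this time, the message is sent.
struct MessageKey
{
    float       time;    // seconds into the clip
    AnimMessage message;
};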

That's usually the way. Events are keyed into the animation. Sound, footprint, things like that. It's just a flag inserted at some keyframes.

Animations are usually accompanied by an animscript. While the animation is running, the animscript runs alongside it. Events can be inserted into the animscript - footsteps and sound effects, for example, but also particle effects. Or even something more game-specific - for example, "turn on active hits" during an attack animation.

This could be implemented via a scripting language or something more specialised, and is likely to need a decent editor to be usable.

In EMotion FX we have plugins for 3DSMax and Maya where artists can type an event-type string and parameter string at given frames.

We export those and at runtime each string is linked to some ID integer for fast compares (to avoid string compares).

Then at runtime the anim engine triggers callbacks with the event type and parameters. The game programmers then process these events.

Examples of events could be something like:

SOUND Footstep.wav
SCRIPT OpenDoor.scr
PARTICLE Ignite 0 180 4

So you can trigger anything using those events. It's a very powerful system.
We also save those events inside the max / maya files. So you don't have to deal with separate scripts or event files or anything. Also they are of course embedded inside the exported motion files.
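Conceptually, the load-time string-to-ID mapping and the runtime callback work roughly like the sketch below [purely illustrative, not our actual interface; all names here are made up]:

#include <string>
#include <unordered_map>
#include <vector>

// Load time: each event-type string is registered once and mapped to an
// integer ID, so the runtime only ever compares integers.
class EventTypeTable
{
public:
    int GetOrRegister(const std::string& typeName)
    {
        auto it = mIds.find(typeName);
        if (it != mIds.end())
            return it->second;
        const int id = static_cast<int>(mIds.size());
        mIds.emplace(typeName, id);
        return id;
    }
private:
    std::unordered_map<std::string, int> mIds;
};

// One exported event, e.g. the ID for "SOUND" with parameters "Footstep.wav".
struct MotionEvent
{
    float       time;       // seconds into the motion
    int         typeId;
    std::string parameters;
};

// Runtime: the game registers a callback and the anim engine invokes it for
// every event crossed during this update of the motion.
using MotionEventCallback = void (*)(int typeId, const std::string& parameters);

void ProcessEvents(const std::vector<MotionEvent>& events,
                   float prevTime, float curTime,
                   MotionEventCallback callback)
{
    for (const MotionEvent& e : events)
        if (e.time > prevTime && e.time <= curTime)
            callback(e.typeId, e.parameters);
}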

Quote:
Original post by Buckshag
In EMotion FX we have plugins for 3DSMax and Maya where artists can type an event-type string and parameter string at given frames.

We export those and at runtime each string is linked to some ID integer for fast compares (to avoid string compares).

Then at runtime the anim engine triggers callbacks with the event type and parameters. The game programmers then process these events.

Examples of events could be something like:

SOUND Footstep.wav
SCRIPT OpenDoor.scr
PARTICLE Ignite 0 180 4

Right along the lines of what I'm doing at the moment, except it's not embedded in 3DSMax or Maya [because I'm a poor **** and don't own either of them], just a thrown-together program that I've made. Instead of 'SOUND Footstep.wav' or the like, though, it has something along the lines of 'Message : Step' [still converted into an int ID at load time], to allow the effects of 'step' to be swapped out. That makes it pretty useful for transitioning from walking on tile to walking through snow to walking on bubble wrap [you can even associate several things with a step, like playing the bubble-pop sound and making little tiny explosions]. It does make the actual script decoders really tough to debug, though, as the message->action mapping ends up pretty nasty.
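The message->action mapping I'm describing is roughly like the sketch below [just an illustration of the idea, not my actual code; the names are made up]:

#include <functional>
#include <unordered_map>
#include <vector>

// Maps a message ID [e.g. "step"] to the actions currently bound to it.
// The bindings can be swapped at runtime, e.g. when the actor walks from
// tile onto snow onto bubble wrap.
class MessageDispatcher
{
public:
    using Action = std::function<void()>;

    // Several actions can hang off one message [pop sound + tiny explosion].
    void Bind(int messageId, Action action)
    {
        mActions[messageId].push_back(std::move(action));
    }

    // Replace everything bound to a message when the walking surface changes.
    void Rebind(int messageId, std::vector<Action> actions)
    {
        mActions[messageId] = std::move(actions);
    }

    void Send(int messageId) const
    {
        auto it = mActions.find(messageId);
        if (it == mActions.end())
            return;
        for (const Action& action : it->second)
            action();
    }

private:
    std::unordered_map<int, std::vector<Action>> mActions;
};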

[edit]

Took a minute to look through your website, very impressive. [very very].

Also, it looks like you've got more [substantially more] experience with this than I do [this is actually my first run at throwing this stuff together]. Perhaps I'll reconsider and look into your method. Looks like you've certainly got a handful of people who really like what you've done. I'll definitely take a look at what you've done [not every day I get a word from an obvious pro].

Quote:
Original post by Buckshag
In EMotion FX we have plugins for 3DSMax and Maya where artists can type an event-type string and parameter string at given frames.


This is likely a decent system. It's certainly one way to do it.

On the other hand, having a separate script does have its advantages. For a start, it allows events to be added by people other than artists - for example, musicians and audio content creators normally have no knowledge whatsoever of Maya or Max, and a separate animscript allows them to set up the sounds themselves. So it very much depends on the team you are aiming for.

I like the script idea, it's more flexible. And it does not require much (except parsing stuff).


on_keyframe_event index = 12, event = leave
begin
start_sound sound = slamdoor.wav, volume = 1.0
start_particle_fx effect = dustpuff1.pfx, position = anchor_6, normal = vector(0, 1, 0)
start_npc npc = big_badass_guard_monster.npc, ai_state = alerted
end



stuff like that...

Quote:
Original post by SunTzu
Quote:
Original post by Buckshag
In EMotion FX we have plugins for 3DSMax and Maya where artists can type an event-type string and parameter string at given frames.


This is likely a decent system. It's certainly one way to do it.

On the other hand, having a separate script does have its advantages. For a start, it allows events to be added by people other than artists - for example, musicians and audio content creators normally have no knowledge whatsoever of Maya or Max, and a separate animscript allows them to set up the sounds themselves. So it very much depends on the team you are aiming for.


Each way has its advantages and disadvantages. However, it is very easy to combine a script (or something similar) with the 3DSMax/Maya-integrated way.

Also, since we have previewer tools, and since the tools for setting up these events inside Max/Maya use our own interfaces (where users can choose from preset event types and manage all events), it is easy for audio designers to work with this.

They can use those interfaces to add new events and press the preview button to see the model and motion in an in-game viewer where the events are processed as well.

They will hear and see exactly what it will look like in the game, and they can tweak it without having to know anything about 3DSMax or Maya. All they do is use our plugins made for this and load in files created by the artists/animators. But that isn't the hardest thing :) They would still have to do the same with the exported files when using scripts.

But again, different methods can be combined easily. Most of this is based on feedback from our clients as well, which means they want to do it that way :)

I should also mention that I trigger other events as well. These are more global events that indicate when a motion has looped, is being stopped, is being started, and several other things. These have proven to be useful as well, so you might also consider implementing those.
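As a rough illustration of the idea [not the actual EMotion FX interface; the names below are made up], a listener that receives both the keyframed events and these global motion events might look like:

#include <string>

// Sketch of a combined listener: keyframed events plus global motion events.
class MotionEventListener
{
public:
    virtual ~MotionEventListener() = default;

    // Keyframed events authored by the artists: integer type ID plus parameter string.
    virtual void OnEvent(int typeId, const std::string& parameters) = 0;

    // Global events about the motion itself.
    virtual void OnMotionStart() {}
    virtual void OnMotionStop()  {}
    virtual void OnMotionLoop()  {}
};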
