Application/Programming side of Audio

Started by
1 comment, last by Kryzon 9 years, 10 months ago

I've been living in the creative aspect of sound using a daw. I want to step outside of that now and learn more about how it's applied. Is there a programming side of audio in a game? Say I make the sound of a spell being cast, how do you tell it to make that sound for the appropriate action?


Typically it's no different than authoring a character or level with textures in the right places. A sound is a resource just like a texture, model, level or shader. Where textures and vertex data get passed to a graphics API to render, the same goes for a sound file: you load the sound files and use an audio API of some sort (e.g. FMOD) to play them.
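To make that concrete, here's a minimal sketch of the "sound as a resource" idea. The `AudioEngine` class, its method names, and the file path are all illustrative stand-ins, not the API of FMOD or any real library:

```python
# A hypothetical wrapper around an audio API: sounds are loaded once as
# resources, then played back by name when the game needs them.

class AudioEngine:
    """Stand-in for a real audio API (FMOD, Wwise, etc.)."""
    def __init__(self):
        self._sounds = {}

    def load_sound(self, name, path):
        # A real engine would decode the file here; we just record the path.
        self._sounds[name] = path

    def play(self, name):
        if name not in self._sounds:
            raise KeyError(f"sound '{name}' was never loaded")
        return f"playing {self._sounds[name]}"

engine = AudioEngine()
engine.load_sound("spell_cast", "assets/sounds/spell_cast.ogg")
print(engine.play("spell_cast"))
```

In a real engine the load step happens up front (at level load, say), so that `play` is cheap when gameplay code calls it.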

When the sound plays is largely dependent on the game/engine being used. For a spell being cast it would generally be connected to the animation system somehow: for example, when the spell-cast animation reaches a specific frame (arm fully stretched out?), the sound would be played.
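A common way to wire this up is to attach events to animation frames. This sketch assumes a hypothetical `Animation` class and a made-up frame number; real engines expose the same idea as "animation events" or "notifies":

```python
# Sketch: firing a sound when an animation reaches a specific frame.
# The Animation class, frame 12, and the file name are illustrative.

class Animation:
    def __init__(self, frame_events):
        self.frame_events = frame_events  # frame number -> callback
        self.current_frame = 0

    def advance(self):
        self.current_frame += 1
        callback = self.frame_events.get(self.current_frame)
        if callback:
            callback()

played = []
spell_cast = Animation({12: lambda: played.append("spell_cast.ogg")})

# Step through the animation; the sound fires exactly once, at frame 12.
for _ in range(20):
    spell_cast.advance()

print(played)
```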

If it were something related to physics, say something being hit, the physics system might trigger various events such as playing a sound on impact; you might even change the sound depending on what's hit. The thing being hit may provide the correct sound to play. It all really depends on how the code is set up and what it's capable of doing.
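The "thing being hit provides the sound" pattern can be sketched as a collision callback that looks up a sound by the hit object's material. The material names, sound files, and callback signature are all assumptions for illustration:

```python
# Sketch: a physics collision event where the collided object's material
# decides which sound plays. All names here are illustrative.

IMPACT_SOUNDS = {
    "wood": "impact_wood.ogg",
    "metal": "impact_metal.ogg",
    "flesh": "impact_flesh.ogg",
}

def on_collision(material, play):
    # Unknown materials fall back to a generic impact sound.
    play(IMPACT_SOUNDS.get(material, "impact_generic.ogg"))

queued = []
on_collision("metal", queued.append)
on_collision("slime", queued.append)  # not in the table -> generic
print(queued)
```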

Without going into what actually makes the speaker emit sound (core APIs, sound devices, etc.), audio for games will usually involve a high-level software engine offering an API that is used for:
- playing sounds in synchronization with game events;
- performing automation (the interpolation of audio parameters by time);
- streaming audio samples from a file;
- real-time sound processing (DSP) etc.
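Of these, automation is the easiest to show in a few lines: an audio parameter is interpolated between two values over a time span. This is a generic linear-interpolation sketch, not the API of any particular engine:

```python
# Sketch of "automation": linearly interpolating an audio parameter
# (here, volume) over time, clamped to the automation's duration.

def automate(start, end, duration, t):
    """Value of a linearly automated parameter at time t."""
    t = max(0.0, min(t, duration))
    return start + (end - start) * (t / duration)

# Fade volume from 1.0 to 0.0 over 2 seconds, sampled every 0.5 s:
fade = [round(automate(1.0, 0.0, 2.0, step * 0.5), 2) for step in range(5)]
print(fade)  # [1.0, 0.75, 0.5, 0.25, 0.0]
```

A real engine would evaluate this per audio buffer (or per sample) and often offers other curve shapes besides linear, such as exponential or S-curves.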

You can consider an audio engine "middleware": a layer of software that lies between your application and the operating system, facilitating communication.
Some even come as a suite/toolkit of software that lets you not only program the audio features of your game but also visually design the sounds, parameters and effects used, and control the whole mix. You could therefore say that some of the sound design takes place during implementation.

Examples of audio engines:
- https://www.audiokinetic.com/products/wwise/
- http://www.cri-mw.com/product/lineup/audio/criadx2/index.html
- http://www.fmod.org/
- http://www.un4seen.com/bass.html
- http://www.ambiera.com/irrklang/

