Application/Programming side of Audio

This topic is 1281 days old which is more than the 365 day threshold we allow for new replies. Please post a new topic.

If you intended to correct an error in the post then please contact us.

I've been working on the creative side of sound using a DAW. I want to step outside of that now and learn more about how it's applied. Is there a programming side to audio in a game? Say I make the sound of a spell being cast: how do you tell the game to play that sound for the appropriate action?
 


Typically it's no different than authoring a character or level with textures in the right places. A sound is a resource just like a texture, model, level, or shader. Where textures and vertex data get passed to a graphics API to render, the same goes for a sound file: you load the sound files and use an audio API of some sort (e.g. FMOD) to play them.
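To make that concrete, here is a minimal sketch of the "sound as a resource" idea. The `AudioEngine` class is a hypothetical stand-in for a real audio API such as FMOD, not its actual interface; names like `spell_cast` are made up for illustration.

```python
# Sketch: sounds are loaded once as named resources, then played on demand.
# AudioEngine is a stand-in for a real audio API (e.g. FMOD), not its real interface.

class AudioEngine:
    def __init__(self):
        self._sounds = {}   # name -> loaded sound data
        self.played = []    # log of playback calls, for illustration only

    def load(self, name, path):
        # A real engine would open and decode the file; we just record the path.
        self._sounds[name] = path

    def play(self, name):
        if name not in self._sounds:
            raise KeyError(f"sound '{name}' was never loaded")
        self.played.append(name)

engine = AudioEngine()
engine.load("spell_cast", "assets/audio/spell_cast.ogg")
engine.play("spell_cast")   # called by game code when the spell actually fires
```

The point is the split: loading happens up front (like loading a texture), and playback is a cheap call the gameplay code makes whenever the event occurs.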

When the sound plays depends largely on the game/engine being used. For a spell being cast it would generally be connected to the animation system somehow; for example, when the spell-cast animation reaches a specific frame (arm fully stretched out?), the sound would be played.
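One common way to wire that up is frame-tagged animation events: the clip stores callbacks keyed by frame number, and playback fires each callback as its frame is crossed. This is a simplified sketch of that pattern, with the frame number and sound name chosen arbitrarily.

```python
# Sketch: an animation clip carries frame-tagged events; as playback advances
# past a tagged frame, the attached callback fires (e.g. play a sound).

class AnimationClip:
    def __init__(self, length, events):
        self.length = length
        self.events = events        # frame number -> callback

    def advance(self, start, end):
        # Fire every event whose frame lies in the interval (start, end].
        for frame, callback in self.events.items():
            if start < frame <= end:
                callback()

played = []
cast_clip = AnimationClip(
    length=60,
    events={42: lambda: played.append("spell_cast")},   # arm fully extended
)

# Simulate playback in 10-frame update ticks:
for start in range(0, 60, 10):
    cast_clip.advance(start, start + 10)
```

Checking an interval rather than an exact frame matters in practice: the game may step several frames per update, and an exact-match check would skip the event entirely.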

If it were something related to physics, say an object being hit, the physics system might fire events such as playing a sound on impact, and you might even change the sound depending on what's hit. The thing being hit may provide the correct sound to play. It all really depends on how the code is set up and what it's capable of doing.
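The "thing being hit provides its own sound" idea can be sketched as a material lookup on the object, with the collision handler just asking the object what to play. The material and sound names here are invented for the example.

```python
# Sketch: the physics system raises a collision event, and the object that
# was hit supplies its own impact sound based on its material.

IMPACT_SOUNDS = {           # material -> sound name (illustrative)
    "wood": "thud_wood",
    "metal": "clang_metal",
}

class GameObject:
    def __init__(self, material):
        self.material = material

    def impact_sound(self):
        # Unknown materials fall back to a generic thud.
        return IMPACT_SOUNDS.get(self.material, "thud_generic")

def on_collision(obj, play):
    # The handler doesn't know about materials; the object decides.
    play(obj.impact_sound())

played = []
on_collision(GameObject("metal"), played.append)
on_collision(GameObject("glass"), played.append)   # not in the table -> fallback
```

This keeps the physics code decoupled from audio decisions: adding a new surface type means adding a table entry, not touching the collision handler.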

Without going into what actually makes the speaker emit sound (core APIs, sound devices, etc.), audio for games will usually involve a high-level software engine which offers an API that is used for:
- playing sounds in synchronization with game events;
- performing automation (the interpolation of audio parameters over time);
- streaming audio samples from a file;
- real-time sound processing (DSP); etc.
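Of those, "automation" is the least self-explanatory, so here is a tiny sketch of what it means: interpolating an audio parameter (volume, in this assumed example) over time, the way an engine would fade a sound in or out.

```python
# Sketch of "automation": linearly interpolating an audio parameter over time,
# as a high-level audio engine would when fading a sound in or out.

def automate(start, end, duration, t):
    """Return the parameter value at time t, moving from start to end."""
    if t <= 0:
        return start
    if t >= duration:
        return end
    return start + (end - start) * (t / duration)

# Fade volume from silent to full over 2 seconds, sampled every half second:
volumes = [automate(0.0, 1.0, 2.0, t) for t in (0.0, 0.5, 1.0, 1.5, 2.0)]
```

Real engines offer richer curves (logarithmic fades, splines) and apply them per audio buffer, but the principle is the same: a parameter, a target, and an interpolation over time.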

You can consider an audio engine as "middleware": a layer of software that lies between your application and the operating system, facilitating communication.
Some even come as a suite/toolkit of software that lets you not only program the audio features of your game but also visually design the sounds, parameters, and effects used, and control the whole mix. In that sense, you could say that some of the sound design takes place during implementation.

Examples of audio engines:
- https://www.audiokinetic.com/products/wwise/
- http://www.cri-mw.com/product/lineup/audio/criadx2/index.html
- http://www.fmod.org/
- http://www.un4seen.com/bass.html
- http://www.ambiera.com/irrklang/


