Where to play sound effects

Hi, I'm new to game programming and have just started to play around with sound in a demo game I'm writing. I have the code I need to load and play sounds, and it all works OK. I just want to know, in terms of the game architecture, where the actual sound should be played. Should it be in the logic or render portion of the game loop, or somewhere else?

Put it into its own audio portion :)

Rendering and playing audio have different requirements. Rendering is all about what you see, which is not always what you want to hear (think of an enemy approaching from behind [you want to hear that] or people talking far away [you don't want to hear that]).

The audio portion is about managing your sounds (playing the music loop, playing effect sequences), whereas your logic portion is about triggering certain events (visual fx or audio effects).
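A minimal sketch of that split, with invented names (`AudioEvent`, `AudioManager`, the distance cutoff) that are assumptions, not a real API: the logic portion only raises events, and the audio portion decides each frame what is actually audible.

```cpp
#include <iostream>
#include <string>
#include <vector>

// Hypothetical event raised by the logic code when something audible happens.
struct AudioEvent {
    std::string name;   // e.g. "enemy_footstep"
    float distance;     // distance from the listener, in world units
};

class AudioManager {
public:
    // Logic portion: just report the event, no audio decisions here.
    void Trigger(const AudioEvent& e) { pending.push_back(e); }

    // Audio portion, called once per frame: cull far-away sounds.
    // A nearby enemy behind you still plays even though it isn't rendered.
    int Update(float audibleRange) {
        int played = 0;
        for (const AudioEvent& e : pending) {
            if (e.distance <= audibleRange) {
                // Real code would hand off to the sound API here.
                std::cout << "playing " << e.name << "\n";
                ++played;
            }
        }
        pending.clear();   // events are one-shot; the queue empties each frame
        return played;
    }

private:
    std::vector<AudioEvent> pending;
};
```

The point of the separation is that the logic code never asks "should this be heard?"; it only states that something happened, and the audio portion applies its own rules.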

--
Ashaman

Personally, I call sound effects in the logic code for whatever event triggers the effect. For example, if you have a Gun which is fired in Gun.Fire(), then somewhere in Gun.Fire() I'd call Audio.PlayWeaponFire().

Obviously it will be more complex than that: different sounds, or no sounds, would be triggered if the Gun's cooldown is not complete, if it is out of ammo, if it is jammed, etc.
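A rough sketch of that branching, assuming a made-up `Audio::Play` call and sound names (none of this is a real library API):

```cpp
#include <string>

// Stand-in for a real audio system; lastPlayed just records the request
// so the behaviour can be observed.
namespace Audio {
    inline std::string lastPlayed;
    inline void Play(const std::string& sound) { lastPlayed = sound; }
}

struct Gun {
    int ammo = 6;
    bool coolingDown = false;

    // The logic decides the outcome of the trigger pull, and each outcome
    // maps to a different sound (or to silence).
    void Fire() {
        if (coolingDown) return;          // mid-cooldown: no sound at all
        if (ammo == 0) {
            Audio::Play("dry_click");     // out of ammo: a different sound
            return;
        }
        --ammo;
        Audio::Play("weapon_fire");       // the normal firing sound
    }
};
```

Keeping the call inside `Fire()` means the sound choice stays next to the game rule that caused it, which is the upside of triggering audio from logic code.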

Audio is definitely a unique architecture challenge because calls to it will be interspersed in a variety of completely unrelated classes: guns, doors, UI buttons, background music, triggered sound effects, local ambient noises, etc.

Thanks for your replies.

I think I will separate my audio code into its own area and set flags via the logic code so the audio code knows which sounds to play. In my game most game objects are represented by simple structs, so this should be sufficient; however, moving on, I might consider using class functions for handling the sound effects.

I was always a fan of some sort of message passing from your logic to the audio, where you pass in a structure representing the information required to play the sound (along the lines of what your last post describes). The audio system then updates and processes all queued effects.
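The message-passing idea could look something like this; the struct fields and class names here are invented for illustration:

```cpp
#include <queue>
#include <string>

// Hypothetical message posted by logic code: everything the audio system
// needs in order to start the sound later.
struct PlaySoundMsg {
    std::string soundId;
    float volume = 1.0f;   // 0..1
    bool loop = false;
};

class AudioSystem {
public:
    // Logic side: cheap, non-blocking, no audio work happens here.
    void Post(const PlaySoundMsg& msg) { queue_.push(msg); }

    // Audio side, once per frame: drain the queue and start each sound.
    // Returns how many messages were handled, for observability.
    int Update() {
        int handled = 0;
        while (!queue_.empty()) {
            PlaySoundMsg msg = queue_.front();
            queue_.pop();
            // A real implementation would start a mixer voice here,
            // honouring msg.volume and msg.loop.
            ++handled;
        }
        return handled;
    }

private:
    std::queue<PlaySoundMsg> queue_;
};
```

A nice property of this design is that the logic code never touches the sound API directly, so the audio backend can be swapped or stubbed out without touching gameplay code.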

I haven't reached the audio playing part of my engine yet, but this is what I had in mind:
In my main game loop I do the following:


while (running)
{
    ProcessUserInput();
    DoLogicWithUserInput();
    Update3dGraphicsForThisFrame();
    UpdateSoundDataForThisFrame();
    Render3d();
    RenderSound();
}


I make no distinction between graphics and audio. They are both assets that can be rendered by different render systems.

