

Member Since 18 Sep 2010
Offline Last Active Jul 21 2016 06:26 PM

Posts I've Made

In Topic: Dynamic sound effects

21 July 2016 - 06:15 PM

All the techniques you describe are pretty bread-and-butter game audio techniques and are well-handled by tools like FMOD or Wwise.


You can do much of it in Unity, but it will be quite a bit more work, since you'd have to hand-code what is, in essence, a (very simplified) FMOD/Wwise-type engine.


If your game's development budget is small, those tools are very low-cost (or free) and would likely save you several programmer-days' worth of work.

In Topic: Should I Learn How To Use Wwise Fmod Etc?

21 July 2016 - 04:40 PM

Knowing how to implement can give you a big 'leg up' over the competition, so I'd say that it's worth getting at least a bit fluent in Wwise and FMOD.


Now, if you told me that you're a composer, and you can (truthfully) say that in a room of 1,000 top composers in LA you're in the top 5 (literally comparing yourself to people like Bear McCreary, Michael Giacchino, James Horner, etc.), then don't bother with FMOD, Wwise, etc. Just focus on creating excellent music.


Knowing FMOD/Wwise will not only give you a leg up (a lot of studios require or strongly recommend it), but it will also let you put out a better product and help make the game better.

In Topic: Process of audio dev for games in 80's-90's?

22 May 2016 - 09:13 PM

In these cases, what would a "system" mean? Unless you were just referring to the console's "Sound Operating System" at the beginning, as if you were saying "writing to" the sound operating system. In those sentences I'm understanding it as something like a program written for the composer to help them compose music within the limits of the console, or something along those lines. I apologize for my confusion.


Yes, I was just referring to the "Sound Operating System" (although the SOS was something that game developers generally had to write themselves; the consoles didn't come with one, for the most part).
It is pretty confusing: the SOS runs on the console itself, and it interprets data files to control the synthesizer.


Basically I was asking if there was a way to simulate/emulate the sounds of the console's sound chips on the computer being used for development. With that "Special MIDI Interface" did you hear the sounds resulting from the key presses straight out of the SNES, with it being plugged into a TV/Speakers or was it also plugged into a computer with the sounds going into it so you could hear them from that?



No, there were no 'emulators' that would run on a PC to let you hear what the sound chip would sound like. This was 1990, when a state-of-the-art PC ran at 30 MHz; computers these days are literally a thousand times more powerful when you consider clock speed, multiple cores, and vector instructions. :)
"VST" wouldn't be invented for another six years.

Yes, you could press keys on a MIDI keyboard and hear the results coming right out of the SNES.

When we finally got that working, it was a great boon to productivity (before that, we had to do a lot of file copying using floppies, and run some other magic to make the SNES make sound).

In Topic: Process of audio dev for games in 80's-90's?

19 May 2016 - 06:23 PM

I did music and SFX for quite a lot of Genesis and SNES games back in the day... I'll see if I can answer your questions--sorry for the very lengthy reply!
-> So from what I understand, on those consoles, they had to "code" the music/sound while programming the game, typing in certain codes to make certain sounds.
A game system would include a custom-written "sound operating system." That is computer code, incorporated into the game, that would read lists of "musical" commands and then send them, at the right time, to the synthesizer chips to make them make sound. These musical commands were generally stored as ASCII text files that would look like this:
   patch Bass91
   volume 22
   note C2,30
   note c3,30
   rest 60
   glis c3,g4,60
   patch BrassSynth22
   volume 3
   note g3,15
   slur e3,15
So the "coding" (actually writing computer programs) was done once--to write the sound system. After that, anyone who could create these "notelist" files could write the music, given some semi-technical instruction.
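For illustration, here's a minimal sketch of what the "read lists of musical commands" part of such a sound operating system might look like. The command names, tick-based durations, and the idea that a note command's last argument is its length are all assumptions drawn from the example notelist above, not the actual format used back then:

```python
# Sketch of a notelist reader: turns the text commands into timed events
# that a sound driver could feed to a synth chip. Command set and tick
# units are assumptions based on the example notelist above.

def parse_notelist(text):
    events = []   # list of (tick, command, args)
    tick = 0
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        cmd, _, rest = line.partition(" ")
        args = [a.strip() for a in rest.split(",")] if rest else []
        if cmd == "rest":
            tick += int(args[0])          # silence: just advance time
        elif cmd in ("note", "slur", "glis"):
            dur = int(args[-1])           # last arg is duration in ticks
            events.append((tick, cmd, args[:-1]))
            tick += dur
        else:                             # patch, volume: instant commands
            events.append((tick, cmd, args))
    return events

song = """patch Bass91
volume 22
note C2,30
rest 60
glis c3,g4,60"""

events = parse_notelist(song)
# e.g. the glissando lands at tick 90: after the 30-tick note and 60-tick rest
```

A real driver would then walk this event list in time order, sending each command to the chip when its tick comes up.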
-> But I heard even in those days, actual instruments were involved in the development, mainly synth keyboards.
Sometimes. It depended on the composer. Some games didn't use notelists like I did, but used MIDI data instead. (I personally found MIDI data to not be very space-efficient, so I didn't use it. It also didn't match the way I happen to compose.) Those composers would compose in MIDI, save the MIDI file, and it would be incorporated into the game, which would then use it to drive the synth chip.
-> So the question there was: did they just have composers make music with those instruments, then record them (to tape, or on computers?) and hand them to whoever was writing the game's code, so they could listen to it and translate it step by step, matching the notes played in the recorded music into the best-matching sound-making code lines the console could do?
Not really. As I mentioned above, once you have a programmer write a system, the composer/SFX person doesn't necessarily need to be able to write computer code to write the music for the game.
That said, there was one game where I was hired to do just that--I was given someone else's music and asked to implement it in a game system. That was actually for a PlayStation game, but the developers didn't realize until they had all their music composed that they couldn't fit the music AND the 3,000 lines of dialog all on the CD, so they had to use the PS synthesizer chip (which was very similar to the SNES one, actually).
-> Or did they somehow connect their keyboards to a workstation computer to map the keyboard keys to play different sounds from certain console codes on the computer?
Not quite sure I understand that question. But, for example, for the SNES games I did, we designed a special MIDI interface to the SNES (it went into the cartridge slot). That would let me play a note on the keyboard and have it play from the synth chip in the SNES itself.
-> It must have been hard if they had to translate the music's notes step by step. Were they able to write their own software to help convert the music?
It actually wasn't that bad. I used to compose my music long-hand, on music manuscript paper (with a pencil and eraser). Then, when I was happy with it, I'd start up a text editor (my favorite was called 'Brief') and transcribe the music from my music paper into the notelist format I gave an example of above. I got to be pretty fast at it.
-> I'm also wondering if the case was any different on the Genesis, or other consoles with FM chips. Was it possible to use any of those FM synths, like Yamaha keyboards from the 80's, to compose straight to the Genesis's sound chip since they were both FM-based, or would that composed FM music still need to be translated differently?
For a number of practical reasons, it didn't work that way. The synthesizer in the Genesis was different from the one in the most popular keyboard (the DX7).
For the Genesis, we would have special "Genesis sound chip" editing programs that would let us play with some of the parameters of the sound chip. When we found a sound we liked, we'd save all the parameters that defined that sound and give it a name (for example, BrassSynth22, like in my example above).
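To make the "patch as a named bag of parameters" idea concrete, here's a hedged sketch of a two-operator FM voice. The real Genesis chip (the Yamaha YM2612) has four operators per voice plus hardware envelopes, so this is only the core principle, and the parameter names and values are invented for illustration:

```python
import math

def fm_patch(carrier_hz, ratio, index):
    """A 'patch' is just the saved set of parameters that define a sound."""
    return {"carrier": carrier_hz, "ratio": ratio, "index": index}

def render(patch, n_samples, sample_rate=44100):
    """Two-operator FM: a modulator sine wobbles the carrier's phase.
    This is the basic principle; the real chip has four operators."""
    fc = patch["carrier"]
    fm = fc * patch["ratio"]          # modulator frequency tracks the carrier
    out = []
    for n in range(n_samples):
        t = n / sample_rate
        mod = patch["index"] * math.sin(2 * math.pi * fm * t)
        out.append(math.sin(2 * math.pi * fc * t + mod))
    return out

# Hypothetical brassy patch, loosely echoing the "BrassSynth22" name above
brass = fm_patch(220.0, ratio=1.0, index=2.0)
samples = render(brass, 64)
```

Saving the dict to disk under a name is exactly the "save all the parameters and give it a name" step; the sound system would later load it and program the chip's registers with those values.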
For the SNES, since it used a "sample playback" engine, we could create our own sounds by recording actual instrument sounds and then 'looping' them (making them play for an arbitrarily long time, even though the sound itself was only a few tens of milliseconds long). Then we'd translate those sounds into the particular SNES format, again using a custom-written software tool.
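The looping trick can be sketched in a few lines: play the one-shot attack portion once, then keep repeating a loop region for as long as the note is held. The loop points here are made up for illustration (real tools picked them carefully to avoid clicks):

```python
# Sketch of sample looping as described above: after the initial
# playthrough, jump back to the loop start instead of stopping, so a
# short recording can sustain indefinitely.

def play_looped(sample, loop_start, length):
    """Return `length` output samples by repeating sample[loop_start:]
    after the first full pass through the source sample."""
    out = []
    pos = 0
    for _ in range(length):
        out.append(sample[pos])
        pos += 1
        if pos >= len(sample):       # hit the end: jump back to loop start
            pos = loop_start
    return out

src = [0, 1, 2, 3, 4, 5]             # tiny stand-in for a recorded sound
held = play_looped(src, loop_start=2, length=12)
# first pass plays 0..5, then the loop region 2..5 repeats
```

A real sample format (like the SNES's compressed BRR blocks) stores the loop point alongside the sample data so the hardware can do this jump itself.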
-> As you can tell, I'm probably overthinking this, so hopefully this won't be too hard for anyone to explain. I'm all ears here. If you happen to have stories of composing music and sound for video games in the time frame I described, I'd be really interested in hearing them, and it would probably help me get the idea.
You're not overthinking at all.
The composition and sound-design workflow really was quite an issue back in those days. A lot of software development work went into trying to make it easy for a non-programmer to write music for those games. We would spend a lot of time refining the sound system software to add features and improve workflow. Back then, a lot of us who were writing the music and doing the SFX were also the people who programmed these sound systems, so we had a pretty good idea of where the pipeline bottlenecks were.
A few years ago, I wrote a Gamasutra article on some of these techniques that you might find interesting: http://www.gamasutra.com/blogs/BrianSchmidt/20111117/90625/Designing_the_Boot_Sound_for_the_Original_Xbox.php

In Topic: experienced video game composer needed for questioning regarding my dissertat...

13 April 2016 - 09:17 AM

Hi Jack,

Would be happy to chat. I've been in game music since 1987.

You can email me directly-- brian [at] either of the web addresses listed in my signature below.


What is your dissertation on?