Process of audio dev for games in the '80s-'90s?

5 comments, last by bschmidt1962 7 years, 11 months ago

I've been reading a little about composing music and sound for pre-CD consoles (NES, Sega Genesis, SNES) out of curiosity, but there's a bit of it I'm having a hard time wrapping my head around.

So from what I understand, on those consoles, they had to "code" the music/sound while programming the game, typing in certain codes to make certain sounds.

But I heard even in those days, actual instruments were involved in the development, mainly synth keyboards.

So the question there was: did they just have composers make music with those instruments, then record it (to tape or on computers?) and hand it to whoever was writing the game's code so they could translate it step by step, matching the notes in the recorded music to the closest sounds the console's code could produce?

Or did they somehow connect their keyboards to a workstation computer, mapping the keyboard keys to the sounds certain console codes produced?

It must have been hard if they had to translate the music's notes step by step; were they able to write their own software to help convert the music?

I'm also wondering if the case was any different on the Genesis, or other consoles with FM chips. Was it possible to use one of those FM synths, like Yamaha keyboards from the '80s, to compose straight to the Genesis's sound chip since they were both FM-based, or would that composed FM music still need to be translated regardless?

As you can tell, I'm probably overthinking this, so hopefully this won't be too hard for anyone to explain. I'm all ears here; if you happen to have stories of composing music and sound for video games in the time frame I described, I'd be really interested in hearing them, and it would probably help me get the idea.

Thanks.

Hand-coding the music was necessary for the 8-bit consoles (2600, Vectrex...), but the more advanced the hardware got, the more automated the process could be. I remember my music guy making MIDI files that could go straight to the programmer for use in the game.

-- Tom Sloper -- sloperama.com

Back in the 1980s, most of the composers were actually programmers who also knew music, and they would code everything in via a tracker-like system. MIDI and bank files with samples are still used today for 3DS games, and even on the Wii in some cases.

Nathan Madsen
Nate (AT) MadsenStudios (DOT) Com
Composer-Sound Designer
Madsen Studios
Austin, TX

> So from what I understand, on those consoles, they had to "code" the music/sound while programming the game, typing in certain codes to make certain sounds.
True, but you still have to do that today: you type in code, it gets compiled and built into the software, the software sends data to the audio hardware, and the headphones/speakers make noise.
In the oldest systems, those dating from the 1970s, there was typically so little processing power that the notes were given to the programmer (or the programmer came up with the notes themselves), who then had to figure out exactly where there were spare CPU cycles to insert the tone-generation commands.
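To make that concrete, here is a minimal sketch in C (not any particular console's code) of the kind of tone generation involved: a square wave made by toggling an output level at a fixed interval. On real 1970s hardware the inner loop would poke an I/O port, and the timing would be counted in CPU cycles stolen from the game loop; here the output is just a sample buffer.

#include <stdio.h>

#define SAMPLE_RATE 8000

/* Fill 'out' with 'n' samples of a square wave at 'freq' Hz
   (assumes freq is well below SAMPLE_RATE / 2). */
static void square_wave(unsigned char *out, int n, int freq)
{
    int half_period = SAMPLE_RATE / (2 * freq); /* samples per half cycle */
    int level = 0;
    for (int i = 0; i < n; i++) {
        if (i % half_period == 0)
            level = !level;                     /* the "toggle the speaker" step */
        out[i] = level ? 255 : 0;
    }
}

int main(void)
{
    static unsigned char buf[SAMPLE_RATE];      /* one second of audio */
    square_wave(buf, SAMPLE_RATE, 440);         /* A above middle C */
    fwrite(buf, 1, sizeof buf, stdout);         /* raw 8-bit mono PCM */
    return 0;
}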
But on anything after the 1970s, including the three systems listed (NES, Sega Genesis, SNES), there were standardized audio formats they could play, and the consoles had relatively sophisticated audio chips for their era.
> I heard even in those days, actual instruments were involved in the development, mainly synth keyboards.
Yes. MIDI has been around since 1983, and before that there were vendor-specific keyboard interfaces.
> did they just have composers make music with those instruments, then record it (to tape or on computers?) and hand it to whoever was writing the game's code so they could translate it step by step, matching the notes in the recorded music to the closest sounds the console's code could produce? Or did they somehow connect their keyboards to a workstation computer, mapping the keyboard keys to the sounds certain console codes produced?
Even back in the 1970s there was music editing software.
It was quite common for musicians to have music keyboards plugged into their computer; it was also common for people to enter music into the software directly with number codes. Even back on my 286 there was a port where I attached a music keyboard, and games that used MIDI would play the music through the keyboard when it was attached. Most people didn't have synth keyboards, but if you had one you could certainly use it.
MIDI files are typical of the '70s and '80s audio style. The file can define what instruments exist, set the tempo, and play notes at timestamps along the timeline. Beyond just the key that was hit, there is information for how hard the key was pressed, the duration of the note, and certain effects to use. In earlier formats the instrument definitions were left off, leaving you with whatever instrument definitions were part of your hardware's tone generation.
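For illustration, here is a rough C sketch of the information one MIDI note event carries. A real Standard MIDI File packs this into variable-length delta-times followed by status/data bytes (0x90 plus channel for note-on); the struct and names below are just illustrative, and the note's duration is implied by a later note-off event.

/* The data behind one "note on" in a MIDI stream (illustrative layout). */
typedef struct {
    unsigned long tick;     /* when the note starts, in timer ticks */
    unsigned char channel;  /* 0-15; each channel is assigned an instrument */
    unsigned char key;      /* 0-127; 60 = middle C */
    unsigned char velocity; /* 0-127; how hard the key was pressed */
} NoteOn;

/* Decode a raw note-on message: status byte 0x9n, then key, then velocity. */
static int decode_note_on(const unsigned char *msg, NoteOn *ev)
{
    if ((msg[0] & 0xF0) != 0x90)
        return 0;                   /* not a note-on message */
    ev->channel  = msg[0] & 0x0F;
    ev->key      = msg[1];
    ev->velocity = msg[2];
    return 1;
}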
Usually the data was good, but the tone generators on early sound chips were terrible. Back then many chips had tone generators that produced flat tones or badly-sampled approximations of real instruments. You could buy chips with beautiful instrument banks, but they were more expensive than most people wanted to pay. Even back then there was a night-and-day difference between what my computer generated on its own and what was generated by my music keyboard, even though both processed the same MIDI stream. Today you can replay the old well-written MIDI music and, instead of the beeps and boops familiar from 1980s sound banks, modern chips have beautifully sampled instruments recorded from actual masterwork instruments, so the built-in MIDI player can sound like a real symphonic band or orchestra.
Another style of music was similar, and popular from the late 1980s to the mid 1990s. Rather than relying on hardware banks for sound generation, these formats provided their own samples. MOD, STM, S3M, and similar formats were quite common. They took more processing power because the player mixed all the parts in software, but they tended to sound nicer than the MIDI hardware most people owned.
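As a sketch of what that software mixing looked like (the names are illustrative, not from any real player): each channel steps through its own 8-bit sample at a pitch-dependent rate, and the channels are summed for every output frame.

/* One tracker channel: its own sample data plus a fixed-point play cursor. */
typedef struct {
    const signed char *sample;  /* the channel's own PCM data */
    unsigned len;               /* sample length in frames */
    unsigned pos;               /* 16.16 fixed-point play position */
    unsigned step;              /* 16.16 fixed-point pitch increment */
    int volume;                 /* 0..64, as in MOD */
} Channel;

/* Mix 'nch' channels into 'out', one frame at a time. */
static void mix(Channel *ch, int nch, short *out, int nframes)
{
    for (int i = 0; i < nframes; i++) {
        int acc = 0;
        for (int c = 0; c < nch; c++) {
            if ((ch[c].pos >> 16) >= ch[c].len)
                continue;                           /* channel finished */
            acc += ch[c].sample[ch[c].pos >> 16] * ch[c].volume;
            ch[c].pos += ch[c].step;
        }
        out[i] = (short)(acc / 2);                  /* crude headroom scaling */
    }
}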
> It must have been hard if they had to translate the music's notes step by step; were they able to write their own software to help convert the music?
Yes, software existed.
> I'm also wondering if the case was any different on the Genesis, or other consoles with FM chips. Was it possible to use one of those FM synths, like Yamaha keyboards from the '80s, to compose straight to the Genesis's sound chip since they were both FM-based, or would that composed FM music still need to be translated regardless?
The composer may have used the synth keyboard for data entry, but they would (and still do) edit the results before passing it along.
I did music and SFX for quite a lot of Genesis and SNES games back in the day... I'll see if I can answer your questions--sorry for the very lengthy reply!
-> So from what I understand, on those consoles, they had to "code" the music/sound while programming the game, typing in certain codes to make certain sounds.
A game system would include a custom-written "sound operating system": computer code, incorporated into the game, that would read lists of "musical" commands and then send them, at the right time, to the synthesizer chips to make them make sound. These musical commands were generally stored as ASCII text files that would look like this:
track1
patch Bass91
volume 22
note C2,30
note c3,30
rest 60
glis c3,g4,60
endtrack
track2
patch BrassSynth22
volume 3
note g3,15
slur e3,15
...
etc.
So the "coding" (actually writing computer programs) was done once--to write the sound system. After that, anyone who could create these "notelist" files could write the music, given some semi-technical instruction.
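To give a feel for how small that interpreter layer could be, here is a hypothetical C sketch of a reader for notelist commands like the ones above. The command names follow my example; play_note() stands in for whatever routine actually drives the synth chip, and sharps, glissandi, and slurs are skipped to keep it short.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

extern void play_note(int midi_key, int ticks);     /* talks to the chip */

/* "C2" or "c2" -> MIDI key number; natural notes only in this sketch. */
static int key_of(const char *name)
{
    static const int base[] = { 9, 11, 0, 2, 4, 5, 7 };    /* a..g */
    return 12 * (name[1] - '0' + 1) + base[(name[0] | 32) - 'a'];
}

void run_notelist(FILE *f)
{
    char cmd[16], arg[32];
    while (fscanf(f, "%15s", cmd) == 1) {
        if (strncmp(cmd, "track", 5) == 0)
            continue;                       /* track header, no argument */
        if (strcmp(cmd, "endtrack") == 0)
            continue;                       /* end of this track's list */
        if (fscanf(f, "%31s", arg) != 1)
            break;                          /* every other command takes one */
        if (strcmp(cmd, "note") == 0) {
            char *comma = strchr(arg, ',');
            if (comma)
                play_note(key_of(arg), atoi(comma + 1));
        }
        /* patch, volume, rest, glis, slur... would be handled similarly */
    }
}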
-> But I heard even in those days, actual instruments were involved in the development, mainly synth keyboards.
Sometimes. It depended on the composer. Some games didn't use notelists like I did, but used MIDI data. (I personally found MIDI data to not be very space-efficient, so I didn't use it; it also didn't match the way I happen to compose.) Those composers would compose in MIDI, save the MIDI file, and it would be incorporated into the game and used to drive the synth chip.
-> So the question there was: did they just have composers make music with those instruments, then record it (to tape or on computers?) and hand it to whoever was writing the game's code so they could translate it step by step, matching the notes in the recorded music to the closest sounds the console's code could produce?
Not really. As I mentioned above, once you have a programmer write a system, the composer/SFX person doesn't necessarily need to be able to write computer code to write the music for the game.
That said, there was one game where I was hired to do just that--I was given someone else's music and asked to implement it in a game system. That was actually for a PlayStation game, but the developers didn't realize until they had all their music composed that they couldn't fit the music AND the 3,000 lines of dialog on the CD, so they had to use the PS synthesizer chip (which was very similar to the SNES one, actually).
-> Or did they somehow connect their keyboards to a workstation computer, mapping the keyboard keys to the sounds certain console codes produced?
Not quite sure I understand that question. But, for example, for the SNES games I did, we designed a special MIDI interface to the SNES (it went into the cartridge slot). That would let me play a note on the keyboard and have it play from the synth chip in the SNES itself.
-> It must have been hard if they had to translate the music's notes step by step; were they able to write their own software to help convert the music?
It actually wasn't that bad. I used to compose my music long-hand, on music manuscript paper (with a pencil and eraser). Then, when I was happy with it, I'd start up a text editor (my favorite was called Brief) and transcribe the music from my music paper into the notelist format I gave an example of above. I got to be pretty fast at it.
-> I'm also wondering if the case was any different on the Genesis, or other consoles with FM chips. Was it possible to use one of those FM synths, like Yamaha keyboards from the '80s, to compose straight to the Genesis's sound chip since they were both FM-based, or would that composed FM music still need to be translated regardless?
For a number of practical reasons, it didn't work that way. The synthesizer in the Genesis was different from the one in the most popular keyboard (the DX7).
For the Genesis, we would have special "Genesis sound chip" editing programs that would let us play with the parameters of the sound chip. When we found a sound we liked, we'd save all the parameters that defined that sound and give it a name (for example, BrassSynth22, like in my example above).
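For a sense of what those saved parameters were, here is an illustrative C struct for a Genesis patch. The field names follow the standard YM2612 (the Genesis FM chip) parameter names: four operators, each with envelope and frequency settings, plus channel-level algorithm and feedback. The layout itself is just a sketch, not our exact tool format.

/* One of the YM2612's four operators per channel (standard parameter names). */
typedef struct {
    unsigned char detune;         /* DT:  0-7 */
    unsigned char multiple;       /* MUL: frequency multiplier, 0-15 */
    unsigned char total_level;    /* TL:  output attenuation, 0-127 */
    unsigned char attack_rate;    /* AR:  0-31 */
    unsigned char decay_rate;     /* D1R: 0-31 */
    unsigned char sustain_rate;   /* D2R: 0-31 */
    unsigned char sustain_level;  /* D1L: 0-15 */
    unsigned char release_rate;   /* RR:  0-15 */
} FmOperator;

/* A named, saved sound: everything needed to restore the chip's registers. */
typedef struct {
    char name[16];                /* e.g. "BrassSynth22" */
    unsigned char algorithm;      /* 0-7: how the four operators connect */
    unsigned char feedback;       /* 0-7: operator 1 modulating itself */
    FmOperator op[4];
} FmPatch;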
For the SNES, since it used a "sample playback" engine, we could create our own sounds by recording actual instrument sounds and then 'looping' them (making them play for an arbitrarily long time, even though the sound itself was only a few tens of milliseconds long). Then we'd translate those sounds into the particular SNES format, again using a custom-written software tool.
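The looping trick itself is simple to sketch in C: once the play position runs past the end of the recorded sample, it jumps back to a loop point, so a few tens of milliseconds of audio can sustain as long as the note is held. (The real SNES stores samples as BRR-compressed blocks with a loop flag; this is the uncompressed idea.)

/* A playing voice: sample data plus a loop point to sustain the note. */
typedef struct {
    const short *data;
    unsigned length;      /* total frames in the recording */
    unsigned loop_start;  /* frame to jump back to */
    unsigned pos;         /* current play position */
} Voice;

static short next_sample(Voice *v)
{
    if (v->pos >= v->length)
        v->pos = v->loop_start;   /* wrap into the sustained loop region */
    return v->data[v->pos++];
}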
-> As you can tell, I'm probably overthinking this, so hopefully this won't be too hard for anyone to explain. I'm all ears here; if you happen to have stories of composing music and sound for video games in the time frame I described, I'd be really interested in hearing them, and it would probably help me get the idea.
You're not overthinking at all.
The composition and sound design workflow really was quite an issue back in those days. A lot of software development work went into trying to make it easy for a non-programmer to write music for those games. We would spend a lot of time refining the sound system software to add features and improve workflow. Back then, a lot of us who were writing the music and doing the SFX were also the people who programmed these sound systems, so we had a pretty good idea of where the pipeline bottlenecks were.
A few years ago I wrote a Gamasutra article on some of these techniques that you might find interesting: http://www.gamasutra.com/blogs/BrianSchmidt/20111117/90625/Designing_the_Boot_Sound_for_the_Original_Xbox.php

Brian Schmidt

Executive Director, GameSoundCon:

GameSoundCon 2016: September 27-28, Los Angeles, CA

Founder, Brian Schmidt Studios, LLC

Music Composition & Sound Design

Audio Technology Consultant

Thank you guys for the helpful answers!

I'll see if I can answer your questions--sorry for the very lengthy reply!

No worries in the slightest; I found all of it very fun to read as well as very helpful! I really appreciate you taking the time to describe as much as you have.

So the "coding" (actually writing computer programs) was done once--to write the sound system. After that, anyone who could create these "notelist" files could write the music, given some semi-technical instruction.
-> So the question there was: did they just have composers make music with those instruments, then record it (to tape or on computers?) and hand it to whoever was writing the game's code so they could translate it step by step, matching the notes in the recorded music to the closest sounds the console's code could produce?
Not really. As I mentioned above, once you have a programmer write a system, the composer/SFX person doesn't necessarily need to be able to write computer code to write the music for the game.

In these cases, what would a "system" mean? Unless you were just referring to the console's "sound operating system" from the beginning, as if you were saying "writing to" the sound operating system. In those sentences I'm reading it as something like a program written for the composer to help them compose music within the limits of the console, or something along those lines. I apologize for my confusion.

-> Or did they somehow connect their keyboards to a workstation computer, mapping the keyboard keys to the sounds certain console codes produced?
Not quite sure I understand that question. But, for example, for the SNES games I did, we designed a special MIDI interface to the SNES (it went into the cartridge slot). That would let me play a note on the keyboard and have it play from the synth chip in the SNES itself.

That answer does work for what I was asking, and it also sounded very interesting. Basically, I was asking if there was a way to simulate/emulate the sounds of the console's sound chips on the computer being used for development. With that "special MIDI interface," did you hear the sounds resulting from the key presses straight out of the SNES, plugged into a TV/speakers, or was it also plugged into a computer, with the sound going into it so you could hear it from there?
In these cases, what would a "system" mean? Unless you were just referring to the console's "sound operating system" from the beginning, as if you were saying "writing to" the sound operating system. In those sentences I'm reading it as something like a program written for the composer to help them compose music within the limits of the console, or something along those lines. I apologize for my confusion.

Yes, I was just referring to the "sound operating system" (although the SOS was something that game developers generally had to write themselves--the consoles didn't come with one, for the most part).
It is pretty confusing: the SOS runs on the console itself, and it interprets data files to control the synthesizer.

Basically, I was asking if there was a way to simulate/emulate the sounds of the console's sound chips on the computer being used for development. With that "special MIDI interface," did you hear the sounds resulting from the key presses straight out of the SNES, plugged into a TV/speakers, or was it also plugged into a computer, with the sound going into it so you could hear it from there?

No, there were no 'emulators' that would run on a PC to let you hear what the sound chip would sound like. This was 1990, when a state-of-the-art PC ran at 30 MHz--computers these days are literally a thousand times more powerful when you consider clock speed, multiple cores, and vector instructions. :)
"VST" wouldn't be invented for another 6 years.


Yes, you could press a note on the MIDI keyboard and hear the result coming right out of the SNES.

When we finally got that working, it was a great boon to productivity (before that we had to do a lot of file copying--using floppies--and run some other magic to make the SNES make sound).

Brian Schmidt

Executive Director, GameSoundCon:

GameSoundCon 2016: September 27-28, Los Angeles, CA

Founder, Brian Schmidt Studios, LLC

Music Composition & Sound Design

Audio Technology Consultant

