Improving Communication with Your Sound Designer Part 1

Published March 17, 2008 by Kane Minkus, posted by Myopic Rhino

There is a reason none of the guys at SomaTone Interactive Audio became professional jazz musicians. What is it? I mean, after years and years of studying our instruments, jazz theory, classical theory, Bill Evans, Miles Davis, and practicing our brains out, why not become pro jazz guys? The common consensus around our studios: the jazz guys are in the corner, always getting ignored. Why be brilliantly ignored? Let's make noise! So my question for you is: do you treat audio, or the audio designer on your game, like a jazz musician? You would not believe how many sound designers express that this is how they feel - staff sound designers and production houses alike. Perhaps that is not your MO, and you actually love audio and think about it early in your projects. Perhaps you wake up at night thinking about the coolest melody line, musician or new virtual instrument that would fit the game perfectly! Or maybe that is just what we sound guys do. Either way, the following article suggests lots of ways to understand the world of your sound designer more deeply and to communicate effectively, in his or her language, so you get the results you are after quickly and give your players the impression that audio got the attention it deserves.

One of my favorite teachers at Berklee College of Music used to always say, "The sum is made up of the parts! Pay attention to the parts and pieces from the beginning!" So let's explore the parts of audio that come together to make the whole experience. Some of you might be more experienced with working on music/audio, but if you are not familiar with the digital studio of the new millennium - sequencers, virtual instruments and post production/composition lingo - you could still be in the dark when it comes to really communicating in ways that help the sound designer translate your vision into sound.

To begin to clarify: the term "sound designer", especially in the game industry, has somehow become the common term for the audio professional making the sound. In the audio world, however, a sound designer is someone who literally designs sounds. Most of the time this means the sfx person (occasionally it refers to someone designing sounds/textures for a composer - like Trent Reznor's sound designer, who creates sounds/noises that he then composes with). In the advertising industry, the term often covers both the sound effects person and the composer, mainly because in many commercials it is hard to distinguish general soundscapes from music. In the game industry, however, there is a clear distinction between composers and sound designers. If you applied for a job at LucasArts as a sound designer, no one would expect you to come in with a guitar and start writing music.

So, when talking with your "sound designer" - if that is their job title in your company - that person is likely thinking about sound effects and music separately, in both creation and integration, with distinct language, tools and approaches for each.

This article will give you lots of tools to have effective conversations with your composer about technical changes. Stay tuned for the next article where I explore effective ways to discuss creative content design, more tools and a deep breakdown of the process of music and SFX design.

Tools of the Trade

Audio technology is so good these days that you only need two computers (at most - some guys can get away with one), a bunch of software and maybe a great microphone. From this setup one can compose for a full orchestra or make a movie sound like a movie (including the score and sfx). Usually you will see a Mac and a PC (or just one or the other), a small keyboard controller, a nice desk, a set of good pro speakers (one of the most important parts of the studio - don't forget the subwoofer!), some sort of acoustical treatment to make the room "mix" ready (maybe a little tracking room to record live instruments/VO), and a refrigerator to keep the audio specialist up all night for your projects, which have crazy deadlines. Yes, there are some guys who love to have more - a real piano, a nice tracking room, a cushy client couch, etc. - but those are really unnecessary frills for getting the job done in games. These studios/workstations are commonly referred to as DAWs - Digital Audio Workstations.

One main tool your sound designer and composer will use is a sequencer, or audio design software. The industry standards are Pro Tools and Logic. Logic only runs on a Mac, and after years of using Pro Tools, the general consensus is that you want to be running it on a Mac as well. This means your audio specialist is likely to be a Mac user! Be sensitive to that, as we Mac users realize that everyone else in the business world uses a PC (especially since we create audio for games that are mostly PC based). Obviously your audio specialist must know a PC well too. Some audio professionals choose to work solely on a PC (especially when they specialize in game audio) because games are mainly created on and for PC platforms. If they are mainly PC users, they may use a program called Cubase (this seems to be the PC software of choice, although some use Sonar or other programs). You don't need to know the details of the programs, but this way you are familiar with their tools. There are some additional specialty programs: we use Peak (on the Mac) to master all our audio at SomaTone (some use Sound Forge on the PC - we will discuss mastering later), Ableton Live for loops and virtual instrument hosting (or ACID on the PC - we will discuss virtual instruments later), and Reason is popular with some audio professionals. Amadeus is a great editing tool and can convert to OGG files efficiently. All these extra tools are mentioned just so you understand that the audio professional needs several tools to get the job done.

Steps to Creating Audio

Understanding the steps to production will help you isolate where a problem is coming from to help correct it:

For Sound Design:

  1. Pre Production
  2. Recording Sounds
  3. Editing & Sound Sculpting
  4. Mixing
  5. Delivery

For Composers:

  1. Pre Production
  2. Writing
  3. Recording
  4. Editing/Mixing
  5. Mastering
Although some of these steps have the same name, many are very different for the composer and the sound designer.

Sound Designer

Pre Production

This is where the creative conversations take place (we will talk about effective creative conversations in the next article) and reference material is found (this is a great time to point to another film, game or project with sound design that you like or dislike); integration and audio engine considerations are worked out here as well.

Recording Sounds

The sound designer might need to (or choose to) record original sounds for a palette they will use later. These are sometimes called Foley sessions (sound effects sessions built around physical materials - footsteps, bags being hit together, rustling of clothing, smashing of things; anything you can think of can be recorded here). Another option is to purchase a massive library of pre-recorded sounds. Be aware that a purchased library does not make the sound design finished. A good sound designer will layer sounds, sculpt sounds and use all sorts of audio/editing tricks to get an appropriate sound for a game. At SomaTone, although we have an enormous library of digitized pre-recorded sounds, we still find ourselves constantly recording new material to layer over existing material.

Editing & Sound Sculpting

This is where the sound design starts to come together. The sound designer will begin sculpting the sound through the editing features of the sequencer. They might use effects (like reverb, delay, flangers, EQ), use fades/crossfades, layer many sounds on top of each other, take a pre-recorded sound and run it through a synthesizer to process it, etc.

For example, the growl of an alien character in a game might be (and there are an infinite number of ways to make a sound, especially a non-organic one) a combination of a dog growl, a metal door slamming, stones rubbing together and a harsh synth tone. This is where the creativity of the sound designer really starts to make a difference in the quality of the sounds. And there are many, many tricks one can learn to make a sound work for a game (like chaining effects together, time stretching/pitch shifting sounds, or processing in advanced synthesis engines like Csound, Kyma, MetaSynth or Reaktor). It is pretty much an "anything goes" approach, but a good sound designer has a sense of what should be layered in (both sounds and effects) to make a sound effective, clear and impactful. Good sound designers also know ahead of time what they are looking to hear and roughly how they are going to get there - this eliminates endless additions and tweaking.
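If you are curious what layering and pitch shifting look like under the hood, here is a minimal Python sketch of the crudest versions of both tricks: pitch shifted by resampling, and layering as a simple sample-by-sample sum. The sound names at the bottom are hypothetical placeholders, and real tools do all of this with far more sophistication.

```python
def resample(samples, ratio):
    """Crude pitch shift by resampling: ratio > 1 raises the pitch
    (and shortens the sound), ratio < 1 lowers it. Dedicated tools
    use more sophisticated algorithms that preserve duration."""
    out, pos = [], 0.0
    while pos < len(samples) - 1:
        i = int(pos)
        frac = pos - i
        # Linear interpolation between neighboring samples
        out.append(samples[i] * (1 - frac) + samples[i + 1] * frac)
        pos += ratio
    return out

def layer(*sounds):
    """Layer sounds by summing them sample by sample."""
    length = max(len(s) for s in sounds)
    return [sum(s[i] for s in sounds if i < len(s)) for i in range(length)]

# Hypothetical alien growl: a dog growl dropped an octave, layered
# with a metal-door slam and a harsh synth tone (placeholder buffers):
# growl = layer(resample(dog_growl, 0.5), metal_door, synth_tone)
```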

Mixing

The sound designer will then mix the individual elements of each sound into its final form and, separately, balance each sound effect against the others in the game, so there is an even balance from one sound to the next. These are simply volume (or "gain") adjustments.
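Since "gain" comes up constantly in these conversations, here is a quick sketch of what a volume adjustment actually is: decibels are a logarithmic scale, so a dB change converts to a linear multiplier applied to every sample.

```python
def db_to_linear(db):
    """Convert a decibel change to a linear amplitude multiplier."""
    return 10 ** (db / 20.0)

def apply_gain(samples, db):
    """A static gain adjustment: scale every sample by the same factor."""
    g = db_to_linear(db)
    return [s * g for s in samples]

# "Bring that sound down 6 dB" roughly halves its amplitude:
print(round(db_to_linear(-6.0), 3))  # 0.501
```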

Delivery

Once the sfx are mixed, they should be delivered at these levels, and they need to be individually bounced out (keep reading for the definition of "bounced out"). This makes it easier for the programmer to integrate them and keeps the game's sound dynamic (instead of "normalizing" all the sfx - a process that makes each sound as loud as it can be). If the sound designer is creating environment loops, they might choose to keep the "panning" (sound location in the 3-D space) or deliver a full looping file with all elements integrated. The final files are often called "bounced out", meaning they are put into a final form (.wav, MP3 or OGG) that can be posted or emailed (versus still sitting in the sequencer). Another term, "2-track mix", means the final mix of the sound effect is in 2-track form (your basic stereo track with a left and right channel, like a regular music CD). The "2 track" usually refers to a continuous sound effect track, versus the individual elements being delivered. Another common term is "sound efx stem" - this comes from the film industry, where you have just the sound effects separately, versus having them mixed in with the music or dialogue. "Sound efx stem" and "2 track" are interchangeable if you are only talking about the sfx (because some 2-track mixes can have the music and dialogue "married", or mixed together).
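To see why blanket normalizing is undesirable, here is a minimal sketch of peak normalization. Applied to each effect independently, it makes every file as loud as possible and erases the proportional balance the sound designer mixed in.

```python
def normalize_peak(samples, ceiling=1.0):
    """Scale a sound so its loudest sample just touches the ceiling.
    Run on every sfx file independently, this flattens the relative
    levels the sound designer delivered -- a quiet footstep ends up
    as loud as an explosion."""
    peak = max(abs(s) for s in samples)
    if peak == 0.0:
        return list(samples)
    gain = ceiling / peak
    return [s * gain for s in samples]
```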

Composers

Pre-Production

This is where the creative conversations take place (we will talk about effective creative conversations later) and reference material is found (this is a great time to point to another film, game or project with a score that you like or dislike); integration and audio engine considerations are worked out here as well. In the film world, you would have a "spotting session" here, which means the director and composer get together and go through the whole film to pick out "cue points" - places where the music should be synced to the film edits/emotional content. This can be done for games as well; even though games are not linear, they can have some linear elements in the early discussions.

Writing

The composer will go to work writing early sketches of the music. Every composer composes slightly differently, so I will not attempt to force one process onto everyone, but at a macro scale the composer will have at least two phases: 1) rough ideas or ideas in progress, and 2) final compositions ("pre-mixed" or not yet mixed). Many composers worry about giving rough or unfinished compositions to producers, for fear that the producer will not be able to hear the final musical vision in the rough sketch.

Recording

Once rough ideas are started (or if the composer likes to start immediately writing in their sequencer), recording the parts can begin.

Let me say a bit about the writing/recording process. It will often start with melodies, chords or drum beats (if a score is going to be really textural or out of the norm, the writing process might start with a concept - i.e., banging on a piano to create a horror feel). Then the composer begins to layer instruments in the sequencer on different "tracks". To explain what a "track" is, imagine a racetrack where several cars are all driving, but each car is restricted to its own lane. This is how a track in the recording world works. You can have one track for guitar, another for bass, another for vocals and another for drums. When you play all the tracks back simultaneously, you get a fully orchestrated song. This is what makes it possible for one composer to play and record all these different instruments separately and then play them back together. Tracks in the DAW are, for the most part, limitless - you can have 80 tracks if you have 80 different instruments in the orchestra all playing at the same time, each recorded on its own track.
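In code terms, the racetrack analogy boils down to something like this sketch: each track keeps its own buffer, and playback is just the sample-by-sample sum of every lane. The sample values here are placeholders.

```python
# Each track stays in its own "lane" (buffer) until playback.
tracks = {
    "guitar": [0.25, 0.5, 0.25],   # placeholder sample data
    "bass":   [0.5, 0.25, 0.0],
    "drums":  [0.25, 0.0, 0.5],
}

def mixdown(tracks):
    """Playing all tracks back simultaneously = summing their samples."""
    length = max(len(t) for t in tracks.values())
    return [sum(t[i] for t in tracks.values() if i < len(t))
            for i in range(length)]

print(mixdown(tracks))  # [1.0, 0.75, 0.75] -- the combined performance
```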

In the caveman days, we had real instruments :) Nowadays, we have samplers that trigger recordings of real instruments. And if you have amassed a good sample library of instruments, you can have incredibly realistic-sounding instruments from every corner of the world at your fingertips. Sprinkle in some good "programming" chops (this is what it is called when a composer plays an instrument, like a flute, on a keyboard) and you can create incredibly realistic-sounding orchestras, world compositions and beautiful scores. If you are still a skeptic about the quality of sampled instruments, go check out Ivory from Synthogy or Symphonic Choirs from EastWest. These samplers will knock your socks off. If your composer is using cheap or bad samples, it will show up as the piece sounding too "MIDI", "fake", "synthesized" or cheap. Feel free to request better samples; there are incredible samples out there for every instrument on Earth. Since composers should be constantly updating their sample libraries, or spending time creating their own through recordings, this should be a major factor in whether you choose to work with a composer or not. You will know right away, by listening to the horns, pianos, strings, guitars, etc., whether your composer is using high-quality samples. The instruments should sound like they are played by live players, since most good sample libraries are recordings of live players these days.

As the composer layers each track with new instruments, harmonies, melodies, etc., they are creating their final vision (or trying to find one, depending on how they like to write). We believe a composer should have a clearly defined vision of what they are going for, and should be able to clearly explain it, before they even start writing. This means you should be able to have meaningful conversations with your composer about where they are headed at any time during the creative process (by meaningful I mean you should understand, in non-musical terms, what the result will be - we will look at this later, as it is a communication process in itself). If a composer has lost the vision, it is likely they are lost in their composition in general.

Some producers ask for initial sketches of the music to make sure it is headed in the right direction. A word of caution about this: musicians are often concerned that someone who isn't them, or isn't a composer, will not be able to envision the finished work from a sketch.

Let me give you a personal example. When I compose, one of the mix engineers might stop in my studio to hear how things are going. I often get a funny look like "what are you doing, man?", and then they leave confused. However, once I get all the parts I envision into the song (with complex orchestral arrangements, it can sound like nothing until all the parts are playing together), they come back in and totally get it! They breathe a sigh of relief as they realize they just couldn't hear the final vision in the initial sketches. You and your composer will have to work out the right time to listen to early sketches; just remember that you cannot always hear the composer's internal masterpiece in an initial sketch. But again, the composer should be able to clearly explain where they are going with it.

Editing/Mixing

Once the basic instruments and parts are all in the song, the composer will move into an editing/mixing stage. Cutting and pasting music like words in a word processor, the composer will edit parts together or use editing tricks for a specific effect (often heard in electronic music these days, e.g. stutter or vocal edits).

The mix will begin coming together here as well. The mix comprises the volume levels of all the instruments, the panning (3-D placement of an instrument in the sonic space), reverbs, delays, EQ, flangers and special effects.

The mix is one of the most important and most often overlooked components of a song. More often than not, a composer is not a mix engineer and will admit that their biggest weakness is mixing their own music. The mix is where all the parts get shaped and molded into a cohesive piece of music. Mixing involves a musical and technical understanding of compressors, reverbs, delays, frequency response, EQs, special effects (phasers, flangers, etc.), the big picture, dynamics, automation, de-essers, limiters, expanders, etc. Most problems, mistakes and issues are created, discovered and fixed in the mix. At SomaTone we have a dedicated multi-platinum mix engineer mixing everything that goes out the door, and the difference between what comes in from our composers and what goes back out to the client - once run through a proper mix process - is night and day.

Without getting too technical, let's look at a few very important parts of the mix so that you can understand what to suggest instead of "it just seems too cloudy!"

Volume

One basic part of the mix is volume - some instruments will simply need to be made quieter or louder. BUT don't forget that your composer has the ability to "ride" the volume, or create dynamic changes in the music. The strings don't need to sit statically at the same volume through a passage - it is not a choice between loud or soft. Expect and demand from your composer that the mix sound DYNAMIC. This is one of the most forgotten parts of music from young composers. Keep the music and parts moving, and create tension and release through volume.
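As a sketch of what "riding" the volume means in practice, the gain becomes an automation curve rather than a single number. The breakpoints below are hypothetical, and the curve is assumed to be sorted and to start at sample index 0.

```python
def ride_volume(samples, points):
    """Volume automation: `points` is a list of (sample_index, gain)
    breakpoints. Gain is interpolated linearly between breakpoints so
    the part swells and recedes instead of sitting at one static level."""
    out = []
    for i, s in enumerate(samples):
        for (x0, g0), (x1, g1) in zip(points, points[1:]):
            if x0 <= i <= x1:
                t = (i - x0) / (x1 - x0)
                out.append(s * (g0 + (g1 - g0) * t))
                break
        else:
            out.append(s * points[-1][1])  # hold last gain to the end
    return out

# A string swell: quiet start, crescendo over one second at 44.1 kHz:
# strings_out = ride_volume(strings, [(0, 0.2), (44100, 1.0)])
```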

Pan

Pan is often overlooked too. This is where an instrument is placed in the stereo field. You will be amazed how much space opens up in music when instruments are properly panned in the sonic space. Imagine a jazz trio all standing in the same spot trying to play. It would sound "crowded", because it is! If the music/mix sounds crowded, or like too much is happening at once, it could be improper panning (although it could also be too much playing per instrument). Panning can also help separate frequencies that are competing for space. If a mix is too muddy, try requesting that the panning be addressed a little more.
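Here is a small sketch of one common pan law (constant power), showing what "placing an instrument in the stereo field" means numerically:

```python
import math

def constant_power_pan(sample, pan):
    """Place a mono sample in the stereo field. `pan` runs from -1.0
    (hard left) to +1.0 (hard right); the cosine/sine curve keeps the
    perceived loudness roughly constant as the instrument moves."""
    angle = (pan + 1.0) * math.pi / 4.0   # 0 .. pi/2
    return sample * math.cos(angle), sample * math.sin(angle)

# Dead center puts ~0.707 (-3 dB) in each channel, not 0.5 --
# a naive linear split would dip in loudness at the middle:
print(constant_power_pan(1.0, 0.0))
```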

EQ

This is a secret weapon among mix engineers. A good EQ job can make a world of difference. Make sure that different instruments are not competing for the same frequency space. Make sure that instruments containing lots of mids and highs (in the frequency spectrum), like guitar and piano, are "rolled off" - that is, have little or no low end competing with the bass. If your mix sounds muddy, or too bright/harsh, this is the area you want to recommend your composer address.
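For one concrete picture of "rolling off" low end, here is a minimal one-pole high-pass filter. Real EQs use steeper, more controllable filters, but the idea of attenuating everything below a cutoff is the same. The `guitar` buffer in the usage line is hypothetical.

```python
import math

def highpass(samples, cutoff_hz, sample_rate=44100):
    """One-pole high-pass filter: attenuates content below cutoff_hz,
    e.g. rolling the low end off a guitar or piano track so it stops
    competing with the bass. Assumes a non-empty sample list."""
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)
    dt = 1.0 / sample_rate
    alpha = rc / (rc + dt)
    out = [samples[0]]
    for i in range(1, len(samples)):
        out.append(alpha * (out[-1] + samples[i] - samples[i - 1]))
    return out

# guitar_cleaned = highpass(guitar, cutoff_hz=120)
```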

Effects

Reverb and delay are two common effects in music. Reverb gives you the spatial relationship of the instrument to the room it is in (reverb is essentially many short delays repeated very close together). If you want a dry violin sample to sound like it is in a cathedral, crank up the reverb (or use a little delay). Of course, too much is described as sounding "wet". If you feel the mix needs more dimensionality, perhaps individual instruments need more reverb (or delay, depending on the type of music). Or if the mix feels too loose and sloppy, it might need less reverb to tighten up the cohesion of the instruments working together.
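To illustrate the delay/reverb relationship, here is a bare-bones feedback delay sketch. Shrink the delay time and stack many of them and you are heading toward a (very crude) reverb. The `violin` buffer in the usage line is a placeholder.

```python
def feedback_delay(samples, delay_samples, feedback=0.4, mix=0.3):
    """Echo effect built on a circular delay line: each input is fed
    back into the line so the echoes repeat and decay. `mix` blends
    the dry signal with the delayed ("wet") one."""
    line = [0.0] * delay_samples
    out = []
    for i, s in enumerate(samples):
        delayed = line[i % delay_samples]
        line[i % delay_samples] = s + delayed * feedback
        out.append(s * (1.0 - mix) + delayed * mix)
    return out

# violin_wet = feedback_delay(violin, delay_samples=22050)  # ~0.5 s echo
```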

Compressors

Compressors work to stabilize a sound in the sonic space by reducing its dynamic range (sometimes you want to reduce the dynamic range to make the sound feel more controlled). If a sound is too wild, or too dynamic, a compressor will help reel it in.
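Here is a minimal sketch of the compression math, without the attack/release smoothing real compressors add: above a threshold, every `ratio` dB of input yields only 1 dB of output.

```python
import math

def compress(samples, threshold_db=-12.0, ratio=4.0):
    """Instantaneous compressor: levels above the threshold are scaled
    down by the ratio, shrinking the dynamic range. Real units smooth
    the gain change with attack and release times."""
    out = []
    for s in samples:
        if s == 0.0:
            out.append(0.0)
            continue
        level_db = 20.0 * math.log10(abs(s))
        if level_db > threshold_db:
            level_db = threshold_db + (level_db - threshold_db) / ratio
        out.append(math.copysign(10.0 ** (level_db / 20.0), s))
    return out
```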

Mastering

Does your composer hand in fully mastered recordings? If not, they should! Mastering is the process of taking the final mixed 2 track (or music composition) and running it through one more stage of audio processing. This stage will usually contain multiband compression (to restrain the dynamic edges of the mixed tracks a bit), limiting (squeezing the music to the loudest point before distortion - this also removes some of the dynamic range) and perhaps a bit of EQ to adjust levels across the whole track. This stage is what gives it that "radio quality".
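Here is a much-simplified picture of the limiting stage: gain drops instantly whenever the signal would cross the ceiling, then recovers slowly. Real mastering limiters add lookahead and other refinements.

```python
def limit(samples, ceiling=0.95, recovery=1.0005):
    """Simplified peak limiter: "squeeze the music to the loudest point
    before distortion". Gain is pulled down the moment a sample would
    exceed the ceiling, then eases back toward unity."""
    gain = 1.0
    out = []
    for s in samples:
        if abs(s) * gain > ceiling:
            gain = ceiling / abs(s)           # instant attack
        out.append(s * gain)
        gain = min(1.0, gain * recovery)      # slow release
    return out
```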

When you sit in your car and tweak the knobs on the EQ, you are mastering (or affecting the master recording of) the song in your car (of course, the song on the radio has already been mastered professionally!). You will notice that each time you mix and then master (or use a compression stage), you are limiting the dynamics of the piece. To get a professional sound while keeping the dynamics of the piece (so it doesn't just sound dynamically flat), the mix/mastering engineer must really know what they are doing. If your composer is not handing in mastered tracks, you will know it, because the music will sound weak when played against a reference piece in the same genre. It will also be quieter than other pieces of music played at the same volume on your stereo or computer.

Next Time

This gives you lots of tools to have effective conversations with your composer about technical changes. Stay tuned for the next article where I explore effective ways to discuss creative content design, more tools and a deep breakdown of the process of music and SFX design. Until then!


Co-founder of SomaTone Interactive Audio, Kane has years of experience in music composition and design, with platinum-record titles and dozens of game and movie credits. He explains the inner workings of this craft to non-musicians, to better enable producers to communicate with their audio staff.
