Presenting Your Music To Film Makers

With over 25 movies and a dozen albums under his belt, Scott Glasgow is a highly acclaimed composer in the industry. Not unlike his music, his studio setup and workflow are a notch above the rest, especially when it comes to presenting his work to film makers. So we asked Scott to give us the rundown of how he gets his music approved with flying colors, and he was awesome enough to oblige. Enjoy!

By Scott Glasgow


One of the most important aspects to understand when writing music for film, TV or any media is that the process is a collaboration between you, the “music department”, and the film makers. You must be aware that you’re pitching your music to people who may or may not have any musical abilities, or even a desire for them. Unlike writing songs in a band, working on an album, or writing a symphony, where decisions are made by one musician (or a group of them), here you must work with people who have little-to-no musical knowledge. One of the main challenges film composers (like myself) face in collaboration is getting music approved through a series of meetings before it’s either recorded live or delivered to a dub stage for the final mix. This may happen with the director of a film, but sometimes you’re collaborating with producers too. I like to call this process my “presentation”; I have heard many call it the “dog & pony show”, but that seems too derogatory. On films with a larger budget, there is usually a music editor in the room monitoring playback with the director and producers, so the composer is a little more hands-off. When there is no music editor, the collaboration is a great process that, for the most part, is a fun challenge. And part of that challenge is how to MONITOR your audio signal in the most efficient way possible to “sell” your track as perfect for that scene. Here is how I do it.


Let’s start with my setup. One of the most frequent questions I get from new composers is how I set up my studio. I keep it basic:

  • One computer: Mac Pro 12-core with 96GB RAM
  • Audio interface: Universal Audio Apollo FireWire with Thunderbolt add-on card
  • Apogee Big Ben for word clock sync when working with other devices or rented gear like a mic pre
  • Two iPad 2s
    • Left iPad: TouchOSC (MIDI / key commands)
    • Right iPad: Lemur (audio monitoring)
  • 88-key keyboard
  • Near-field monitor speakers
  • 4 video monitors

Output Blog | Scott Glasgow | Using an iPad To Optimize Your Workflow

The iPads are running TouchOSC and Lemur, both with a custom-made set of faders and pads I created. The only reason I have both is that I started on TouchOSC and then switched to Lemur when I got the second iPad. I prefer Lemur as a touch interface, so at some point I will change the first iPad to Lemur as well. I run OSCulator for two things: to make sure my key commands work in my sequencer, and to keep a monitor on the Lemur and TouchOSC signals coming into my sequencer.
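To make the routing idea concrete, here is a minimal sketch, in Python, of what a bridge like OSCulator does under the hood: it takes an incoming OSC fader value (0.0 to 1.0 from TouchOSC or Lemur) and scales it to a MIDI CC value (0 to 127) for the sequencer. The OSC addresses and the CC assignments here are hypothetical examples, not Scott’s actual mappings.

```python
# Sketch: mapping an OSC fader message from a touch controller to a
# MIDI CC value before forwarding it to a sequencer. In practice a
# tool like OSCulator handles this routing; this just shows the idea.
# The addresses "/fader/volume" and "/fader/expression" are made up.

def osc_to_cc(value: float) -> int:
    """Scale a 0.0-1.0 OSC fader value to a 0-127 MIDI CC value."""
    clamped = max(0.0, min(1.0, value))
    return round(clamped * 127)

def handle_fader(address: str, value: float) -> tuple:
    """Turn an incoming OSC message into a (cc_number, cc_value) pair."""
    cc_map = {"/fader/volume": 7, "/fader/expression": 11}  # hypothetical map
    return (cc_map[address], osc_to_cc(value))

print(handle_fader("/fader/volume", 0.5))  # a fader at mid-position
```

A real setup would listen for these messages on a UDP port (for example with the python-osc library) and emit actual MIDI, but the scaling step is the core of it.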

The most important thing I think many composers forget about when setting up their studio is ergonomics. You will be spending many hours working at that desk, so give yourself some elbow room and find yourself a great chair. I use an Aeron, but there are better options out there. I am always amazed at how many composers’ workstations I see that are cluttered with gear, especially with an 88-key MIDI keyboard right on top in front. It is claustrophobic and quickly leads to fatigue, since you have nowhere to rest your arms as you type and work. Think about how your body works. Having a MIDI keyboard in front is a terrible way to work for me, so I have a sliding stand. Another option is to use an attached sliding shelf to push your keyboard under the desk. There are many times I like to work out ideas with pencil and paper, and I can simply push the keyboard under the desk to give me room to write.

Close up shot of the “midi control / key commands” iPad running TouchOSC.


To prepare for the presentation, I create a simple “presentation session” file in my sequencer. I use Digital Performer, but any sequencer will do. This “presentation” file consists of the following:


  • A bank of bus insert tracks (for monitoring)
  • A set of tracks for dialog
  • Sound FX
  • Temp and other tracks (I’ll explain those later)
  • A pair of “cue” tracks in an A/B format (for instance, I have a track labeled “Cues A” and “Cues B”)
  • ONE MIDI track set to a piano sound in Kontakt to work out a few ideas on the spot


My “presentation” session with the tracks and their folders.

I use this “presentation session” for all my spotting in a film with my director, and I use markers to designate where music will be in each reel of the film. The advantage of setting up a session this way is that I’m presenting a stereo mix of my music and removing all the MIDI, audio, plugins, or any “production” type tracks I use in my big sessions. Playing back a stereo track of the music also helps keep the CPU usage low while also playing video, audio, and bus tracks. Unlike this “presentation” file, each piece of music in a film (called a cue) gets its own session, which can have anywhere from 200-500 tracks. These sessions have tons of plugins and virtual synths hosted in Vienna Ensemble Pro, streaming all the audio and MIDI internally on one system. The most important place I work with a director or film maker is this “presentation session”.


The tracks window, where you can see why two “cue” tracks are needed for cues that overlap (one cue ends as another begins).

This is where all discussions happen in regard to musical choices on a project. Any fixes to the music are noted and made later. Once the changes are made, the director returns to hear them in a new session. Why use stereo mixes and not just open all my big cue sessions one at a time? There are two reasons. First, there may be issues or time lag opening dozens of cues with a large number of tracks while a director is sitting in your studio. Second, I don’t want to get into “co-composing” with a film maker. As soon as I hear a film maker talking about piano lines or whistling what he thinks I should be writing for the flute, I know I’m in trouble. A presentation file with stereo tracks only is the key to keeping the “collaboration” working smoothly. I don’t want them talking music specifics, but rather emotion and direction, like they would if they were working with an actor.



Monitoring control to the speakers is a very important part of the presentation. If you were to create quick mixes and bounce them to QuickTime video files for your collaborator to listen to, you might mix the music too hot against the dialog, in which case you may get notes to rewrite the cue because, for some reason, “it was too loud” or “something wasn’t right”. My solution is to create a mix on the fly that is not recorded, but mixed live as the film maker works with me and listens to playbacks.

iPad isolation and fader controls for each track or set of tracks:

  • Mains
  • Mix (Anything that is my music)
  • Dialog and SFX (which we call “DX”)
  • Temp music (almost always turned off until someone says “but in the temp there was this cool thing”)
  • Other Ideas (for any tracks you try against the picture)
  • Click (which you really do not need)
  • Buttons for record, play, stop, jump a bar back or forward, and jump a marker back or forward.
  • On the far right are some utility pads that trigger save, undo, cut, copy, and paste.
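The monitoring logic behind those faders can be sketched as a small state machine: a handful of group gains (music mix, dialog, temp, click) that can be set or muted in real time while the reel plays. This is a minimal illustration, assuming made-up group names and unity-gain toggles, not a description of how Lemur or Digital Performer actually store these values.

```python
# Sketch of the iPad monitoring setup: per-group gains with isolation
# toggles. Group names ("mix", "dx", "temp", "click") are hypothetical
# stand-ins for the fader groups described in the list above.

class MonitorMixer:
    def __init__(self):
        # Temp and click start muted, matching the setup described above.
        self.gains = {"mix": 1.0, "dx": 1.0, "temp": 0.0, "click": 0.0}

    def set_gain(self, group: str, value: float) -> None:
        """Fader move: clamp the incoming value to the 0.0-1.0 range."""
        self.gains[group] = max(0.0, min(1.0, value))

    def toggle(self, group: str) -> None:
        """Isolation pad: mute if audible, restore to unity if muted."""
        self.gains[group] = 0.0 if self.gains[group] > 0.0 else 1.0

mixer = MonitorMixer()
mixer.set_gain("mix", 0.8)  # pull the music down a touch under dialog
mixer.toggle("temp")        # "but in the temp there was this cool thing"
print(mixer.gains)
```

Because none of these moves are recorded, handing the iPad (or this state) to a collaborator is risk-free: the session itself never changes, only what comes out of the speakers.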

My iPad 2 “Audio Monitoring” setup

The beauty of this iPad “monitoring” setup is that as you play back the reel of film where you have cues, you can fade the music, dialog, or main volume up or down in real time. If your collaborator is having trouble hearing how it fits in the scene after a few playbacks, you can be so bold as to hand him (or her) the iPad with this control and let them mix the balance to their liking. Remember, you’re not recording fader moves — you can, but I do not. In the end, you are looking for approvals for your music in the scene. If it takes them showing you what they want, so be it. Give them the iPad control.


Here is a screen cap of the same set of controls from my iPhone. They both have exactly the same functions.

Another aspect of this I find really important is the personal level of interaction it creates while presenting. Just a few years ago I would sit there like some “mad scientist” presenting my music to these film makers with my back to them as I controlled the computers/sequencer for levels, etc. It was isolating and impersonal. The beauty of having 100% control over the audio right from your iPad or iPhone is that you can sit next to your film makers, or even on the same couch, as they listen to your tracks. This creates conversation and an open dialog about the work, rather than a “ta-dah, now what do you think?” type of feeling. It’s much more comfortable to be able to read people’s faces than to finish playing something and spin your chair around to get their reaction.

This is just one way to set up your studio, but over the years I have found this to be the best, most effective way to present your music to film makers and collaborators. Happy music making (and collaborating)!

– Scott Glasgow


Here is a screen capture of my full session setup during a Ron Perlman film called “Poker Night”. You will see that there are many more MIDI tracks. I start with a folder of tracks “unique” to the film (unique instruments or synths like REV), then follow that with a “sketch orch” folder of about 35 tracks, and finally a folder with 200+ MIDI tracks for my “detailed orch”.

In the bus folder are the same bus tracks; in the audio folder are 10 stems and a stereo mix track.

Studio setup last year with one iPad for monitoring using Neyrinck V-Control (a very cool iOS app)

