Shortly after completing principal photography in December, and after taking a well-deserved and much-needed break from work, I started to think about the workflow for post-production. Having never edited something so large, I had no idea what to expect, or even whether we had the right tools for the job. I started scouring the internet for ideas and expectations for editing a feature-length film with Final Cut Pro X (FCPX). I didn’t see many people doing that, so in the same spirit as my lengthy post on directing my first feature in Africa, I offer you my nerdy narrative on our post-production process. Maybe it will prove useful to others who are considering FCPX as an editing platform.
Why Final Cut Pro X?
Over a year ago, I directed our first project that we edited in FCPX. Up until that point we had been undecided about whether to jump ship (from Final Cut Studio) to Adobe Premiere, but figured the free FCPX trial was worth a chance. We were blown away by how quickly we were able to take 6 hours of unlogged footage and assemble it into a finished video package. Normally the logging alone would have taken 2 weeks; FCPX, with its range-based keywording and scrubbable event library, let us do it in a day. The trackless, magnetic timeline, along with auditions and compound clips, helped us assemble a rough cut very quickly. It did take us a few weeks to really learn the ins and outs of using FCPX efficiently, but by the end we understood the unconventional but clever choices Apple made in redesigning the editing workflow from the ground up. So, while not sure how FCPX would hold up with 80 hours of footage and a 2-hour feature, we plowed ahead with it for at least our rough cut of the movie, knowing how much time it would save us in the 1st half of post-production.
Mike, our script supervisor/data wrangler, took copious notes while we were shooting: things that would help us later sync our audio and video footage, and which performances from each camera setup I preferred on set. At the end of every production day, Mike would take the SD cards in play, duplicate them onto our “card folders” external drive, and then ingest those cards into FCPX events on another external drive. We made an event for every scene, plus events for 2nd unit footage at each major location. Occasionally Mike would also assemble dailies from the footage, especially on multi-day shoots where we could still reshoot before leaving that location, or where there was a question of whether a certain sequence of shots was going to work.
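The nightly offload Mike did is essentially a copy-then-verify pass. Here is a minimal sketch of that idea in Python; the function names and the use of SHA-256 checksums are my illustration, not our actual ingest tooling:

```python
import hashlib
import shutil
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash a file in chunks so large camera files don't fill memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def offload_card(card_root: Path, backup_root: Path) -> list[Path]:
    """Copy every file from the card to the backup drive, then verify
    each copy against the original by checksum. Returns verified copies."""
    verified = []
    for src in sorted(card_root.rglob("*")):
        if not src.is_file():
            continue
        dst = backup_root / src.relative_to(card_root)
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)  # copy2 also preserves timestamps
        if sha256_of(src) != sha256_of(dst):
            raise IOError(f"checksum mismatch for {src}")
        verified.append(dst)
    return verified
```

The verify step is what makes an offload trustworthy: a copy that silently dropped bytes would fail the checksum comparison instead of surfacing weeks later in the edit.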
The logging and editing process
Two other editors and I began post-production in January. I gave us 15 weeks (3 days per scene, 70 scenes) to assemble the 1st rough cut, never having edited anything this large before and not knowing how long it would really take. We each had a 6TB RAID with identical copies of the Final Cut events that had already been ingested. I divided the movie into 44 chunks of scenes, assigned an editor to each chunk, and made a shared Evernote note to help us track progress. Each editor would take their scene and:
- Create proxy media for the event/scene
- Label/log each video and audio clip in the “scene.setup.take” format (e.g. 9.14.2).
- Sync video and audio clips
- Use a combination of the shot list and the shooting log to assemble the timeline for the scene according to how it was originally envisioned
- Create auditions for potential good takes, which I would later choose from
- J-cut, tweak, and adjust timing to make it as good as possible without wasting time on grading, mixing, or adding music or ambience
- Review with me, make changes, re-review
- Sync the events/projects to the other editors’ drives
- Go grab another scene and start over!
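One subtlety of the “scene.setup.take” labels above: as plain strings they sort badly (10.1.1 lands before 9.2.1), so anything that orders clips by label needs to compare the parts numerically. A small sketch, with names of my own invention rather than from our actual tooling:

```python
from typing import NamedTuple

class ClipLabel(NamedTuple):
    scene: int
    setup: int
    take: int

def parse_label(label: str) -> ClipLabel:
    """Parse a 'scene.setup.take' label such as '9.14.2'."""
    scene, setup, take = (int(part) for part in label.split("."))
    return ClipLabel(scene, setup, take)

# Sorting numerically keeps scene 9 ahead of scene 10,
# which a plain string sort would get wrong.
labels = ["10.1.1", "9.14.2", "9.2.1", "9.14.1"]
ordered = sorted(labels, key=parse_label)
# ordered == ["9.2.1", "9.14.1", "9.14.2", "10.1.1"]
```

Because the tuple compares field by field, the same key works for grouping by scene or by setup as well.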
We had our rough cut. We had scheduled some test screenings, so we did a quick 1-pass audio/color correction and balance, and threw some temp music tracks in there (I used the soundtrack to BBC’s “The First Grader”). The movie was around 2 and a half hours at this point. I knew we’d be taking some things out (and had a good idea of what that was going to be) but needed our test audiences to weigh in.
The test screenings were extremely informative. We held 3 different screenings for 3 different groups of people: young adults, pastors, and middle-aged adults. We talked briefly before each screening, preparing them that the music was temporary and that they would notice some technical issues we’d fix later, like mic noise we’d address in ADR, or boom stands that made it into the shot and would be removed in VFX. Then we’d roll the movie. Immediately afterward, we handed out a 2-page questionnaire to capture their immediate impressions before leading them through a time of questions: Did you understand what the pact was between Max and Tom? Which character did you connect with most? Were there any parts that confused you? The answers, both on the individual questionnaires and in group discussion, were invaluable. They confirmed some of my suspicions about scenes I wanted to cut and scenes I wanted to reshoot, and they really surprised me with some major holes I hadn’t seen.

For example, at the beginning of the 2nd act, Max comes home to his apartment and surprises his roommate Tom, who is watching a scary movie on the couch with his girlfriend. We all knew that Max & Tom roomed together, and that this was his apartment. But our test audiences didn’t draw that conclusion until a couple of minutes into the scene. We were also uncomfortable with how the scene played out, so this confirmed our need to reshoot it. The test audiences were also surprised to see Max & Yusef as friends at the end, since the last time we see them interact, Yusef is throwing his phone into the ocean, effectively severing his relationship with “the Christian.” From our perspective there was no issue, as a couple of years pass in between, but our test audiences were hung up on not seeing how those two were reconciled.
So, we added a quick trip to the coast with Claude, who plays Yusef, to shoot some pickups and a quick sequence of shots to show him reflecting on his friendship with Max and that he has Max’s phone number written down. Problem solved.
By July, after the 2 scenes we reshot, the 2nd unit pickups at the coast and around town, and cutting approximately 30 minutes (several entire scenes) from the movie, we were ready to lock the picture. By this point we had used FCPX to consolidate around 70 events into 1 event holding only the clips that made it into the timeline. After locking the picture, we finalized auditions, reconsolidated, and then started creating high-quality ProRes versions of each clip to export for color correction, grading, and VFX. We also exported audio stems to be used in sound design and mixdown.
Taylor, who was production manager during the shoot and one of the three editors, took on color correction (matching clips) and grading (applying a look). He opted to use DaVinci Resolve for this process. The roundtrip workflow from FCPX to DaVinci and back to an FCPX timeline was not an easy one. By the time we got to the final assembly we were working with about 3 TB of ProRes clips, some of which FCPX would reject as not matching the originals. Complicating this was the fact that many of the clips were compound clips created during the audio/video sync process. And we didn’t have enough hard drive space to keep more than 1 version of the movie’s event/project on a single drive; each time we wanted to back it up to another drive, it took a day to copy.
Josh, the other editor, was happy to take on the role of VFX. While we had no lightsabers or spaceships, we did have a fair amount of VFX work that needed to happen: boom stands to remove from shots, church signage to digitally remove, crowd shots at the church that needed people digitally inserted to fill in some holes, dead pixels to remove, Max’s POV after getting hit in the head, and all the titling and roll credits. Josh would work with the (usually) ungraded clips in After Effects, render out his replacement clips, and hand them off to Taylor to be color corrected and graded to match the clips around them.
That left me to work on music. In May we hired Eric Wainaina, one of the biggest names in music in East Africa, as our music consultant. Eric was given the rough cut of the movie, and we sat down several times to “spot” it together and talk through the music cues. Eric’s job was to find African music that could fit the 9 cues that needed local music. He’d present me with ideas, we’d talk about which ones I liked, and then he would contact the agents or bands and secure a license to use the music in our film. He also wrote one of the cues, Yusef’s theme, as I needed something more authentically ethnic than I could produce (I added the string undertone). Mike Saum, the screenwriter, co-wrote the movie’s theme song with Eric.
The biggest task for me, though, in addition to weighing in weekly on multitudinous decisions with Taylor on color, Isaac on sound, and Eric on music, was writing the other 24 cues that would form the underscore. I started with the picture-locked version of the movie, a newly upgraded version of Logic Pro X, and EWQL’s sample libraries: Piano, Hollywood Strings, Hollywood Brass, and Stormdrum. These libraries were new to me, so I spent a lot of time getting to know the various string articulations and drum types, and familiarizing myself with this incredible, vast library that Hans Zimmer also uses! I locked myself in the basement of our house for about 2 months. The first month it didn’t feel like I was accomplishing much, but I was exercising composition muscles that hadn’t been exercised in a very long time, and it took me a while to find something I was happy with. By the end of those 2 months I had developed a good instinct and could nearly always be happy with the first thing that popped into my head.
Some songs I wrote without a keyboard, but just forming the melody and orchestration in my head. Some songs I wrote only through hours of “noodling” on the piano. But each cue was carefully planned out ahead of time. I knew what the cue needed to communicate and where it needed to sit underneath the dialog, occasionally becoming a voice in the dialog. And each cue was written having mapped the dramatic “beat” changes in what the viewer is experiencing and discovering. I thoroughly enjoyed this process, and am fairly pleased with the results. You can listen below to my orchestral underscore:
As I was wrapping up the music writing, Eric was finalizing the contracts with our Kenyan artists, and Isaac at GNPI was beginning the sound design. I eventually joined Isaac and pulled a couple weeks of 6am-to-midnight work in the studio getting it just right. I was surprised by how much work this was, partly because the roundtrip from FCPX to Pro Tools is a mess. Even after carefully assigning roles in FCPX to each type of audio, and having a couple guys spend a couple weeks organizing all the foley and ambience, we still had an organizational nightmare. After sorting through that and getting everything into the appropriate tracks, we had to work a lot of magic in the ambient tracks to cover mic noise from production (noisy environments that made our cuts obvious). Then we had about 5 days of ADR (dialog replacement) with several of the actors. Then the process of mixing. Mixing with monitors. Mixing with headphones. Resting our ears. Every time we sat down to listen through it and make notes, it was a day-long process. Simultaneously, all the other pieces were coming together, just in time for…
Getting ready for the theatrical premiere
We knew we wanted to have our premiere in a local theatre, and found the management at Century Cinemax Junction very accommodating. Our premiere dates were booked, 3 showings that first week for private audiences, then public showings starting on the weekend.
All we needed to figure out now was how to create a “DCP”, or Digital Cinema Package: a very high-bitrate version of the movie with surround sound, required for playback on those expensive digital cinema installations.
We got quotes from vendors in the US who would have charged a couple thousand dollars to create a DCP. After testing with our trailer, we decided we could probably do this on our own using open source software. It was a big gamble: we either needed to mail a hard drive to the US and have someone else do a quick turnaround so Ted Rurup (producer) could carry it back to Kenya the following week, or do it ourselves. Because of the timing and our distance from the US, there was no time to try it ourselves first and then fall back to hiring it out.
For video, we had to create a massive TIFF sequence (a TIFF image for every frame of the movie), then feed that into opendcp which would create a JPEG2000 version of the TIFFs for us. This took days.
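To get a sense of why this took days, the arithmetic alone is sobering. This is a back-of-the-envelope sketch assuming a 24 fps master and roughly a two-hour runtime; the per-frame TIFF size is a ballpark figure, not something we measured:

```python
# Back-of-the-envelope numbers for the TIFF intermediate step.
FPS = 24                      # assumption: mastered at the standard 24 fps
RUNTIME_S = 2 * 60 * 60       # assumption: roughly a two-hour runtime
TIFF_BYTES = 12_000_000       # ballpark per-frame TIFF size (not measured)

frames = FPS * RUNTIME_S      # 172800 individual images to write out
total_gb = frames * TIFF_BYTES / 1e9   # roughly 2000 GB of intermediate files
```

On the order of 170,000 files and a couple of terabytes of intermediates, all of which then has to be read back in and recompressed to JPEG2000, which is why the machine churned for days.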
Meanwhile, I took the audio stems (dialog, foley, ambience, music) and made a 3.1 (left-center-right plus LFE) mix of our audio. I did this blindly, as I didn’t have proper monitoring equipment for surround sound. During the sound design phase we had already isolated our LFE channel (sub) in preparation for this step.
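Mechanically, combining mono stems into one multichannel file is just interleaving one sample from each stem per frame. Here is a sketch using Python's standard-library wave module, assuming equal-length mono stems; the channel order a DCP actually expects is a separate question and isn't addressed here:

```python
import wave

def interleave_stems(stem_paths, out_path):
    """Interleave equal-length mono WAV stems into one multichannel WAV,
    one channel per stem, in the order the stems are given."""
    stems = [wave.open(str(p), "rb") for p in stem_paths]
    n_frames = stems[0].getnframes()
    width = stems[0].getsampwidth()
    rate = stems[0].getframerate()
    for s in stems:
        assert s.getnchannels() == 1, "every stem must be mono"
        assert s.getnframes() == n_frames, "stems must be equal length"
    data = [s.readframes(n_frames) for s in stems]
    with wave.open(str(out_path), "wb") as out:
        out.setnchannels(len(stems))
        out.setsampwidth(width)
        out.setframerate(rate)
        # Build each output frame by taking one sample from every stem in turn.
        chunks = []
        for i in range(n_frames):
            for d in data:
                chunks.append(d[i * width:(i + 1) * width])
        out.writeframes(b"".join(chunks))
    for s in stems:
        s.close()
```

In practice a DAW or ffmpeg does this for you; the sketch is only to show that a 3.1 mix file is nothing more exotic than four mono tracks zipped together sample by sample.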
Then Josh used opendcp to combine the video and audio into a DCP package, which at the end of the day came in under 500GB. We had to make several trips to the theater to find a combination of picture and drive formats that the projection server would accept.
But once we did… wow. We had seen our creation through camera lenses and on computer screens, but to suddenly see it on the silver screen, through a projector and sound system we could never afford to own, was overwhelming. We had made a movie! And it looked and sounded like any other movie! In fact, I took my wife out to see Gravity that weekend, and during the trailers leading up to the movie they showed our trailer. It looked and sounded as good as any of the others (on 1% of their budget!). My wife and I couldn’t help but squirm with excitement and wonder whether those around us were excited to see a local production with Hollywood production values.
The crowds at the 3 private premieres a few days later were more pleased, excited, and moved than we could have ever anticipated. It was an amazing experience, albeit an extremely long and occasionally frustrating one: seeing the movie not only finished but effective, and seeing random strangers moved to tears. The movie played for several weeks before being pulled to make way for the mass of Hollywood films that come out in December.
Over the past year, I’ve learned a lot. I’m extremely grateful for the experience, and especially that we were able to accomplish everything we had dreamed of. If I had to do it again, I’d do a few things differently.
- While FCPX allowed us to get a rough cut rather quickly, the post-production workflow to color and to sound was a mess. I’m not sure it was worth it. Either we should have done everything in the Adobe suite, or we should have used FCPX (and plugins) to do the grading and the sound design. It would have saved us weeks, and lots of headaches.
- ADR was much easier than I expected. On set, I’d sometimes have us do 20 takes to get the delivery just right, or worry about background sounds. Having seen how easy it is not only to re-record dialog in the studio but also to fix deliveries I wasn’t happy with, I would have worried a lot less about that on set.
- Making movies IS fun. I doubted that for a good long time, from long days on set and weeks away from family to the tedium of 10 months of post-production. It was all cured by the rewarding experience of watching it on the big screen, surrounded by cast, crew, and friends who didn’t want it to end.