The COVID-19 crisis has made work-from-home editing tools more important than ever
By Jennifer Walden
The COVID-19 pandemic put the kibosh on film production, at least temporarily.
But some in the post industry have been able to carry on, thanks to remote workflows and flexible schedules that incorporate off-peak hours. Many editors I’ve talked to about working remotely have been rushed into this predicament. They’ve shared their improvised solutions, some of which aren’t pretty but are getting the job done for now. When asked if this way of working is sustainable, though, these post pros had a collective answer: “No.”
Desperate times, desperate measures.
There are three main factors to consider for remote editing setups – collaboration, content/media sharing, and the actual editing.
Filmmaking is a highly collaborative process – like a Venn diagram of different disciplines, with the director in the middle. Ideas need to flow freely, and if people can’t congregate around the console or editing desk, that exchange is stifled. Zoom has become the de facto “face-to-face” collaboration tool, but security issues can pose a serious threat, especially when dealing with valuable intellectual property like a Hollywood film. It’s essential to keep that content safe.
Editor Dody Dorn had just finished the director’s cut of Zack Snyder’s “Army of the Dead” when the call came down for her to continue her work from home. Fortunately, “post-production supervisor Andrea Wertheim saw the writing on the wall and ordered some equipment in advance,” Dorn said. So she and her three assistants were outfitted with desktop iMacs and desktop microphones “for better sound quality because we knew we were going to be doing a lot of videoconferencing.”
The team does use Zoom to discuss the cuts, but they are very careful about what is saved on those iMacs. None of the film content is on their iMacs at home; it all lives safely on the NEXIS at the studio in Glendale where they had been cutting the film.
Even when they need to watch content together for VFX reviews, Dorn’s first assistant Carlos Castillón notes they don’t actually send the content to everyone joining the conference. Instead, Visual Effects Supervisor Marcus Taormina uses Frankie, a browser-based solution for interactive review of stills and video.
Frankie is made by Cospective, the same company that makes cineSync. But unlike cineSync, where each person has to have a local copy of the media, with Frankie all you need in order to view the content is a link.
“Marcus puts the media onto the server at his house and then he provides us with a link to it. We just click the link and watch him control the shots. The director can even draw on the frame, like, ‘Let’s add some more flames here and some more blood splatter there.’ So during the VFX reviews, we have two windows open, the Frankie window to review the content and the Zoom window so that everyone can be discussing at the same time,” Castillón said.
So how do Dorn and her team edit the film if they’re not actually moving any media to the iMacs at home?
“We’re using Jump Desktop to control the AVID systems at the studio and we’re using Tunnelblick for a secure VPN connection. That VPN tunnel doesn’t allow regular Internet use; the firewall has been configured that way. This way, no media can be transported over the network,” explained Castillón, who used a similar remote-editing setup while working on “Justice League” a few years ago. For that film, the editor was in L.A. while Castillón and another assistant were in London.
“We never had it set up in someone’s house before, but luckily, it’s been working pretty well for us,” he added.
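For anyone improvising a similar locked-down setup, it’s worth confirming that the machine’s traffic really is confined to the tunnel before opening a remote session. Below is a minimal, hypothetical sanity check in Python, assuming macOS and an OpenVPN-style utun interface like the ones Tunnelblick creates; it is not part of the team’s actual configuration.

```python
# Hypothetical pre-flight check for a locked-down VPN setup like the
# one Castillon describes: confirm that macOS's default route points
# at a utun (VPN tunnel) interface before launching Jump Desktop.
import subprocess

def default_route_interface():
    """Return the interface name carrying the default route on macOS."""
    out = subprocess.run(
        ["route", "-n", "get", "default"],
        capture_output=True, text=True, check=False,
    ).stdout
    for line in out.splitlines():
        if line.strip().startswith("interface:"):
            return line.split(":", 1)[1].strip()
    return None

iface = default_route_interface()
if iface and iface.startswith("utun"):
    print(f"OK: traffic is routed through the tunnel ({iface})")
else:
    print(f"Warning: default route uses {iface!r}, not the VPN tunnel")
```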
With Jump Desktop, the team is able to control the AVID systems remotely, but it’s like looking at a snapshot of the AVID’s screen. The image isn’t full-screen and it’s not full-definition. The resolution tops out at 1080p, and the image quality is dependent on how many users are connected to the AVID at once. “When all four of us [Dody and the three assistants] are connected to the network, it will get a little sluggish. Sometimes the image we see is pixelated. It’s dependent on the internet connection, too, and everyone right now is relying heavily on the internet. So if I know that Dody will be working in the morning, I will offset my hours to work around that. Sometimes I end up working at night because there is less lag and fewer dropouts than we were experiencing during the day,” he said.
Had they not been so far into the post process on “Army of the Dead,” Dorn feels the image quality that comes from using Jump Desktop would have been a concern. “We’d have to come up with a better viewing system because my ability to evaluate the material is compromised due to the lower resolution of the image,” Dorn said.
If image quality or the stability of the NEXIS connection is a concern – as it is for assistant editor Angie Luckey on The CW’s “Dynasty” series – another approach is to use mirrored drives. But when copying content onto the editor’s and assistant’s local drives, the trick is to make sure you have everything you need to complete the episodes.
Luckey has been handling the show’s backups since Season 2, saving bootable backups onto DVDs using Bombich Software’s Carbon Copy Cloner. She’s the keeper of the content, so when the shift to remote editing happened, Luckey was able to help the other assistant editors pull what they needed to finish their episodes.
“We’re using sound effects and music from Season 1 and 2. Plus, we need to pull over all the material for the main title and the end credits. I would make sure the assistants took their episodes offline, found out what they lost, and copied over the missing media. They’d also need material from past episodes for the recaps. So, we’d email each other back and forth at home, saying, ‘I don’t have this and I need that.’ But I have a copy of it on the bootable backups and so we’re able to exchange media when we need it,” Luckey said.
Her setup at home is almost the same as her studio setup. From her cutting room, she brought her “trashcan” Mac Pro running AVID, a USB 3.0 hub, two monitors, a DVD burner, and an office chair. The two main differences – and the cause of her biggest frustrations – are that she’s missing her client monitor that would have made it easier to view the VFX drop-ins for approval, and she has to rely on her residential internet service.
Uploading large files at home – like the online turnover, which consists of a video-only “Same as Source” .mov file and separate .wav files totaling 13 GB – can take over five hours. The sound turnover is even bigger, with a split-track “Same as Source” chase, separate .wav files, and two sets of AAFs with different specs all totaling 22 GB.
“The turnover to our sound team is always a huge file,” Luckey said. “I started that upload to Aspera in the morning and it finally finished a little before 5 p.m. This is crazy. To continue working this way, I would need the upload speed to be better. But, honestly, I don’t think remote editing would be a wise thing to do for the future.”
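Luckey’s figures pencil out to an effective upload rate of a little under 6 Mb/s. Here is a quick back-of-the-envelope sketch; the rate is inferred from the 13 GB, five-hour example above, not a number she quoted:

```python
# Rough turnover-upload math, assuming a steady residential upload
# rate. The ~5.8 Mb/s figure is inferred from the "13 GB in about
# five hours" example above; it is not a spec Luckey quoted.
def upload_hours(size_gb, rate_mbps):
    """Hours to upload size_gb (decimal GB) at rate_mbps (megabits/s)."""
    return (size_gb * 8e9) / (rate_mbps * 1e6) / 3600

print(f"Online turnover (13 GB): {upload_hours(13, 5.8):.1f} h")  # ~5.0 h
print(f"Sound turnover (22 GB):  {upload_hours(22, 5.8):.1f} h")  # ~8.4 h
```

At that rate, the 22 GB sound turnover alone eats most of a working day, which is the math behind her skepticism.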
Finally, let’s look at the actual editing. If you’re not connecting to your workstation (DAW or NLE) remotely, then your computer at home has to handle the demands of your session or project. Music editor Brent Brooks has a “cheese grater” 12-core Mac Pro in his studio at Formosa Music, which “has way more horsepower than what I have at home,” he said.
His remote setup uses two laptops – a MacBook Pro running Pro Tools (which is never connected to the internet) and another “online” laptop used to move content to and from Moxion (a secure content-sharing platform similar to PIX). Using the online laptop, he downloads the content to a thumb drive and then transfers that to the “offline” Pro Tools laptop. “This way, the content is safe. Even my online rig doesn’t have these files on the desktop. That’s the thing you have to be really careful about,” he said.
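Brooks doesn’t mention a verification step beyond the manual copy, but a checksum manifest is one simple way to confirm that a thumb-drive hop like this arrived intact. A minimal, hypothetical sketch, not part of his described workflow:

```python
# Hypothetical integrity check for the thumb-drive hop between the
# "online" and "offline" laptops. Run it on both machines and compare;
# identical manifests mean the copy is intact.
import hashlib
import pathlib

def manifest(folder):
    """Map each file's relative path to its SHA-256 digest."""
    root = pathlib.Path(folder)
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*"))
        if p.is_file()
    }

# e.g., manifest("/Volumes/THUMB/cues") on the online laptop should
# match manifest("/Users/me/cues") on the offline one (paths are
# hypothetical examples).
```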
His MacBook Pro is running the same version of Pro Tools that he uses in the studio. But the reduction in processing power requires an increase in patience. He said, “I see a ‘beach ball’ more often at home than I do in the editing room. I’m always hoping that it’s not going to crash!”
Brooks is editing the music on the upcoming film “Unhinged,” and he’s working with composer David Buckley, who is in Andorra, a small country in the Pyrenees Mountains between France and Spain. “David has been working remotely the whole time, and my support with him was going to be on the internet no matter what,” said Brooks.
Currently, they’re only working with demo tracks in stereo, so it’s not as taxing on his system as the full 5.1 stems (a set of 45 split-out instrument tracks) would be.
“At this point, it’s more like work-in-progress stuff for the director to review,” Brooks said. “I’m mostly mixing on headphones right now, which is kind of odd. Usually, mixing on headphones isn’t good for playing back on speakers because your point of reference is so much different when you’re listening through headphones. But these demos I’m doing now are being reviewed on computers with headphones, anyway, so it works.”
When it comes to mixing on headphones, re-recording mixer Bruce Buehlman from 20th Century Fox/Disney Studios has a tried-and-true setup he’s been using for remote mixing since 2014: the Smyth Research Realiser A8 surround processor paired with the Smyth-recommended Stax SRS-2170 or Stax 3170 headphone/amplifier set. “Stax headphones (made in Japan) are electrostatic, open headphones. They’re extremely neutral so they don’t color the sound that is being produced by the Realiser,” Buehlman said.
The Realiser A8 is able to replicate a mix environment by applying processing to the output signal that feeds the headphones. The processing is based on calibrations that were captured by the user for a particular space. The processing also incorporates head-tracking, so the sound can change when you turn your head just as it would if you were in the real space listening through speakers.
Buehlman explained, “The unit comes with its own microphones for calibration and built-in capturing software that you bring to a dub stage or near field mixing room that you want to capture. You put the microphones in your ears, like earbuds, and face straight toward the front speakers. It plays frequency sweeps through the stage’s speakers, and it has you look at different speakers in the front so it can track your head movement. Then, you put the headphones on over the microphone earbuds and it does the same exact test.”
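Conceptually, playback then amounts to convolving each speaker feed with the impulse responses captured for the listener’s current head orientation and summing the results into a two-channel headphone signal. The sketch below illustrates only that idea; it is not Smyth’s actual DSP, which among other things blends between orientation captures as the head-tracker moves and compensates for the headphones themselves.

```python
# Bare-bones speaker virtualization: convolve each speaker feed with
# the left/right-ear room impulse responses measured for one head
# orientation, then sum into a binaural feed. Illustrative only.
import numpy as np
from scipy.signal import fftconvolve

def virtualize(speaker_feeds, brirs):
    """speaker_feeds: dict speaker name -> mono float array.
    brirs: dict speaker name -> (left_ear_ir, right_ear_ir) pair."""
    length = (
        max(len(s) for s in speaker_feeds.values())
        + max(max(len(l), len(r)) for l, r in brirs.values())
        - 1
    )
    out = np.zeros((length, 2))
    for name, sig in speaker_feeds.items():
        left_ir, right_ir = brirs[name]
        wet_left = fftconvolve(sig, left_ir)
        wet_right = fftconvolve(sig, right_ir)
        out[: len(wet_left), 0] += wet_left
        out[: len(wet_right), 1] += wet_right
    return out  # columns: left ear, right ear
```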
The best way to test the accuracy of the processing – to see how faithfully it reproduces that mix environment – is to do an A/B comparison. According to Buehlman, when you’re capturing a room, you have the ability to instantly switch between the speakers in the room and the headphones. When the head-tracking reference unit (centered in front of the listener) sees the head-tracker on the headphones, the sound is played through the headphones. When you take the headphones off, and it doesn’t see the tracker anymore, then it will automatically switch to the speakers. You can use any sound – pink noise, music, dialogue, etc. – and solo a specific speaker, like the left rear surround, if you’d like. “I’ll take the headphones on and off without moving my head and it’s the same frequency response, it’s the same level, it’s the same reverb time, and it’s in the same position relative to my ears. It’s extremely faithful,” he said.
To get the best results from the Realiser A8, Buehlman strongly recommends capturing profiles of your favorite mixing environments. “If you’re used to mixing on a particular stage, you want to capture that space. The generic profiles aren’t the way to go. Capturing the real rooms was the only option for me and it has served me really well.”
Another bonus is that one A8 system will independently head-track sound for two mixers, and with inputs for eight channels of discrete audio, you can mix in 7.1 and below. Need to mix in Dolby Atmos, DTS:X, or Auro 3D? Smyth’s Realiser A16 can virtualize and head-track 24 virtual sound source positions for one mixer, and 16 virtual sound source positions for two mixers independently.
“The nice thing about the Realiser is I can capture the rooms I want. And because I did it, I know that it is faithful to the room. I think this is an extremely viable solution for remote mixing and pre-dubbing, especially for lower-budget projects that might be unsupervised, or minimally supervised,” said Buehlman.
For those with the extra space and construction know-how, another option for mixing remotely is to build a home mixing studio, which is what Formosa Group re-recording mixer Andy Koyama did. He turned a 13-foot by 12-foot section of his garage into a Dolby Atmos dub stage.
“It’s a completely contained studio. It’s all floating, with double layer drywall and lots of acoustic treatment. I installed a 7.1.4 Atmos setup with Barefoot MM45s and their accompanying subwoofers for my L-C-R, JBL 708s as my surrounds, JBL LSR308s in the ceiling, and a couple of 12-inch JBL subwoofers. I have a 49-inch TV up front for picture reference, but I’m mainly looking at my computer screens. I have two Pro Tools rigs – one HD native and another HD system with UAD hardware – on Mac Minis. I don’t need a large console at home, so I have a single-fader PreSonus FaderPort,” Koyama said.
The challenge of mixing a theatrical release in a small studio is to make that mix translate well in a large theater. “Ideally, we would all dub feature films in a feature film dub stage, but when circumstances dictate, we must adapt,” Koyama stated.
Before mixing in a new room, he references his past mixes that he knows very well and uses those to get attuned to the space and to set levels that will translate well to the larger dub stage. “That way, I can alter my mix-style depending on the way the sibilance is, or how harsh the midrange is, or how deep the low end is. So if you have material that you are familiar with that you know translates to a real world theater, use that to do a mental translation for what you’re currently mixing.”
“No matter what stage you’re mixing on, you have to keep that in the back of your mind while dubbing to get it into the mix-pocket that you are accustomed to, so that you know it will translate.”
“That kind of awareness,” he concluded, “comes from many, many years of dubbing.”