Oscar-Worthy Sound Editors

Awards Watch

Millie Iatrou, left, and Ai-Ling Lee.

by Mel Lambert

As on previous occasions, the 91st Academy Awards nominees in the Best Sound Editing category covered a number of noteworthy narratives: the hidden but advanced kingdom of Wakanda, a biopic of a 1970s rock ensemble, NASA’s first landing on the moon, the life-or-death importance of remaining silent in a post-apocalyptic world, and middle-class life in 1970s Mexico City.

The teams that developed the innovative soundtracks for these scenarios are Benjamin A. Burtt and Steve Boeddeker, nominated for Black Panther; John Warhurst and Nina Hartstone, AMPS, for Bohemian Rhapsody; Ai-Ling Lee, CAS, and Mildred Iatrou Morgan for First Man; Ethan Van der Ryn and Erik Aadahl, MPSE, for A Quiet Place; and Sergio Díaz, MPSE, and Skip Lievsay for Roma. All of these films except A Quiet Place are also Oscar-nominated in the Best Sound Mixing category, with some of the nominated sound editors doing double duty as re-recording mixers.

Benjamin A. Burtt.

Co-supervising sound editor Burtt, who has also been nominated for an MPSE Effects/Foley Award for Black Panther, recalls that the editorial team was trying to tell a deeper story of Wakanda, as its citizens confront a unique threat. “During the Warrior Falls challenges, we stripped away a lot of the background sounds, and tried to really focus the track on the fight and grappling sounds,” he says. “We were trying to emphasize that these challenges for the throne were raw and primal; not two superheroes going at it.”

Prior to the release of Black Panther, the comic-book world of Wakanda was largely unknown to the general public. “We are quickly introduced to their history, culture and traditions,” Burtt continues. “To the rest of the world, they are an undeveloped country that mostly keeps to itself. As the movie unfolds, we come to learn that they possess technology much more advanced than any other nation.”

Steve Boeddeker

Co-supervising sound editor/re-recording mixer Boeddeker is also an Oscar nominee for sound mixing on Black Panther, as well as a nominee for a CAS Sound Mixing Award and an MPSE Sound Effects/Foley Award for the film. He believes that audiences have certain expectations of any Marvel Cinematic Universe movie. “While we wanted to meet those expectations,” he says, “we really wanted to do something different. Ryan Coogler’s script is such a departure from your typical superhero movie, and we wanted the sound to follow suit. There are plenty of opportunities for exciting sound, but also many chances for sound to play an emotional role — specifically the Ancestral Plane sequences, where emotion is key.”

Discussing Wakandan technology, Burtt says, “We found ourselves repeatedly asking, ‘If the technology is so advanced, would it make a sound? Would Wakandans intentionally design something that hummed? Would the Royal Talon Fighters be loud and alert people to their presence or, despite their size and power, were they designed to be silent so they could move around quietly and remain unnoticed?’ We debated often whether some bit of technology would make a sound. Sometimes we settled on ‘Yes,’ other times we settled on ‘Let’s try it with nothing — or at least something very subtle.’” Boeddeker confirms: “We had many discussions with Ryan about what futuristic design would/should be like. The sounds needed to convey a feeling of ‘high tech,’ but also be pleasant and rooted in traditional African sounds.”

On the dialogue side, the native Wakandan language of Xhosa posed its own challenges. “It is a unique and beautiful African language characterized by clicks and pops,” Burtt explains. “Naturally, it is difficult to edit in a language you are unfamiliar with, adding back the important clicks and pops you typically would remove, and seamlessly go back and forth between English and Xhosa. Our dialogue supervisor and other editors did a great job wrangling all the various languages and accents heard throughout the film.”

“All aspects of the effects, music and mix were always transitioning back and forth between the royal African influences and a more Western-focused sound,” Boeddeker relates. “For example, the early fights on the waterfall are brutal and aggressive, but rooted in the ancient tradition of battle. There are no high-tech elements, no Vibranium sounds [of a fictional metal used in Marvel comic books] — just the brutality of the fight. But the huge battle at the end of the film is a full-blown futuristic swirl of flying Talon Fighters and futuristic weapons.”

John Warhurst.

Supervising sound and music editor Warhurst, who is also a BAFTA nominee for Sound on Bohemian Rhapsody and has been nominated this year for MPSE Musical and Dialogue/ADR awards, reveals that the editorial team’s overall goal “was to try and make the audience feel as excited as if they were attending the Queen concerts portrayed in the film. We did a lot of work to build up the scale of the huge audiences, most notably Live Aid. We were very fortunate to have opportunities to create concert crowds using a combination of original recordings, a marketing campaign by Fox that invited people to submit recordings of themselves singing along to the song ‘Bohemian Rhapsody,’ and access to Queen’s own recordings of crowds from a back catalog of concerts. That gave us both scale and authenticity.”

Warhurst’s team secured time on a second-unit day with a crowd of 600 to record extras singing Queen songs. “Inspired by Freddie’s ‘Ay-oh’ call-and-response technique at Live Aid, this crowd was recorded by playing each line of the song through the PA we had on set, and then having the crowd sing it back,” he explains. “Pro Tools sessions were built so this could be done with a rhythm — the crowd singing back the line they had just heard through the PA, and then they would immediately hear the next line. To worldize the music, we also managed to get two hours before one of the Queen concerts to play the concert music from the film through the band’s current PA system in a 20,000-seat arena without an audience.”

“Using Blend Audio [a UK-based loop group], we scheduled an exterior space at Shepperton Studios to record our mid-sized crowd,” relates supervising dialogue/ADR editor Hartstone, who is also a BAFTA nominee for Sound on Bohemian Rhapsody and has been nominated this year for an MPSE Dialogue/ADR Editing award. “We had approximately 40 actors, whom we recorded in groups of eight with headphones. With all these crowd sessions, it was important to always push the group to keep the same level of energy and enthusiasm as real concertgoers, getting them to jump and punch their arms in the air so you can hear the movement in their voices.”

During dialogue editorial, she recalls, “One of the biggest challenges was ensuring our Freddie, portrayed by the truly excellent Rami Malek, was believable singing as Freddie Mercury, whose four-octave range is the stuff of legend. Unique to this film, we had three voices to interweave [to create a cohesive and believable vocal track]. Rami did amazing work performing as Freddie, giving absolute authenticity to his physical presence and movements.”

In terms of providing an immersive Dolby Atmos experience, a primary concern was the creation of concert crowds. “These tracks were built from many different sound recordings, to create bespoke tracks for each concert, with rich layers of voice, whistles, drums and other stadium crowd sounds,” Warhurst reveals. “For ‘We Will Rock You,’ we had three Pro Tools systems playing back the crowd — music, effects and group — probably totaling around 500 tracks. There were around 150 tracks coming from the music Pro Tools system, with everything stemmed, plus worldized tracks.”

Nina Hartstone.

“There were many unique hurdles in the creation of a soundtrack for Bohemian Rhapsody,” Hartstone concludes. “The press conference scene focuses heavily on the build-up of pressure and the intrusions into Freddie’s private life. The picture edit and camerawork put us inside his head, with the sound of the incessant, intrusive questions from reporters. This was interwoven with camera flashes and sound design elements [to emphasize this pressure build-up].”

Sound designer/supervising sound editor/re-recording mixer Lee is also Oscar-nominated for sound mixing on First Man, together with a BAFTA Sound nomination, as well as CAS and MPSE nominations. “First Man was filmed and edited with sound in mind,” she offers. “Picture editor Tom Cross left room in the cockpit sequences, where we can only see what the astronauts see, for sound to illuminate the story. The astronauts react to the intensity and danger created with sound. Sound is used to show not just what you see, but what we want you to feel. In these scenes we crafted an arc where sound transitions from an authentic G-force experience of being strapped to a rocket, to using abstract design such as adding animal vocals, in order to evoke a visceral sense of the anxiety felt by the characters. Damien Chazelle also wanted space to have a lonely, chilling effect, so we used shades of quiet, or even extended periods of pure silence.”

Ai-Ling Lee.

According to supervising sound editor Morgan, who also received BAFTA and MPSE nominations for First Man, “The film uses a lot of radio communications between mission control and the astronauts, many of which are actual comms from the Gemini and Apollo missions. But for story purposes and clarity we also had to supplement the archival comms with recreations. Much time was spent casting for loop group actors who sounded authentic and of the period, and then editing the performances so that their cadence matched the older comms.”

To portray high velocity throughout the spacecraft’s ascent, “We manipulated the sounds — turbines, metal vibrations, fire whooshes — to perpetually rise in pitch like a music cue, creating a feeling of G-force and danger,” Lee recalls. “It was important to make these scenes uneasy and surprising, by playing with sound dynamics; we’d build to extremely loud moments, then immediately drop into shocking silence. To play into Neil Armstrong’s emotion and anxiety of fighting for control of the spacecraft, we processed animal vocals that were pitch-manipulated and distorted to sound like turbines or fireballs, using sources such as elephant roars, snake hiss, lion growls and even an animal stampede.”

The Saturn V portrayed in the film is the world’s most powerful rocket. “To capture the fury and insanity of the Apollo launch,” Lee adds, “we turned to the next most powerful rocket on Earth, and recorded the maiden launch of the SpaceX Falcon Heavy. We set microphones on the launch pad to capture the close-up ignition, as well as the sonic booms for the X-15 scene.”

To help the audience feel the low-end rumble during launch, “we recorded sound in JPL’s acoustic chamber,” states Lee. “Using nitrogen gas, they blast the chamber, which is used to test hardware components, simulating the acoustics inside a rocket during launch. To capture the extreme low frequencies, we converted a subwoofer speaker into a microphone.”

“My biggest challenge was editing Neil Armstrong’s famous words from the moon,” Morgan acknowledges. “Ryan Gosling referenced the original recording and performed the dialogue very close to the way that Armstrong did. Damien and Tom wanted the dialogue to sound exactly like the original, so I edited Ryan’s performance against Neil’s to reproduce the pacing. I combined the futzed air from the original as well as the radio squelches and placed them in the same spots. I used Revoice Pro to tweak a few words and syllables to match Neil. Some people have told me they thought we used Neil Armstrong’s voice, which really pleases me!”

Ethan Van der Ryn, left, and Erik Aadahl.

Supervising sound editor Aadahl was also Oscar-nominated for sound editing on Argo (2012) and Transformers: Dark of the Moon (2011), and is a BAFTA nominee for Best Sound on A Quiet Place. He is also up for MPSE Dialogue/ADR and Sound Effects/Foley Awards. He says that actor/director John Krasinski’s film about a family forced to live in silence while hiding from monsters with ultra-sensitive hearing “had sound baked into the DNA of the script; John described it as ‘a central character.’ Sound is deadly; the family we follow has to intimately understand sound to survive.”

“One of the unexpected things that we did was to use the sound to put the viewer/listener into the sonic perspective of the characters,” says co-supervising sound editor Van der Ryn, who was also an Oscar nominee for Argo (2012), Transformers: Dark of the Moon (2011) and Transformers (2007), and won Oscars for his work on King Kong (2005) and The Lord of the Rings: The Two Towers (2002). This year, for A Quiet Place, he is also BAFTA-nominated for Sound and is up for MPSE Dialogue/ADR and Sound Effects/Foley awards. “We did some subtle things with the sound design to help describe the rules of the world that our characters are living in,” he adds. “This made the audience active participants in the film, and created an extremely immersive and interactive [experience].”

“We created a number of sonic points of view that John Krasinski came to call ‘sonic envelopes,’” Aadahl reveals. “The deaf daughter, Regan, played by Millicent Simmonds, who is deaf in real life, uses a cochlear implant. During her close-ups and point-of-view shots, we go into her sonic reality, which Millicent described as a ‘low hum.’ Other sonic envelopes include the family’s POV and, of course, the creatures, which have hyper-acute and -sensitive hearing.”

As Van der Ryn explains, “The main narrative arc we take is to set up the rules of the world now dominated by alien killing creatures with hyper-sensitive hearing. Sound is initially portrayed as the biggest hindrance to our characters’ survival, as is brutally demonstrated near the opening. As the movie progresses, we start to get hints that maybe sound can also become a weapon for our characters to use against the alien creatures. By the end of the movie this concept is fully comprehended by the deaf daughter.”

The first big risk with sound occurred on the script level, Aadahl relates. “There is almost no dialogue at all in this story. Or exposition. It’s really unheard of having a studio movie with so little dialogue, and what there is of it whispered. I remember working on a scene in which Regan is testing out a new cochlear implant her father has made for her. We are in her ‘sonic envelope’ — the low-hum and heart pulse that she can hear — and she pulls out the implant. Silence! I remember gasping when we tried it — goosebumps!”

“We had to figure out what the alien creatures should sound like,” Van der Ryn adds. “Because they are blind with hyper-sensitive hearing, we thought it would make sense for them to use some sort of echo location, like sonar, to navigate the world searching out their prey. We started playing around with a stun gun we had laying around in the studio, and recorded it zapping against grapes, which, like humans, are good conductors of electricity because they have a fleshy interior encased in a thin skin. We slowed down those sounds and they became the echo-location sounds.”

Sergio Díaz.

Supervising sound editor/sound designer Díaz, who received his first Oscar nomination for Roma, says that his biggest challenge while working on the film was “to recreate director Alfonso Cuarón’s memories and explore the cinema experience with immersive sound design and editing, with huge amounts of accurate sounds.” Co-supervising sound editor/re-recording mixer Lievsay is also nominated for a Sound Mixing Oscar for his work on Roma; he previously won the award for Cuarón’s Gravity (2013), and was nominated for Inside Llewyn Davis (2013), True Grit (2010) and No Country for Old Men (2007). He also earned a CAS Award nomination this year for Fahrenheit 11/9 (2018) and MPSE Award nominations for Roma. “Alfonso wanted the viewer to be able to smell the atmosphere, morning, noon and night; to feel the day pass almost in real time,” he says. “We jointly decided to use Dolby Atmos to give a 3D quality to the soundtrack — extreme Atmos — everything would pan with the moving camera perspective, just like in real life.”

From the film’s very first shot, between the earth and the sky, “all the sound elements should be flowing harmoniously during each sequence,” Díaz says. “There is a thin line between affected and organic sounds; our aim was always to preserve the organic sound, touching audiences with subtle, chaotic and silent moments, as we move forward until the sky and the earth appear again in a peaceful resolution. The most remarkable risk to explore was Alfonso’s idea to pan all the dialogue and effects elements around the cinema, so that wherever the camera goes, you’d discover more sound elements on the left and right sides.”

Díaz recalls that he invested more than 18 months recording and editing accurate sounds for the film, “starting with the atmospheres [that formed] the basis of my sound design for that period of time,” he says. “Since present-day Mexico City is cacophonous, with plenty of cars and fewer trees and birds, I had the idea to record the city atmospheres on Christmas Day of 2016, with no one on the streets. In addition, I recorded all the cars on screen, crowds, vendors, soccer games and more specific sounds outside of the city.”

“Sergio can take credit for the thousands of period- and location-specific sounds that were gathered for Roma,” Lievsay acknowledges. “All sounds needed to be accurate and realistic; many sounds of 1970 Mexico City can still be heard today. Our epic group ADR session in Mexico City involved over 300 voice actors, and we created recordings for every person on camera, each with scripted lines from the filmmaker. These recordings enabled us to extend our 3D panning to the foreground and background extras in each scene.”

The re-recording phase was a major experience, Díaz recalls, “since Skip and Craig Henighan connected with the film at such a deep level. The biggest hurdle was the translation of Alfonso’s memories and emotions into our soundscape journey. Beyond that, the film has no music score; sound design transforms each scene into a sublime experience.”

“During the past few years, Craig and I had several positive experiences with Dolby Atmos and were very excited to apply such concepts to this type of film,” concludes Lievsay. “We did have a very long final mix lasting eight weeks, during which we had the time to try many approaches and select the best ideas. We learned much about the possibilities of immersive audio.”

About Mel Lambert
Mel Lambert is intimately involved with production industries on both sides of the Atlantic. He is a 30-year member of the UK’s National Union of Journalists. He can be reached at mel.lambert@content-creators.com.
