By Jennifer Walden
FX’s “Alien: Earth” series captures all the vintage charm of the original 1979 “Alien” film while adding fun, new flavors to the franchise: the “Ocellus” eyeball alien creature, the carnivorous plant-like alien called the “Orchid,” the USCSS Maginot spaceship, cool new tech like Morrow’s Swiss Army-style cyborg arm, and lab equipment for transplanting consciousness — all requiring visual effects.
“Alien: Earth” lead visual effects editor Glenn Cote and his team certainly had their hands full, liaising with eight different visual effects vendors and supplying them with everything needed to efficiently generate animated creatures or complete post-viz shots, while also providing picture editor Regis Kimble with elements he needed, as quickly as possible, to keep the show on schedule.
“This was my first time working with Regis and his team, and it was unlike any other show I’ve been on. Regis, who was also a producer, would be on all of our vendor calls, so everybody was hearing it from the horse’s mouth, so to speak. The editor and the visual effects supervisor, Jonathan Rothbart, together were guiding the vendors,” said Cote.

Another new aspect of the process for Cote was that Kimble preferred to do his own animation temp mockups. So instead of Cote doing the majority of the mockups, as he recently did on director Gareth Edwards’s “The Creator,” he’d supply some of the elements to Kimble. “For example, I was getting the creatures in basic animated cycles against a green screen or blue screen background from the pre-viz team. I’d give those to Regis, who would manipulate them and move them around in the Avid, choosing the right frame from the walking cycles to put the creatures in the shot. He and series creator/showrunner Noah Hawley decided exactly where they wanted them to be,” Cote explained.
These initial effects were rudimentary, blocky, unrefined illustrations meant to show how the final visuals would move around the frame. Cote noted, “We still had to work out all the subtleties of the fine animation. Jonathan spent many hours refining the visuals so these creatures moved the way you see them move in the finished episodes.”

“Usually, I’m the one driving the post-viz process directly with the vendors, but Regis is very hands-on. His temp mockups became the offline blueprint for our visual effects vendors. My job was to make sure he had all the elements he needed,” added Cote. “And I handled traditional post-viz for full CG scenes, which there were quite a few of, especially in the first episode, like the crash sequence and some of the space shots.”
“Alien: Earth” was shot on location in Thailand — not on an LED volume stage. Cote said, “They didn’t have the most high-tech shoot. The most high-tech gear was the crane used for the oners and the motion control camera for the transition sequence. Everything else was very traditionally shot. Noah Hawley and director/cinematographer Dana Gonzales knew early on how they wanted to shoot it, mimicking the ‘70s aesthetic. That went all the way through to the finishing process; in the DI [digital intermediate], we added a film grain that’s reminiscent of this 1970s feel, and it’s evident from the opening sequence inside the USCSS Maginot.”
What the set lacked in technological luster, it made up for in size. “They had very large-scale sets. The whole underground area where the ship crashed was built in an abandoned shopping center. They created the whole side of the Maginot in there, with fake concrete chunks for the destroyed building,” said Cote. “We also used CG set extensions, for instance, every time the camera looks up or down the stairwell. The hardest part there was having water and sparks dripping through the frame. Sometimes we had to paint them out and put them back in to make it work with the digital set extensions. That was more tedious than difficult. Once they dialed in how far down the stairwell they wanted it to go, Jonathan would spend a lot of time massaging the lighting to make sure there was no demarcation between what’s digital extension and what’s practical set.”

For any scene with an animated creature, the on-set visual effects team would have the unit shoot with the actor and an empty frame where the creature would be. Then, after the actor was done but before they wrapped the lighting on set, they would shoot the background and also “tile” (i.e., have the camera pan and tilt to capture a series of overlapping plates of the background, which the visual effects team can use when compositing a creature into the scene) either with the motion picture camera or a still camera, and also do a LiDAR scan to capture highly accurate 3D data of the physical set. “Then we’d also do a pass with a maquette — a puppet on a pole — just moving crudely through the frame in the lighting of that shot as a reference for how the creature would look in reality, in that actual space with that lighting. They physically move the maquette to all the different marks while keeping the camera running,” said Cote. “So, there’d be a background shot, a clean plate, and a maquette pass; that’s three or four elements for each scene.”
There were also complex motion-control shots, like the “transition” scene in Episode 1, when the kid characters have their consciousnesses transferred into synth bodies. “It’s all done in one take. I had to stitch that together a few times in the Avid to make sure it worked as a proof of concept. All those elements were sent separately to the vendor so they could stitch them together for the final visuals, including when the shot tilted up to the overhead monitor where the ‘Peter Pan’ animated film is playing. It was a lot,” said Cote, who revealed that at the last minute, a studio note came in requesting that each kid have a name tag. “Since the kids’ consciousnesses were put into adult synth bodies, it could have been a bit confusing if we didn’t know that they change their names when they become synths. So adding name tags was a whole other pass for that one-take shot,” he said.
Cote admits that he’s a bit old school in terms of visual effects editorial, having started his career as an apprentice at ILM in 1988 — before there was Avid Media Composer or any NLE software, before Photoshop, After Effects, Maya, and other visual effects tools brought the work into the digital realm. Cote said, “There was no such thing as pre-viz or post-viz. The process involved doing a temp optical, sending it to the lab, waiting for the next day to check it out, and then continuing to iterate.”
Editing tools evolved as Cote’s career progressed, yet he has stuck to using Avid Media Composer for making visual effects temps. For him, it’s the most efficient way to get work done, so why complicate it? Cote said, “The advantage is that you can easily unwind it when you’re turning the show over. You don’t have to hunt for the elements you used to make a temp. They’re there. It prevents what we call ‘mystery mixdowns.’ Truthfully, we did have that on this show; assistants and others on the post team would make temps but bury where the sources were. We’d have to hunt down all the elements used to make that mixdown. So my primary tool is — and will always be — the Avid, for everything.”
There is one exception. Cote uses Adobe Photoshop (something he’s become very proficient at) to create clean backgrounds, then imports those plates into the Avid. Working mainly in the Avid environment is a conscious choice. Cote explained, “Rather than working in other applications and having to export and re-import the elements, I’ll keep as much as I can in the Avid. The way I organize a session, I always know for whatever temp I made, there’s a bin that has all the elements that I used to make that temp.”

Another complex CG sequence in Episode 1 was the tactical team’s arrival at the Maginot crash site. Although this was conceived of and shot as a continuous take, it was later decided to add cutaways to allow the audience to see what Joe Hermit was seeing, to understand his experience of the situation. Cote said, “There’s also that moment where the camera tilts up to reveal the building, and a piece of it falls and smashes into the camera. There’s a cut there. Debris smashes into the engine, and smoke billows up; there’s a cut there. We ended up using those devices to compress this sequence. That was a very long, iterative process to get the handoff between the practical plate and the CG. Everything below the characters’ waists (what’s on the ground), the bus, the building entrance, and the parked truck were practical, so they’re in the background plate. Everything else — the entire Maginot ship, and the entire top of the building — was manufactured. That took a lot of dialing in.”
Cote noted that for some scenes, they did live virtual-reality animation — a process he first experienced while working on “The Creator.” He said, “Director Gareth Edwards did a session at ILM where they shot tons of live animation for the space station scene (a lot of which didn’t make it into the cut) using a VR headset. We got that back in the cutting room and used that to build all our temp sequences. We did that as well on ‘Alien: Earth,’ through a different vendor, UPP in Prague. We’d have a remote VR session led by our visual effects supervisor, Jonathan, and we’d set up some sequences in the crash scene that were difficult to nail down exactly. Jonathan would move the virtual camera around in the frame — up, down, in, out, change the lens, try moving from here to there, move backward, and so on. They output four or five iterations of that, and we’d make a library for Regis, who’d then decide which ones to use and where in the cut.”
The post-production team worked remotely on “Alien: Earth,” since, according to Cote, that’s Kimble’s preference. “We stuck with that model with no exceptions,” he said. When working remotely, Cote mainly uses his own gear. His studio has an iMac with three monitors (one supplied by the editorial rental company Atlas Digital) running Jump Desktop, which remotes into an Avid Media Composer Ultimate workstation (physically located somewhere in Burbank) with a full suite of visual effects industry-standard plugins, like Boris FX Sapphire. “Everybody on the show has the same plugins. So when the assistants or I do temps, it all works properly, no matter what system we’re using,” noted Cote.
His standing desk has two keyboard levels: an Avid keyboard on the upper level for editing, and a standard keyboard on the lower level for typing emails and indexing. He has a laptop setup for virtual meetings and review sessions, which he also uses to continue working while away from his home studio. Cote said, “This setup is a comfort for me. I can work standing or sitting, depending on what I’m doing. Some days I don’t sit down all day because we’re slamming away. I like to keep my setup simple because fewer things can go wrong. That’s especially important on a show like ‘Alien: Earth’ that has thousands of visual effects shots and a very defined schedule.”

Cote’s Avid (accessed via Jump Desktop) was connected to the Avid NEXIS while editor Kimble was using Resilio for remote editing. “Regis and our visual effects production team were looking at higher-quality, higher-resolution images. Everybody else was remoting in, so unfortunately, the resolution is going to take a hit, depending on your internet connection, which is sometimes fabulous but often not. We relied on them as another set of eyes to judge the shots if we had a bad connection,” he said.
Cote worked with two assistants, industry veteran and ILM alum Jim Milton and one of Kimble’s “Fargo” assistant editors, Inëz Czymbor, dividing the duties among them to keep pace with the show’s demanding schedule. He said, “As a visual effects editor, I serve two masters. I’m working to get shots in the latest cut and in front of Regis as quickly as possible, and also preparing for our vendor meetings.”
For remote review sessions with the visual effects vendors — which happened quite often — they used two solutions: SyncSketch (now part of Unity Technologies) and ScreenIt (a proprietary program by Atlas Digital, which is no longer supported). Cote explained, “We were one of the last shows to use ScreenIt. It allowed us to stream from our Avids. SyncSketch was for higher-resolution clips, and it required uploading to their server. As you can imagine, our days were pretty packed with almost back-to-back vendor meetings. There were over 2,100 shots for the show, and we didn’t do a whole lot of pre-viz during filming because they were scrambling just to get it shot. It really fell on our department to pick up the pace and run with it in post. We were ‘jumping on the box’ — having a remote review session — constantly. It was an everyday essential.”
Cote’s now finely tuned remote workflow took years to develop. It all started with Marvel’s “The Falcon and the Winter Soldier,” which had to switch to remote editing due to the COVID lockdown. “We had to figure out — as did the rest of the industry — how to work remotely. Marvel went through numerous iterations of software, trying three or four different types of permutations before we finally found what works best,” Cote said.
It was then that Cote first met visual effects producer James Ledwell, production manager Mitchell Callisch, and lead visual effects coordinator Alexandra Rebeck — all now reunited on “Alien: Earth.” Cote shared, “We all bonded on that show. Picture editor Jeffrey Ford came in and took over. We devised a new hybrid workflow that was more feature-film-based because we all primarily had feature experience — something Marvel was looking for. So, I’ve stuck with that model ever since.”
Cote went on to do “Moon Knight” with Rebeck, and then “Werewolf by Night” (a Marvel special presentation directed by composer Michael Giacchino) with Ford. Next came “The Creator.” Cote said, “Gareth’s team was nice enough to set up my system at Fox exactly as I had it at Marvel for ‘Werewolf by Night.’ Editor Hank Corwin, despite his lengthy and storied career, had little experience with visual effects-heavy shows, yet he quickly grasped my process and let me run with it. ‘The Creator’ looks great, and we got nominated for a couple of Oscars. What more could you ask for?”
Just as “The Creator” was wrapping up, Cote received the call for “Alien: Earth” from Ledwell. “It had been a while since we did ‘The Falcon and the Winter Soldier.’ Now, we had the opportunity to put that effects team together again. It’s really the Team Ferrari of visual effects. These are folks who’ve done big Marvel shows. James and Mitch were coming off of ‘Dune: Part One.’ So we all have quite a body of knowledge,” said Cote.
The team’s experience proved invaluable in tackling complex visual effects shots, such as the Maginot crash scene. Cote explained, “Finding the right speed for the ship was difficult because if you mathematically followed the laws of physics, it didn’t work. It wasn’t dramatic. It just looked weird because it was going so fast that we wouldn’t be able to clock what’s going on in some shots. So, for dramatic purposes, we slowed it down and sped it up depending on the angle we were viewing it from.”
The alien creatures also required numerous iterations to get right. “It’s challenging to develop a new creature, to figure out how it’s going to move, its coloration and texture, and how that reacts to light. For instance, the tick’s color seemed to change because of its translucency. The reflectiveness of different textures on the creature would react to the lighting in very different ways. We had to tweak every shot based on the lighting to make it look correct. Also, the way it moved took a lot of care and time. Patricia Binga, the animation supervisor for Zoic Studios, was very patient with us changing directions a few times to see what would work best,” Cote said.
Then there’s the Ocellus (a.k.a. “eye midge”). Its unique characteristics made the team wonder: should it move like an octopus? a squid? a spider? “That was another tough one, but it very quickly became our favorite character to work with,” said Cote. “The most challenging of all, however, was Morrow’s cyborg arm. There was extensive R&D with Noah, Jonathan, and the talented team at FIN Design + Effects on this. We wanted to be able to see the inner mechanics under the skin’s translucent surface.”
Cote explained that actor Babou Ceesay wore a sleeve on set that had the impression of some features underneath. But it limited his range of motion, and he couldn’t move his hand naturally. “The initial idea of the sleeve was to get away with as many shots as possible without needing visual effects to touch them. Of course, that didn’t work out. We ended up having to touch everything, sometimes replacing the entire hand or arm, or having our in-house visual effects artist color-correct it to match all the other shots, but that only worked when you see just a little bit of it in the frame,” he said. “By and large, we did replace the bulk of the cyborg arm shots. That particular R&D was done entirely in post and took the longest. Once that was dialed in, FIN ran with it. They cranked out tons of shots with the arm with very few iterations necessary.”
Despite producing a huge number of visual effects shots on a strict schedule, Cote noted there were no major bumps in the road on “Alien: Earth.” He concluded, “Multi-tasking during vendor reviews is a key benefit of working remotely. The downside is that when everyone is not in the same physical space, it diminishes the ability to ‘read the room’ and respond accordingly. The tight post schedule and a delay caused by the sudden shutdown of our visual effects vendor MPC made remote working not just an advantage but a necessity on ‘Alien: Earth.’”
