By Jennifer Walden
Thinking about learning Unreal Engine 5 (UE5) for filmmaking, but aren’t sure if it’s worth the time and effort? Or, not sure how to start learning UE5 for filmmaking? CineMontage spoke with Virtual Production Supervisor Joshua Toonen to get some considerations for that journey into virtual production using a game engine to enhance and streamline filmmaking.
Since the groundbreaking use of Epic Games’ Unreal Engine for production on Disney+ series “The Mandalorian,” virtual production has become a global multibillion-dollar industry with LED Volumes (stages) being built the world over, from Canada to Australia. The largest one in the US – Amazon Studios Virtual Production’s Stage 15 in Culver City, Calif. – just opened in December.
An LED Volume is like a typical soundstage, but instead of a green screen behind the actors, a system of linked LED wall (and ceiling) panels displays real-time computer-generated backgrounds. Unreal Engine is used to create, manipulate, and display those virtual backgrounds. Virtual production combines physical set structures and live actors with that virtual set extension. Light from the LED panels bounces off the physical objects on set, creating a natural, realistic look. The physical foreground elements and the virtual background are both captured directly in-camera, along with any lens distortion, depth of field effects, bokeh, and lens flares that will ultimately add a sense of “realness” to the image. Additionally, virtual cameras inside Unreal Engine can be synced to the camera on-set, so the background can change to match the camera movement.
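To make that camera-sync idea concrete, here is a minimal, purely conceptual sketch (plain Python, not Unreal Engine code): on each captured frame, the physical camera’s tracked pose is copied onto a virtual camera so the rendered background moves with the real one. Every name in it is hypothetical.

```python
# Conceptual sketch only: a tracked physical-camera pose drives a virtual camera
# each frame, so the LED-wall background's parallax matches the real camera move.
from dataclasses import dataclass

@dataclass
class Pose:
    x: float        # position in meters
    y: float
    z: float
    pan: float      # rotation in degrees
    tilt: float
    roll: float

@dataclass
class VirtualCamera:
    pose: Pose
    focal_length_mm: float = 35.0

def read_tracked_pose(frame: int) -> Pose:
    """Stand-in for an on-set camera-tracking system (optical or encoder based)."""
    return Pose(x=0.0, y=0.1 * frame, z=1.6, pan=0.5 * frame, tilt=0.0, roll=0.0)

def sync_virtual_camera(cam: VirtualCamera, frame: int) -> None:
    """Copy the physical camera's tracked pose onto the virtual camera."""
    cam.pose = read_tracked_pose(frame)

cam = VirtualCamera(pose=Pose(0.0, 0.0, 1.6, 0.0, 0.0, 0.0))
for frame in range(3):                  # one iteration per captured frame
    sync_virtual_camera(cam, frame)
    print(frame, cam.pose)
```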
Toonen worked as the Unreal Operator on the set of Netflix’s “Avatar: The Last Airbender” at the WFW virtual production stage in Burnaby, Canada (operated by Pixomondo). It’s a 2,200-square-meter, 310° virtual production stage, with 2,500 LED wall panels and 760 LED ceiling panels. Toonen spent over six months on the Volume, working through numerous fantasy-based backgrounds in UE5. Seeing how the backgrounds work with the set and costumes is important, but getting a realistic, cohesive look in-camera begins with pre-production collaboration among the art director, set designers, and virtual art department. The digital background is just one aspect of the whole production.
As Toonen cautioned, “You can’t just show up with backgrounds that a visual effects studio built, put the actors in front of it, and think you’re going to shoot it. The whole goal of using an LED Volume is to achieve in-camera final visual effects and not replace everything after the fact. There needs to be a clear goal from the start. It’s not just digital backgrounds. The best virtual production shoots are a marriage between the practical and virtual, and that can’t just come from one department. It comes from everyone looking at the set and the scene we’re trying to make, and finding the best version.”
He offers a concrete example of an ocean scene: a ship deck is built on set and a virtual sky and ocean are in the background. They work separately, but to make them work well together, there needs to be some interaction between them. “By wetting down the floors, or wetting down the props and the set, you’re going to get all the real reflections from the LED walls. Something small like that really starts to blend the two. Some of the most successful sets — like Pixomondo’s work on ‘Star Trek: Strange New Worlds’ and ‘Discovery’ — have a lot of reflections, such as completely reflective floors, because it just adds that extra parallax and movement and realism. That’s something you could only capture in real life. If it were shot on green screen, it wouldn’t have those natural reflections. But again, you have to start as early as possible with the visual effects team, lighting team, cinematographer, and director working together to achieve it all in camera.”
To help a virtual production look realistic, points where practical and virtual elements meet, such as the floors, should match perfectly. While on set, Toonen does a process called “blending” in UE5. He explained, “You have a lot of creative tools in Unreal Engine 5, like one to color-correct regions. You can color grade parts of your virtual scene — make them darker, brighter, change the color temperature, or change the color entirely — just like you can in DaVinci Resolve or any compositing software. This allows you to blend the practical and virtual seamlessly.”
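As a rough illustration of that kind of on-set grade, here is a minimal sketch using the Unreal Editor’s Python scripting, with a whole-scene Post Process Volume standing in for the region-based color-correction tool Toonen describes. The values are made up, and the class and property names follow the editor’s Python bindings as commonly documented, so treat them as assumptions rather than a production recipe.

```python
# Run inside the Unreal Editor's Python console (Python scripting plugin enabled).
# Simplified stand-in for on-set "blending": grade the virtual scene warmer and
# slightly darker so it sits with the practical set.
import unreal

volume = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.PostProcessVolume, unreal.Vector(0.0, 0.0, 0.0))
volume.set_editor_property("unbound", True)  # affect the whole scene, not just a box

settings = volume.get_editor_property("settings")
settings.set_editor_property("override_white_temp", True)
settings.set_editor_property("white_temp", 4500.0)            # warm toward the practical lighting
settings.set_editor_property("override_color_gain", True)
settings.set_editor_property("color_gain",
                             unreal.Vector4(0.9, 0.9, 0.9, 1.0))  # pull it slightly darker
volume.set_editor_property("settings", settings)
```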
Also, lights can be added to the virtual world in UE5 to match the practical set. For instance, if a lighting team adds a bright hair light behind an actor, Toonen can add a light that is the same color temperature in the virtual background. “I can add that around the actor, or to the top of different buildings behind them, just to take it to that next level and do things you would be doing in post. We have the flexibility to pull that off on-set,” he said.
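A minimal sketch of that light-matching step, again via the editor’s Python scripting: spawn a virtual spot light and set it to the practical hair light’s color temperature. The placement, intensity, and 4500 K figure are made-up examples, and the API names should be treated as assumptions.

```python
# Run inside the Unreal Editor's Python console.
# Add a virtual light that matches a practical hair light's color temperature.
import unreal

hair_light = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.SpotLight, unreal.Vector(0.0, -300.0, 250.0))  # placement is a made-up example

light_comp = hair_light.get_component_by_class(unreal.SpotLightComponent)
light_comp.set_use_temperature(True)
light_comp.set_temperature(4500.0)   # match the gaffer's practical hair light
light_comp.set_intensity(5000.0)     # then tweak live while watching the in-camera image
```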
It’s important to note that drastic lighting changes (such as changing day to night) will require “light baking” in UE5. That means pre-computing all the lighting across an entire environment so it will run efficiently (at the correct frame rate in real-time) on the LED screens. Toonen said, “Sometimes the images are 28K resolution when displayed on the entire surrounding screen. These gigantic images need to run at 48 frames per second (fps) or higher. If it runs less than 24 fps (the camera frame rate), then you can’t use any of that footage. So when preparing these environments for the virtual production stages, we need to be very intentional about what we’re trying to bring to life so that it renders all in real-time. That’s probably the most overlooked part, and it’s the thing we have to fight with the most.”
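The figures Toonen quotes translate directly into a per-frame rendering budget; the quick arithmetic below (illustrative only) shows how little time each frame actually gets.

```python
# Back-of-the-envelope timing for real-time playback on an LED Volume,
# using the figures quoted above (illustrative only).
target_fps = 48      # playback rate the wall content is built to hit
camera_fps = 24      # camera frame rate; below this the footage is unusable

frame_budget_ms = 1000.0 / target_fps
hard_floor_ms = 1000.0 / camera_fps

print(f"Budget per frame at {target_fps} fps: {frame_budget_ms:.1f} ms")  # ~20.8 ms
print(f"Absolute ceiling at {camera_fps} fps: {hard_floor_ms:.1f} ms")    # ~41.7 ms
# Everything in the environment (geometry, lighting, effects) has to render inside
# that budget, which is why heavy lighting work is often pre-computed ("baked").
```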
REAL-TIME CHANGES
Toonen started his career as a visual effects compositor, working on Hollywood blockbusters like “Alien: Covenant,” “Deadpool 2,” “Captain Marvel,” and “Star Wars: Episode IX — The Rise of Skywalker” at visual effects studios such as Industrial Light & Magic, Framestore, Luma Pictures, Pixomondo, and MPC (a division of Technicolor Creative Studios). Initially, he used traditional visual effects software, but discovering Unreal Engine changed everything for him. “In traditional visual effects workflows for a Hollywood movie, it can take anywhere from one hour to 24 hours to render out a single frame. On an animated movie, you might wait a whole day to get one single frame of footage. Visual effects studios use render farms — thousands of computers just rendering one frame at a time. With Unreal Engine, you have real-time rendering. You don’t wait for anything. The coolest part about UE5 is that you can make creative changes in real-time on set; you can literally change the entire background, or rotate the entire scene, or change the lighting,” he said.
For Toonen, being an Unreal Operator is akin to being a “live” compositor — someone who makes creative visual effects changes on set as opposed to in post. “It’s a very different pace, one that I enjoy. As traditional compositors, we would have days or weeks to finish one single shot in post, but on set we have to finish an entire sequence in a single day,” said Toonen.
The benefits of using UE5 during production even carry over into post-production. As Toonen explained, “In the Unreal scenes, we can take snapshots of the exact setup of the lights and the exact camera position used for a shot, and export that to the visual effects studios. Now, they can start with that same setup and then add in whatever they need, pushing things further in post.”
According to Toonen, LED Volumes and Unreal Engine are best used for achieving in-camera final visual effects on medium close-ups and tighter shots. Wider shots benefit more from visual effects in post. “Seventy percent of the shots on a virtual production stage are close-ups with an out-of-focus background,” he said. “Those are very forgiving. You’re relying on an interesting-looking lens to do most of the work for you. And honestly, those are often the most successful shots. Filmmakers should lean into making those close-ups look really great to get those in-camera finals. Following that methodology will save money for those wide shots at the end.”
UE5 is also a powerful tool for the pre-production stage. For “Dune: Part Two,” cinematographer Greig Fraser used Unreal Engine to pre-plan shots based on the natural lighting conditions of the filming location in the Jordan desert. “Instead of going out to locations to scout them, you can import Google Maps into Unreal and see a 3D model of the location,” explained Toonen. In Unreal Engine, Fraser could place characters and set pieces into the environment and see precisely how the sunlight and shadows would look for a specific time of day.
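As a rough stand-in for that kind of sun-position planning (not Fraser’s actual pipeline), the sketch below uses the third-party Python package astral to compute where the sun sits over a desert location at a chosen shoot time, the same question a sun-position tool inside Unreal answers visually. The coordinates and time are illustrative.

```python
# Where will the sun be for a given location and shoot time?
# Requires the third-party 'astral' package; values below are illustrative.
from datetime import datetime
from zoneinfo import ZoneInfo

from astral import LocationInfo
from astral.sun import azimuth, elevation

location = LocationInfo("Wadi Rum", "Jordan", "Asia/Amman",
                        latitude=29.57, longitude=35.42)
shoot_time = datetime(2024, 3, 15, 16, 30, tzinfo=ZoneInfo("Asia/Amman"))

print(f"Sun azimuth:   {azimuth(location.observer, shoot_time):.1f} deg")
print(f"Sun elevation: {elevation(location.observer, shoot_time):.1f} deg")
# In a previz scene, the same azimuth/elevation would drive a directional light so
# the shadows match what the real location will give you at that hour.
```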
Toonen added, “Greig has been working with Epic Games and Unreal Engine for a while now. There’s a great project from Unreal called MetaHumans that anyone can use for free to make realistic characters. Greig worked with several Unreal Engine artists to create MetaHumans Lighting, a bundle of six lighting presets. So you can put your MetaHuman model(s) and any other assets (like a set pre-built in 3D software) into a location and start to visualize lighting setups. You can work in real-time and light the entire scene with the same lights you’d be using on the real film set. You get a super clear idea of how it’s going to look way in advance, before you’ve started constructing a single thing. It opens up the tools for everyone that’s on set.”
USING THE BEST TOOLS
Unreal Engine 5 is free to download and use, and the UE Marketplace has free assets, plugins, textures, blueprints, learning tools, and other materials. “There’s a huge asset library by Quixel that’s free with Unreal. These assets were captured with photogrammetry. So artists captured images of real objects, like buildings, props, rocks, caves, cliffs, and so on, and created assets from those, which you can then use to build environments in Unreal,” Toonen said. Assets can also be created in other visual effects programs, such as Autodesk’s Maya, SideFX’s Houdini, Blender (open-source software), etc., and imported into UE5. Artists can use their best tools for asset creation and aren’t limited to only working within UE5.
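Bringing an externally authored asset into a project can even be scripted. Below is a minimal sketch using the Unreal Editor’s Python bindings; the file path is hypothetical, and the API names, while commonly documented, should be treated as assumptions.

```python
# Run inside the Unreal Editor's Python console.
# Import an FBX authored in Maya, Houdini, or Blender into the project.
import unreal

task = unreal.AssetImportTask()
task.set_editor_property("filename", "C:/exports/ship_deck_props.fbx")      # hypothetical export
task.set_editor_property("destination_path", "/Game/Environments/OceanSet")  # hypothetical folder
task.set_editor_property("automated", True)   # skip the interactive import dialog
task.set_editor_property("save", True)        # save the imported assets to disk

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
```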
Learning UE5 — a powerful game engine used to create AAA game titles — won’t happen overnight. In reality, proficiency might take months or even a year. Toonen admits to giving up the first time he tried to learn it. But after seeing how Unreal Engine was used for “The Mandalorian,” he redoubled his efforts. The benefits of real-time rendering and lighting, virtual scouting, pre-visualization, and live compositing outweighed the time commitment needed to learn UE5. “Just be patient with your progress. There are a lot of pitfalls at the beginning, but this is a tool that you can use for the next five to 10 years; it’s that powerful,” Toonen added.
One of the biggest roadblocks for Toonen was understanding how all the systems in the engine work together. “At first, I didn’t know why what I was seeing in the viewport wasn’t what rendered out when I tried to make a movie to put in my editing timeline. And it wasn’t obvious why they were mismatching,” he said. “That’s why it’s so important to understand the game engine side of things. The best way to start learning Unreal for filmmaking is actually to start by making a really simple game. If you skip over that first step of learning game development entirely, I can almost guarantee you’re going to hit a roadblock and you’re going to have no idea how to fix it because you just haven’t used UE5 like a game engine.”
There are YouTube tutorials for learning UE5 for both game creation and filmmaking, a resource Toonen used while learning. But he warned, “Unreal Engine makes significant updates every couple of months, and if you’re just jumping into Unreal, you have to go through the maze of old forum articles and pages. The troubleshooting resources for previous versions of Unreal don’t always translate to the current version.”
The arduous challenge of learning a game engine for filmmaking led Toonen to start Unreal for VFX (www.unrealforvfx.com), a website that offers online training courses that teach the ins and outs of UE5. “It took me a long time to figure out Unreal Engine and to learn what I have so far. I want to help people shave off six months, a year, or even more, and shortcut them through that experience so they can get into the fun, creative parts of filmmaking using Unreal,” said Toonen.
He also helps instruct filmmakers hands-on. Last year, he was involved with the Virtual Production Workshop hosted by the CSC (Canadian Society of Cinematographers) and sponsored by IATSE 667. He worked with Pixomondo to design virtual backgrounds for Unreal Engine and helped teach filmmakers about virtual production on an LED Volume.
He also worked on a “Ghostbusters” short film with Canadian–American filmmaker Jason Reitman and director Gil Kenan (“Ghostbusters: Frozen Empire”). For this collaboration between Sony and Pixomondo, they used UE5 to recreate downtown New York City where Ecto-1 could drive through the streets and face off against the Stay Puft Marshmallow Man.
Toonen said, “It was a great way to get this technology into the hands of filmmakers. The animations were done beforehand, so Jason [Reitman] could go in as a camera operator and shoot in this digital space yet still get that live-action handheld look. Jason is definitely someone who loves live-action and working with actors, but we could show him that even for heavy CGI or visual effects sequences, he can be the one holding the camera. We can help bring that vision to life as a collaboration between the visual effects studio and the filmmakers.”
FOCUSING ON THE CREATIVE SIDE
There are many powerful tools and features built into UE5. As a visual effects artist, Toonen feels Lumen (the real-time lighting system) is one of the most powerful for filmmaking. Lighting scenes using traditional visual effects software takes time because making a change means waiting for the image to render. “You move this light and wait three to five minutes. It’s almost like a real film set where you wait for people to physically move lights around. With Unreal, you can make creative lighting decisions in real-time and see exactly how it will look through a real camera,” he said. “You can put a MetaHuman or 3D character into a 3D space in Unreal and then light it. You can move a hair light higher up or right above the actor instead of behind them. You can change the light color, or swing it around. All those small things that are very stressful to do in a short amount of time on a film set can be done in real-time in Unreal, and planned out before production starts. You get instant results so you can focus on the creative side of it.”
Toonen added, “Whether you work on a film set every day or are a visual effects artist, using UE5 will train your creative eye for better lighting results. I learned the most about lighting in Unreal because the response is so fast and you get accurate bounce light.”
There are also useful third-party tools for UE5. Toonen recommends DASH from Polygonflow, which uses basic physics simulations to scatter multiple copies of an object (such as rocks or bricks) into an environment, and Ultra Dynamic Sky (sold by Everett Gunther in the UE Marketplace), a Blueprint-based dynamic sky system with natural cloud motion and a customizable sun, moon, and stars.
“Honestly, Unreal has numerous features built into the engine itself. You just have to enable them. That’s a testament to Epic Games because Unreal works great right out of the box,” said Toonen.
LOWERING THE BARRIERS
UE5 is democratizing filmmaking by significantly lowering the barriers to entry. Besides being free to use, the capabilities covered in this article (virtual scouting for film locations, pre-visualization and pre-production planning for set construction, lighting, etc., and environment creation) merely scratch the surface of what’s possible. “UE5 lets directors and filmmakers tell bigger stories for a lower cost, and that’s across the board. That’s at the highest studio level and at the indie level as well,” said Toonen, who used UE5 to jumpstart his directing career.
One of his recent projects was a 10-minute-long music video for the metal band TesseracT, created using UE5, motion-capture actors, and a motion-capture camera. “With a team of just five artists, we were able to pull off a project that would have cost hundreds of thousands of dollars, and we did it in a short amount of time. We’re able to punch above our weight and to stretch ourselves creatively. The music video, ‘War of Being,’ now has a million views on YouTube,” Toonen said.
For visual effects artists and editors, UE5 provides more opportunities for face-to-face collaboration with filmmakers. “All of us in the visual effects industry love making films. We are clamoring for new ways to not be on a computer in a dark room. We have ideas to share, and UE5 allows us to share them faster, to share them on a film set. That’s honestly been the most exciting part,” said Toonen.
On the directing and camera-operating side, UE5 provides instant creative feedback for filming CGI-heavy sequences, as director Reitman experienced with his “Ghostbusters” short film. “They’re making creative decisions while holding a camera,” said Toonen. “It’s an opportunity to take filmmaking into the virtual realm, but they’re still holding that camera, they’re on an actual set working with creative people; we’re collaborating on set together. That’s been some of my favorite experiences in visual effects, period,” he said.
Jennifer Walden is a freelance writer specializing in post-production technology.