Virtual Production At America’s Premier Film And TV Production School

The School of Cinematic Arts at the University of Southern California (USC) is renowned for its wide range of courses and degrees focused on TV and movie production and all of the sub-categories that relate to both disciplines. Following real-world industry trends, as it always has, last fall the school launched its inaugural (and highly popular) series of classes on virtual production—that is, shooting in-camera effects against a large LED wall.

For its new course, called “Virtual Production and LED Volumes,” the school has retrofitted one of its existing sound stages on campus with a new 20ft wide by 11ft tall LED back wall, cameras, software, and other equipment, all to support a curriculum dedicated to the latest method of on-set production.

TheBroadcastBridge.com sat down with Bradley Kean, Director of Creative Technology, The School of Cinematic Arts, USC, to talk about what students are learning, their prospects for getting a job in the industry and how collaboration in this discipline is the best way to teach this complex yet awe-inspiring production method.

TheBroadcastBridge.com: Why is it important to add virtual production to your curriculum?

Bradley Kean: Virtual Production expands the types of stories that students can tell and prepares them for the tools they’ll encounter in the industry.

It’s the first time students have been able to use digital visual effects where the camera is capturing “final pixel”. That means you don’t have to do any post-production visual effects work. The LED wall is high enough quality that the background is convincing and the entire shot is done, hence “final pixel”.

You can still tweak the shot if you’d like, but the idea is you don't have to. Some call this technique in-camera visual effects.

As far as our curriculum is concerned, this has changed the type of students that can use digital effects. Previously, there’d be a bunch of post-production visual effects work, so students would have to be interested in that kind of work. The fact that it is all captured on the stage frees you up to just be more creative. There's all kinds of students—writers, producers, editors— that could benefit from this class and learn the technique. Ultimately, it's up to them how technical they want to get.

Brad Kean, Director of Creative Technology, The School of Cinematic Arts, USC.

TheBroadcastBridge.com: So, how are you structuring the courses?

Kean: One of the hallmarks of our school is that we're really concerned with the fundamental concepts of being a good storyteller. If you're interested, you can learn the technology and tools, but it's all supposed to serve “How can I tell a better story?” and “What are the tools available to tell that story?”

The difference is that it's not an obsession with the specific tools, because as we know, technology changes all the time. We want to make sure that five, ten years from now you can apply these fundamental techniques to whatever version of the technology is out then, right? So rather than being a subject matter expert, we’re teaching students to be storytellers who are comfortable embracing new technologies because that will serve them throughout their whole career. 

TheBroadcastBridge.com: When you discuss virtual production, do you tell students it saves filmmakers money, or are they doing it because they can actually have more control over the pixel, like you mentioned earlier? What's a driving force to produce a TV show or movie on an LED stage?

Kean: The cost question is interesting because the financials are so different around a film production in the industry versus at a school like USC. A student can save money because they can grab some pre-built digital environments that are readily available. Because you're not building a physical set, there’s a huge financial benefit to doing it this way. It would save a student money for sure. In the industry, you need a whole virtual art department to build that digital environment, crafting it all bespoke out of assets made specifically for that production.

However, with all that planning in the run-up to production, it does become quite expensive. As that workflow gets streamlined there will be cost savings in the industry as well. Our inaugural LED wall class was last semester [2023] in the fall, and the students created some really interesting projects that benefited from use of the wall. But my understanding is that this kind of work in the industry is still fairly expensive.

I think the driving force is creative freedom and that you can see the actual environment you are shooting in, in real time. It really helps with visualization and creating an immersive experience for both the audience and the people making the TV show or movie. Looking through the camera’s lens, you really know what your shots are going to be. So rather than just looking at an animatic or a storyboard, you see the final shot or close to the final shot.

With new technologies like our Sony Crystal LED screen, you're seeing what the shot truly is because that LED screen is such high quality. The pixel pitch is 1.5mm, and it uses the same color science as Sony's cinema cameras. So, you're seeing the finished shot as you shoot it. You're not seeing an estimation of it. It also gives you immersion in terms of the light that's cast from it. 
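As a back-of-the-envelope check, the wall dimensions and pixel pitch quoted in the interview do imply roughly a 4K-wide canvas. A quick sketch (the 20ft x 11ft and 1.5mm figures come from the interview; the arithmetic is just unit conversion):

```python
# Pixel count implied by a 1.5 mm pixel pitch across a
# 20 ft x 11 ft LED wall (figures from the interview).
FT_TO_M = 0.3048   # feet to meters
PITCH_M = 0.0015   # 1.5 mm between LED centers

width_px = round(20 * FT_TO_M / PITCH_M)
height_px = round(11 * FT_TO_M / PITCH_M)

print(width_px, height_px)  # about 4064 x 2235, i.e. roughly 4K wide
```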

TheBroadcastBridge.com: Explain that.

Kean: Traditionally, we’d use green screen, which is great for flexibility but it does cast a green spill onto the actors. We try to do all kinds of things to try to minimize that in photography and post. It can diminish the realism and breaks the illusion that actor is really in that space. But with this new LED wall technique, because the environment is behind the actors, it's casting the ‘true’ color of what the environment is onto them.

Also, the whole crew and the actors can turn around and look and see what the environment is. So it's really bringing everyone together on the same page. And it really makes actors more comfortable. So between wardrobe, production design staged in front of the wall, and the infinite depth of the environment behind you, you really get a feel for and can precisely control the environment you are shooting in. 

TheBroadcastBridge.com: Let's talk a little bit about the curriculum you've designed. It’s called “Virtual Production and LED Volumes.” If I'm a Production student, do I have to take other courses as a prerequisite? And what degree do I come away with?

Students of the “Virtual Production and LED Volumes” class learn the full gamut of disciplines involved with virtual production.

Kean: The School of Cinematic Arts at USC has a number of divisions and degree programs. There's producing, production (which includes all the major crafts), writing, games, media theory, etc. Students from any of those disciplines can participate if they have the prerequisites required. So it might be a student focused on directing that’s also interested in virtual production. Or it might be someone focused on writing or producing games, but they can all come together and bring their own contribution to a virtual production project. The cross-disciplinary collaboration is one of my favorite aspects of the school. 

TheBroadcastBridge.com: This course requires that students are familiar with real-time rendering engines like Unreal Engine and Unity. Why is that?

Kean: Films made with this technique combine the high-speed intensity of on-set production with the real-time rendering power of a game engine. If you’ve never used a game engine before it might bottleneck the process. Due to the school’s legacy of being so strong in game development, we have courses in both Unity and Unreal Engine. And we've been doing that for 20 years. So that area is comfortable for us. With the virtual production and LED volume classes, we’re bringing that knowledge onto the stage environment to combine it with live-action principal photography. The class has several instructors teaching together including a real-time engine instructor, a cinematography instructor, an editing instructor, a production design instructor, and a directing instructor. Students get to learn how all these crafts come together and by the end of the course they’ve made a short film using the LED wall.

TheBroadcastBridge.com: What do you tell students is most important to virtual production?

Kean: Great question. On a technical level, it’s understanding how to bring all these tools together for maximum effect. Using a real-time rendering engine in tandem with physical camera tracking is what makes these shots most convincing because that's the only way to bring parallax into the background.

That’s the thing that's different from using this software just for game design. We have to take the physical camera that we use to shoot the movie and track it and send that tracking information into the game engine so that the digital camera moves exactly like the real physical camera. So you have two worlds that are married together. You have the physical world, the camera and the actors and the director and everyone standing around on set.

And then there’s the digital environment existing in the game world and displayed on the LED screen in high enough quality to look real. So that's the new technique that this class offers; how to marry those two worlds together in real-time. It all has to happen in less than a 24th of a second for them to be in sync.
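To make the "two worlds" idea concrete, here is a minimal, hypothetical sketch of the per-frame handoff: the tracked pose of the physical camera is copied onto the engine's virtual camera, and everything must fit within a 1/24th-of-a-second frame budget. All class and function names here are illustrative, not any real engine's API:

```python
# Hypothetical per-frame sync between a tracked physical camera
# and a game engine's virtual camera (illustrative names only).
FPS = 24
FRAME_BUDGET_MS = 1000.0 / FPS  # ~41.7 ms to track, render, and display

class VirtualCamera:
    """Digital camera inside the game engine."""
    def __init__(self):
        self.position = (0.0, 0.0, 0.0)  # meters, stage space
        self.rotation = (0.0, 0.0, 0.0)  # pitch/yaw/roll, degrees

    def apply_pose(self, pose):
        # Mirror the physical camera exactly so the background
        # rendered to the LED wall shows the correct parallax.
        self.position = pose["position"]
        self.rotation = pose["rotation"]

def on_tracking_sample(camera, pose):
    """Called once per frame with the latest tracker reading."""
    camera.apply_pose(pose)
    # ...the engine then renders the environment from this pose
    # and sends the frame to the wall within FRAME_BUDGET_MS.
```

If tracking, rendering, and display together overshoot that budget, the physical and digital cameras drift out of sync and the parallax illusion breaks.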

It’s an interesting class because you learn all of these different technologies. You have the LED wall, you have a camera tracking system, you have a game engine, and you have a real physical camera. All of them have to work together precisely. 

The LED screen can be controlled by the image render engine to create synchronized effects in real time.

TheBroadcastBridge.com: So the key is developing a workflow where all of the technologies work together in sync?

Kean: What you get from that is a moving camera where the background moves and you have full parallax. Until you have all those systems talking to each other and working together, you don't have the magic. The magic is when the camera moves and the environment moves with it. That's really the goal of this class. The skills you learn—the basic film set skills and a real-time game engine—all those prerequisites build to this final project. 

TheBroadcastBridge.com: Let's talk about the physical production stage you’ve retrofitted with a new LED screen on campus (and just opened in the fall of 2023).

Kean: The main LED wall that we have is Sony’s Crystal LED display. It’s 20ft wide by 11ft tall. The resolution is 4K, which works perfectly because it plays nicely with everything we've done before. So it's a very manageable technology to implement. We also have a lot of very high-end hardware driving it. We're using HP’s Z6 workstations with NVIDIA RTX 6000 Ada Generation graphics cards to run the wall. There’s virtually no latency, which is also very important to successful virtual production. It's critical to avoid stutter or other aliasing effects. With this setup we know we can generate the best photoreal backgrounds possible.

We actually have multiple workstations that can work in tandem. So some students could be working in the game engine, moving things around the scene and prepping the environment. An additional workstation is actually rendering what the game engine’s virtual camera sees and sending that image to the wall.

We also have a separate motion capture stage down the hall. So we could even have actors performing there, driving digital characters that appear in the virtual environment on the LED wall stage. We invested heavily in a dedicated high-speed computer network so these rooms can talk to each other easily.

TheBroadcastBridge.com: Besides the large LED wall, how big is the production sound stage?

Kean: That stage is 48ft by 42ft. There's a neighboring green screen stage and a partition between them that can be retracted so you could double the width of the stage to 96ft if you wanted to shoot the LED wall from even further away. Those stages are located in the Robert Zemeckis Center for Digital Arts and we’ve been using them constantly for about 25 years, regularly upgrading them with the latest technology. 

TheBroadcastBridge.com: Another thing that you teach as part of this class is sound design. How is that part of a virtual production?

Kean: Interestingly, there are challenges within LED volume stages where you can get sound reflections off the wall if you don't record the audio carefully. You have to be mindful of how the sound interacts with the set and sound stage. So there are extra things to learn, besides what you know about traditional production sound on a film set.

TheBroadcastBridge.com: What about lighting a virtual production stage?

Kean: Lighting is really exciting to work with in this class because we purchased LED light fixtures that are married to the digital environment we created. The game engine will tell the light what color to display, based on what the environment looks like. Light fixtures are now packed with technology and they live on the computer network and receive live control data. We use smart lights from Nanlux and Kino Flo. We have the Kino Flo MIMIK lights, which look almost like a low resolution TV capable of displaying a moving image. We’re sending it video of the environment. So if the environment is moving on the LED wall, the lighting can move to match.

There's some programming setup required to put it on the network and have it talk to the rendering engine. But then the color from the fixture is all dynamically driven by the rendering engine. So you don't have to change the light manually. The rendering engine will feed it in real time. 
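To illustrate the kind of data the render engine pushes to a networked fixture, here is a hypothetical sketch of just the color-scaling step: an environment color sampled by the engine (0.0-1.0 floats) is converted into the 8-bit channel levels an RGB fixture expects. The function name is made up, and real setups typically carry these levels over a lighting-control protocol such as DMX over the network; that transport layer is omitted here:

```python
# Hypothetical sketch: scale a color sampled from the digital
# environment (0.0-1.0 floats from the render engine) into the
# 8-bit channel levels an RGB light fixture expects.
def rgb_to_channel_levels(rgb):
    """Clamp each component to [0, 1] and scale to 0-255."""
    return [round(max(0.0, min(1.0, c)) * 255) for c in rgb]

# Each frame, the render engine could push fresh levels so the
# fixture's color follows the environment shown on the wall.
levels = rgb_to_channel_levels((1.0, 0.25, 0.0))  # warm sunset sample
```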

At USC students use the latest professional production tools, including motion capture systems to produce a 15-minute short film.

TheBroadcastBridge.com: You also teach students how to calibrate motion capture as part of the course?

Kean: Yes, exactly. And we’re using the same OptiTrack system to track the physical camera that we’re using on our Motion Capture stage for performance capture. That was incredibly helpful because we’ve been teaching Motion Capture for over 15 years and are very comfortable with that technology.

There are two ways to do tracking. You can do what’s called inside-out tracking, where a device placed on the camera looks at the environment and figures out where it is in 3D space, usually with the help of tracking markers placed on the ceiling. Or there’s outside-in tracking, where you have tracking cameras placed around the environment, looking at the cinema camera to figure out where it is. OptiTrack is an example of outside-in tracking, and the advantage is that it scales up to any size of stage.
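A toy illustration of the outside-in idea: two tracking cameras each report a bearing ray toward a marker on the cinema camera, and the marker's 3D position is recovered near where the rays come closest. This is a simplified two-camera geometry sketch, not how OptiTrack's solver actually works; real systems fuse many cameras and markers:

```python
# Toy outside-in triangulation: two tracking cameras each see a
# marker and report a ray (camera position + unit direction). The
# marker sits where the rays come closest; we take the midpoint of
# the two nearest points. Real systems fuse many cameras/markers.
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def triangulate(p1, d1, p2, d2):
    """Closest-point midpoint between rays p1 + s*d1 and p2 + t*d2."""
    w0 = [x - y for x, y in zip(p1, p2)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b          # zero only if the rays are parallel
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    q1 = [p + s * u for p, u in zip(p1, d1)]
    q2 = [p + t * u for p, u in zip(p2, d2)]
    return [(x + y) / 2 for x, y in zip(q1, q2)]
```

With noise-free rays the two closest points coincide exactly at the marker; with real measurement noise the midpoint is a reasonable estimate.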

So, what our students are learning will work with any production in the real world. Pixomondo, the Sony Pictures-owned VFX and virtual production company, operates a professional LED volume (“Stage Seven”) on the Sony lot in nearby Culver City. The Pixomondo volume uses the same LED panels as those deployed in the USC stage, but on a grander scale. As an industry-standard facility, it uses proprietary technology for content optimization and management and OptiTrack for camera tracking. The knowledge our students attain in the principles of ICVFX (in-camera visual effects) would allow them to go to the Sony lot tomorrow and shoot in an LED volume of professional scale, employing some of the same techniques as Hollywood filmmakers.

TheBroadcastBridge.com: So when students graduate, what are the prospects for them to get a job in the field?

Kean: Within the field of virtual production, the prospects are great at the moment. We actually have companies coming to us all the time saying we're desperate for people that know these techniques. Some of our faculty work in the industry on huge feature films in the virtual production space. And they'll hire some of their students even before they graduate.

So, there's a really great pipeline there. When I went to film school 25 years ago, the going wisdom was, “They always need sound people. So if you learn sound, you'll have a job right out of school.” Right now it’s virtual production. So if you learn virtual production, you will have job offers immediately.