Inside Amazon Studios’ Huge New VVC Virtual Production Studio On Historic Culver City Lot
Amazon’s new Stage 15 facility uses volumetric capture workflows to streamline production projects.
New on-premises and cloud-based production and post workflows for feature films and episodic television are now being tested and deployed at Amazon Studios’ recently opened virtual production stage, dubbed Stage 15, in Culver City, Calif.
Stage 15 was originally built in 1940 and was home to productions that included the films “It’s a Wonderful Life,” “RoboCop,” “Airplane,” “The Three Amigos,” and “Armageddon,” as well as the popular TV series “Star Trek” and “Batman.”
The state-of-the-art volumetric video capture (VVC) space is the centerpiece of a newly formed Amazon Studios Virtual Production (ASVP) department that will produce a variety of projects for the Amazon Prime OTT video service and utilize Amazon Web Services tools where appropriate.
During a gala opening on December 6th, Amazon executives said that by launching the new production facility they were creating a new playbook for entertainment, much as the company has done in retail and book publishing. The company said it recognized a tremendous pool of creative and business talent that would flock to the large budgets and creative freedom being offered to select individuals there.
Clients that have already worked there are calling it “an invaluable storytelling tool.”
New Production Facility For Today’s Frenetic Workflows
Stage 15 combines the two largest former stages into a 34,000-square-foot mega-stage with a ceiling height of 46 feet. It includes a 5,000-square-foot LED wall that measures 80 feet in diameter and 26 feet tall.
There’s also 17,000 square feet of space dedicated to set construction and production support and a two-story “Sandbox” support stage/lab within the building that features a virtual location-scouting stage, a performance-capture stage, a tech-scouting stage, a green screen simulcam stage, and a client-facing VIP viewing area for visiting executives, filmmakers, and guests.
Amazon is hoping to have a significant impact on how episodic TV shows and films are created and produced on a sound stage.
This space also features a second, smaller LED stage, with a completely mobile LED wall, camera-tracking system, and control cart, along with an engineering workshop, scanning, 3D-printing, production workspace, and equipment storage.
Volumetric Capture Tied To The Cloud
Ken Nakada, Head of Virtual Production Operations at the new ASVP building, said that the division has a full-time executive, engineering, and creative team of 20 that has been operating in stealth mode since 2020 on the design, pipeline, and build-out of Stage 15. Working with the volume wall, production creatives can interact with digital assets and processes in a manner that mirrors live-action to enable digital world capture, visualization, performance capture, simulcam, and in-camera visual effects.
[“Simulcam” refers to a combined real and virtual camera system that superimposes live actors onto a virtual simulation in real time, allowing motion capture results to be viewed as they are shot.]
The ASVP VVC wall is composed of over 3,000 LED panels and includes two large XYZ-moveable doors and a “flyable” roving wall that accommodates a wide variety of virtual scenic backgrounds. There are over 100 OptiTrack motion capture cameras mounted and tightly synchronized throughout the space to capture the live action from all angles.
“We base 66 of the motion capture cameras in the inner volume,” said Nakada. “There is an additional 60 x 90 ft. contiguous outer volume area covered by 39 motion capture cameras for performances that move into and out of the LED volume. This configuration can be modified depending upon project need.”
A wide variety of filmmakers have been invited to come into the stage to explore different creative approaches using their own traditional filmmaking tools.
Filmmakers Encouraged To Come Play In The Sandbox
“Whether it’s cameras or lighting, we’re encouraging directors, cinematographers, production designers, and full creative teams to experiment using these tools in the virtual environment so that they understand the value of having access to virtual production tools and processes like AR, VR, mixed reality, and in-camera visual effects on set,” said Nakada.
He added that because virtual production mirrors live-action production, teams typically go through the same stages of project development found in conventional productions: look development, staging and shot composition, and storyboarding and animatics. From there, previsualization and visualization techniques are developed and employed to define how to shoot a project in the most efficient way.
“We identify what the physical components and the virtual components are, and how to combine the two for optimal final composites,” he said.
In terms of on-set technology, several different systems have been installed to control camera movement/tracking and how the LED backdrop/environment moves accordingly.
Camera Tracking Tightly Linked To Virtual Sets
Nakada said that these techniques are invaluable when the crew’s focus fluctuates between live action sequences and what's displayed on the LED wall.
“In terms of camera tracking, we are tracking the physical cinema camera and attaching that position and orientation data to the digital double camera in Unreal Engine,” he said. “So that whatever the real live action camera does, the virtual camera will project with perfect parallax as an extension of what’s happening on the physical set. Everything is in perfect sync and coordination, using tools like Unreal Engine and OptiTrack motion tracking. We are taking focus data off of the camera lens and translating that data into the Unreal Engine so that the background follows the focus of the physical lens.”
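The data flow Nakada describes can be sketched in a few lines of Python. This is an illustrative model only: the `CameraPose` type, the `to_virtual` function, and the stage-origin calibration are assumptions for the sake of the example, not the actual OptiTrack or Unreal Engine APIs, which in practice would be driven through vendor SDKs and Live Link.

```python
from dataclasses import dataclass


@dataclass
class CameraPose:
    """One frame of tracking data from the physical cinema camera."""
    x: float          # position in stage coordinates (meters)
    y: float
    z: float
    yaw: float        # orientation (degrees)
    pitch: float
    roll: float
    focus_m: float    # focus distance read off the physical lens


def to_virtual(pose: CameraPose, stage_origin=(0.0, 0.0, 0.0)) -> dict:
    """Map a tracked physical-camera pose onto the digital-double camera.

    The position/orientation are re-expressed relative to a calibrated
    stage origin, and the lens focus distance is passed through so the
    background on the LED wall follows the focus of the physical lens.
    """
    ox, oy, oz = stage_origin
    return {
        "location": (pose.x - ox, pose.y - oy, pose.z - oz),
        "rotation": (pose.yaw, pose.pitch, pose.roll),
        "focus_distance_m": pose.focus_m,
    }
```

Each tracker frame would produce one such pose, and the resulting parameters would drive the virtual camera so the wall renders with correct parallax for that exact viewpoint.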
Stage 15 is fully connected into the AWS cloud and is a key piece of the facility’s “production in the cloud” ecosystem. Each production has access to a camera-to-cloud workflow, with a direct connection from Stage 15 to S3 storage that makes dailies instantly available to editorial, VFX, sound, and post teams from any location. Every shot taken on Stage 15 ends up in the cloud in near real time, with the ability to safely and securely distribute assets around the globe as well.
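A camera-to-cloud handoff of this kind might look like the following minimal sketch. The bucket layout, key naming, and helper functions here are illustrative assumptions, not Amazon’s actual pipeline; the only real API used is boto3’s standard `upload_file`, which requires AWS credentials to run.

```python
from pathlib import PurePosixPath


def dailies_key(production: str, shoot_day: str, filename: str) -> str:
    """Build a predictable S3 object key (hypothetical layout) so that
    editorial, VFX, and sound teams can locate each take by production
    and shoot day."""
    return str(PurePosixPath("dailies") / production / shoot_day / filename)


def upload_take(local_path: str, bucket: str, key: str) -> None:
    """Push one camera file to S3. boto3's upload_file automatically
    uses multipart transfers for large camera originals."""
    import boto3  # requires configured AWS credentials
    boto3.client("s3").upload_file(local_path, bucket, key)
```

With a consistent key scheme like this, downstream teams can subscribe to new-object notifications on the bucket and pick up dailies the moment each take lands.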
When required, the crew shoots outside the building for things like driving shots: they capture plates of city streets and environments, project them on the LED wall, then shoot the driving scenes on the VVC stage.
“We can also capture locations, whether an existing site or a difficult-to-access location, and bring them onto the LED wall to create very realistic facsimiles of location shoots,” Nakada said.
Another benefit of shooting on a VVC stage is that setting up and striking a new set is relatively easy because there’s less physical set to build. Nakada said they are constantly refining physical and digital workflows to expedite stage load-ins and load-outs.
VVC Production Is All About Creative Control
Cost aside, the real advantage with virtual production for filmmakers is “creative control.”
“You can change things in your background much easier,” Nakada said. “With enough preparation we can change lighting or time of day on the fly. In editorial, the editor doesn’t have to deal with a temporary green-screen composite, but immediately can have in-camera visual effects composites to work with.”
Among the biggest benefits of this VVC process, he said, is full immersion for the actors: there is no need to ask them to imagine the environment they’re acting in or what they’re reacting to, because they can immediately see where they are and respond to what’s in the scene. Likewise, the director, DP, and production designer can all make creative decisions on lighting and composition on set without having to wait for a process much further down the line in postproduction.
“They get a clear vision of the final picture while shooting on set,” said Nakada.
What’s also clear is that virtual production is gaining momentum and has evolved into a new formula for making productions faster and less costly than traditional Hollywood studio methods. It is a disruptive model that fits well with Amazon’s long history of disruptive business models.