Mo-Sys Takes Virtual Production To A Truer Focus

Mo-Sys Cinematic XR Focus enables Cinematographers to seamlessly pull focus between real and virtual worlds.

Until now, Cinematographers who needed to pull focus between real foreground objects – such as actors – and virtual objects displayed on an LED wall – such as a car – have been unable to do so, because the lens focal plane stops at the LED wall and the car always remains out of focus.

Now, with Cinematic XR Focus, Focus Pullers can use the same wireless lens control system they are used to and pull focus seamlessly from a real object, through the LED wall, onto a virtual object that appears to be positioned behind the wall. The reverse focus pull is also possible.

Cinematic XR Focus is an option for Mo-Sys’ virtual production software VP Pro, working with Mo-Sys’ StarTracker camera tracking technology. It synchronises the lens controller with the output of the Unreal Engine graphics, relying on StarTracker to continuously track the distance between the camera and the LED wall. The solution is available from Mo-Sys and is compatible with Preston wireless lens controllers (Hand Unit 3 and MDR-3).
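To make the mechanism concrete, here is a minimal sketch of how a single focus command might be split between the physical lens and the virtual camera using the tracked camera-to-wall distance. It is written in Python purely for illustration; the function, class and parameter names, and the hand-off logic itself, are assumptions for explanation and do not reflect Mo-Sys’ actual VP Pro, StarTracker or Preston implementation.

```python
# Conceptual sketch of a focus hand-off between a physical lens and a
# virtual (Unreal Engine) camera. All names and the split logic are
# illustrative assumptions, not the vendor's actual implementation.

from dataclasses import dataclass

@dataclass
class FocusCommand:
    physical_focus_m: float  # distance to drive the real lens to, in metres
    virtual_focus_m: float   # focus distance for the virtual camera, measured from the LED wall

def split_focus(requested_focus_m: float, camera_to_wall_m: float) -> FocusCommand:
    """Map one hand-unit focus distance onto the real lens and the virtual camera.

    If the requested distance is in front of the wall, only the real lens moves.
    If it is beyond the wall, the real lens is parked on the wall surface
    (keeping the displayed image sharp) and the remaining distance is handed
    to the virtual camera so the in-engine object defocuses and refocuses.
    """
    if requested_focus_m <= camera_to_wall_m:
        # Real-world subject: focus the physical lens; the virtual focal
        # plane effectively stays at the wall.
        return FocusCommand(physical_focus_m=requested_focus_m,
                            virtual_focus_m=0.0)
    # Virtual subject "behind" the wall: hold the physical lens on the wall
    # and push the excess distance into the engine's depth-of-field focus.
    return FocusCommand(physical_focus_m=camera_to_wall_m,
                        virtual_focus_m=requested_focus_m - camera_to_wall_m)

# Example: camera is 4 m from the wall and the Focus Puller dials 9 m –
# the lens focuses on the wall and a virtual car 5 m "behind" it is rendered sharp.
print(split_focus(9.0, 4.0))
```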

“Production companies have excitedly embraced virtual production and on-set finishing using LED volumes. The ability to create any internal or external set background or set extension with an LED volume has truly changed the dynamics of film making,” said Michael Geissler, CEO of Mo-Sys.

“But there have been limitations. Pulling focus – a fundamental part of the grammar of movies, used to direct the audience’s attention to different parts of the screen – has been difficult. The Cinematic XR Focus software add-on transforms the possibilities, allowing Cinematographers to freely realise their creative ambitions.”

Recent advances in LED display technology mean that it is now perfectly practical to shoot in-camera VFX shots in real time. Having the finished virtual graphics on the LED wall means there is no green or blue spill to remove in post-production, while the LED volume also casts the correct soft lighting around the talent. All this reduces the cost of post-production compositing which, along with the savings in location costs, makes virtual production a financially attractive and time-saving choice for producers.
