Mo-Sys Takes Virtual Production To A Truer Focus

Mo-Sys Cinematic XR Focus enables Cinematographers to seamlessly pull focus between real and virtual worlds.

Cinematographers who needed to pull focus between real foreground objects – such as actors – and virtual objects displayed on an LED wall – such as a car – have been unable to do so, because the lens focal plane stops at the LED wall: the car always remains out of focus.

Now, with Cinematic XR Focus, Focus Pullers can use the same wireless lens control system they’re used to and pull focus seamlessly from real objects, through the LED wall, to virtual objects that appear to be positioned behind it. The reverse focus pull is also possible.

Cinematic XR Focus is an option for Mo-Sys’ virtual production software VP Pro, working with Mo-Sys’ StarTracker camera tracking technology. It synchronises the lens controller with the output of the Unreal Engine graphics, relying on StarTracker to constantly track the distance between the camera and the LED wall. The solution is available from Mo-Sys and is compatible with Preston wireless lens controllers (Hand Unit 3 and MDR-3).
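The mechanics can be pictured as a hand-off between two focal planes. The sketch below is purely illustrative and not Mo-Sys or Unreal Engine code – the function and parameter names are invented – but it shows one plausible way a single focus-ring value could be split between the physical lens and the engine’s virtual camera, once the tracking system has supplied the camera-to-wall distance.

```python
# Conceptual sketch (not Mo-Sys code): splitting a focus pull between a
# physical lens and a virtual camera, assuming the tracking system reports
# the camera-to-LED-wall distance for every frame.

from dataclasses import dataclass


@dataclass
class FocusCommand:
    lens_focus_m: float      # distance sent to the physical lens motor
    virtual_focus_m: float   # focus distance applied to the engine's virtual camera
    target_is_virtual: bool  # True when the subject lies "behind" the LED wall


def split_focus(requested_focus_m: float, wall_distance_m: float) -> FocusCommand:
    """Map a single focus-ring value onto the real and virtual focal planes.

    If the requested focal plane lies in front of the wall, the physical lens
    does all the work. If it lies behind the wall, the lens is held on the
    wall plane (keeping the LED pixels sharp) and the remaining distance is
    handed to the virtual camera so the in-engine subject renders in focus.
    """
    if requested_focus_m <= wall_distance_m:
        # Real-world subject: normal lens behaviour.
        return FocusCommand(lens_focus_m=requested_focus_m,
                            virtual_focus_m=wall_distance_m,
                            target_is_virtual=False)
    # Virtual subject: park the lens on the wall, refocus in-engine instead.
    return FocusCommand(lens_focus_m=wall_distance_m,
                        virtual_focus_m=requested_focus_m,
                        target_is_virtual=True)


if __name__ == "__main__":
    wall = 4.0  # metres from camera to LED wall, as reported by tracking
    for requested in (2.0, 4.0, 7.5):  # actor, wall plane, virtual car
        cmd = split_focus(requested, wall)
        print(f"requested {requested:>4.1f} m -> lens {cmd.lens_focus_m:.1f} m, "
              f"virtual {cmd.virtual_focus_m:.1f} m, virtual target: {cmd.target_is_virtual}")
```

In this simplified model, a single focus pull from the actor to the virtual car sweeps the physical lens only as far as the wall, then continues the pull on the virtual camera, so the transition appears continuous to the audience.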

“Production companies have excitedly embraced virtual production and on-set finishing using LED volumes. The ability to create any internal or external set background or set extension with an LED volume has truly changed the dynamics of filmmaking,” said Michael Geissler, CEO of Mo-Sys.

“But there have been limitations. Pulling focus – a fundamental part of the grammar of movies, used to direct the audience’s attention to different parts of the screen – has been difficult. The Cinematic XR Focus software add-on transforms the possibilities, allowing Cinematographers to freely realise their creative ambitions.”

Recent advances in LED display technology mean that it is now perfectly practical to shoot in-camera VFX shots in real time. Having the finished virtual graphics on the LED wall means there is no green/blue spill to remove in post-production, and the LED volume casts the correct soft lighting around the talent. All this reduces the cost of post-production compositing, which, along with the savings in location costs, makes virtual production a financially attractive and time-saving choice for producers.
