Disguise Integrates With Epic Games’ Unreal Engine 4.27

Disguise’s latest r19 release integrates with Unreal Engine 4.27, allowing users to make real-time content changes directly from Designer, disguise’s software interface.

r19’s new functionalities include dynamic control with remote textures, streamlined workflows and 3D object transforms. Disguise says its users can now bring live imagery and video input from Designer into Unreal, as well as synchronise tracked objects using 3D object transforms for free object rotation. The software, hardware and system infrastructure in r19 enable dynamic control of content engines, giving users total creative control over any production.

Epic Games’ latest release, 4.27, updates its in-camera visual effects toolset for virtual production, adding support for features such as Multi-User Editing. Live Link VCam, a new iOS app, enables users to drive a Cine Camera in-engine using an iPad. There are also improvements for producing correct motion blur in travelling shots, accounting for a physical camera moving against the background. Other enhancements include GPU Lightmass for significantly faster light baking.

Disguise’s r19 software comes with its system already set up to accommodate all 4.27 configuration requirements. Operators who currently run RenderStream can skip the setup, download the updated RenderStream plugin and get straight to work creating and delivering real-time, photorealistic content.

“Unreal Engine, much like disguise, is leading the way for real-time graphics in virtual production, film, broadcast and many other industries,” says disguise Technical Solutions Director Peter Kirkup. “We wanted to make it easy for our customers to take advantage of the many cutting-edge features that 4.27 has to offer as soon as possible with minimal setup requirements. With the setup out of the way, our users can focus more on telling their best stories, knowing that they can rely on a consistent, repeatable and scalable workflow to make it happen.”

Recently, disguise was selected to support the installation and implementation of hardware for the virtual production stage at Epic Games’ London Innovation Lab.

Opened in 2020, Epic Games’ virtual studio in Tottenham Court Road acts as a “hub” for the creative community. Creators across industries can produce real-time photorealistic content and create immersive virtual experiences using the latest and greatest technology from the lab’s technology partners Brompton, ROE Visual, ARRI, Vicon, CVP, NVIDIA and now, disguise.

The lab has been equipped with one disguise vx 2 server for real-time content scaling, three disguise rx II render nodes for high-speed graphics processing, and a preconfigured network switch, disguise fabric, connecting the rx and vx servers at high bandwidth for minimal latency. It was one of the first stages to receive disguise’s rx II machines, which launched to market in June 2021.
