Disguise Integrates With Epic Games’ Unreal Engine 4.27

Disguise’s most recent r19 release integrates with Unreal Engine 4.27, allowing users to make real-time content changes directly from Designer, disguise’s software interface.

r19’s new functionalities include dynamic control with remote textures, streamlined workflows and 3D object transforms. Disguise says its users can now bring live imagery and video input from Designer into Unreal, as well as synchronise tracked objects with 3D object transforms for free object rotation. The software, hardware and system infrastructure in r19 enable dynamic control of content engines, giving users full creative control of any production.

Epic Games’ latest release, 4.27, updates its in-camera visual effects toolset for virtual production, adding support for features such as Multi-User Editing. Live Link VCAM, a new iOS app, enables users to drive a Cine Camera in-engine using an iPad. Motion blur improvements produce correct results in travelling shots, accounting for the movement of the physical camera against a moving background. Other enhancements include GPU Lightmass for significantly faster light baking.

Disguise’s r19 software arrives already set up to meet all of 4.27’s configuration requirements. Operators who currently run RenderStream can skip the setup, download the updated RenderStream plugin and get straight to work creating and delivering real-time, photorealistic content.

“Unreal Engine, much like disguise, is leading the way for real-time graphics in virtual production, film, broadcast and many other industries,” says disguise Technical Solutions Director Peter Kirkup. “We wanted to make it easy for our customers to take advantage of the many cutting-edge features that 4.27 has to offer as soon as possible with minimal setup requirements. With the setup out of the way, our users can focus more on telling their best stories, knowing that they can rely on a consistent, repeatable and scalable workflow to make it happen.”

Recently, disguise was selected to support the installation and implementation of hardware for the virtual production stage at Epic Games’ London Innovation Lab.

Opened in 2020, Epic Games’ virtual studio in Tottenham Court Road acts as a “hub” for the creative community. Creators across industries can produce real-time photorealistic content and build immersive virtual experiences using the latest technology from the lab’s partners Brompton, ROE Visual, ARRI, Vicon, CVP, NVIDIA and now disguise.

The lab has been equipped with one disguise vx 2 server for real-time content scaling, three disguise rx II render nodes for high-speed graphics processing, and a preconfigured network switch - disguise fabric - connecting the rx and vx servers at high bandwidth for minimal latency. It was one of the first stages to receive disguise’s rx II machines, which launched to market in June 2021.
