Disguise’s r19 software comes with its system already set up to accommodate all of Epic Games’ Unreal Engine 4.27 configuration requirements.
Disguise’s most recent release, r19, is integrated with Unreal Engine 4.27, allowing users to make real-time content changes directly from Designer, disguise’s software interface.
r19’s new functionalities include dynamic control with remote textures, streamlined workflows and 3D object transforms. Disguise says its users can now bring live imagery and video input from Designer into Unreal, as well as synchronise tracked objects with 3D object transforms for free object rotation. The software, hardware and system infrastructure in r19 enables dynamic control of content engines, giving users total creative control of any production.
Epic Games’ latest release, 4.27, updates the in-camera visual effects toolset for virtual production, adding support for more features such as Multi-User Editing. Live Link Vcam, a new iOS app, enables users to drive a Cine Camera in-engine using an iPad. There are also improvements for producing correct motion blur in travelling shots, accounting for a physical camera moving against the background. Other enhancements include GPU Lightmass for significantly faster light baking.
Because r19 arrives already configured for 4.27’s requirements, operators who currently run RenderStream can skip the setup, download their updated RenderStream plugin and get straight to work creating and delivering real-time, photorealistic content.
“Unreal Engine, much like disguise, is leading the way for real-time graphics in virtual production, film, broadcast and many other industries,” says disguise Technical Solutions Director Peter Kirkup. “We wanted to make it easy for our customers to take advantage of the many cutting-edge features that 4.27 has to offer as soon as possible with minimal setup requirements. With the setup out of the way, our users can focus more on telling their best stories, knowing that they can rely on a consistent, repeatable and scalable workflow to make it happen.”
Recently, disguise was selected to supply and install hardware for the virtual production stage at Epic Games’ London Innovation Lab.
Opened in 2020, Epic Games’ virtual studio in Tottenham Court Road acts as a “hub” for the creative community. Creators across industries can produce real-time photorealistic content and create immersive virtual experiences using the latest and greatest technology from the lab’s technology partners Brompton, ROE Visual, ARRI, Vicon, CVP, NVIDIA and now, disguise.
The lab has been equipped with one disguise vx 2 server for real-time content scaling, three disguise rx II render nodes for high-speed graphics processing, and a preconfigured network switch, disguise fabric, connecting the rx and vx servers at high bandwidth for minimal latency. It was one of the first stages to receive disguise’s rx II machines, which launched to market in June 2021.