Disguise Used For OMEN Challenge Esports

Disguise gx 2c and gx 1 media servers were used to generate content that accompanied the gameplay of the OMEN Challenge. The esports event was the first to be broadcast using disguise xR workflows, which combined Augmented, Virtual and Mixed Reality technologies.

The fifth annual OMEN Challenge was held with the support of OMEN by HP, with eight players competing for a prize pool of $50,000. The Counter-Strike: Global Offensive tournament went beyond the traditional esports format, offering a battle arena with an immersive stage and content.

The project, run by creative media agency AKQA, was realised in collaboration with Scott Millar and Pixel Artworks, who designed and developed a bespoke pipeline toolset using disguise, Notch and TouchDesigner, to create a broadcast with AR, Mixed Reality and live show elements. It was broadcast on Twitch and other streaming platforms.

“HP wanted to do something different with big LED video walls, AR and MR tricks that would look good to the live and broadcast audiences,” notes Technical Producer Scott Millar, who worked alongside Pixel Artworks to power video for the event using disguise. “Real-time MR content was made in Notch, with data coming in from the game to power the infographics. The final output was composited in disguise and sent to the LED video walls.”
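The article does not detail how game data reached the graphics layer, but Counter-Strike: Global Offensive does expose live match data via its Game State Integration feature, which POSTs JSON to a locally configured HTTP endpoint. As a hedged illustration (the payload below is a trimmed, hypothetical sample, and `extract_overlay_stats` is an invented helper, not part of the event's actual pipeline), this sketch shows how such data could be reduced to the fields an on-screen infographic would need:

```python
import json

# Hypothetical, trimmed sample of a CS:GO Game State Integration payload.
# The real game POSTs JSON of this general shape to a local HTTP endpoint
# configured in a gamestate_integration_*.cfg file.
SAMPLE_PAYLOAD = """
{
  "player": {
    "name": "player_one",
    "team": "CT",
    "state": {"health": 87, "armor": 100},
    "match_stats": {"kills": 12, "deaths": 4}
  },
  "round": {"phase": "live"}
}
"""

def extract_overlay_stats(raw: str) -> dict:
    """Reduce a game-state payload to the fields a broadcast
    infographic layer might display (name, health, K/D)."""
    data = json.loads(raw)
    player = data.get("player", {})
    return {
        "name": player.get("name", "unknown"),
        "health": player.get("state", {}).get("health", 0),
        "kills": player.get("match_stats", {}).get("kills", 0),
        "deaths": player.get("match_stats", {}).get("deaths", 0),
    }

stats = extract_overlay_stats(SAMPLE_PAYLOAD)
print(stats)
```

In a real pipeline these values would be pushed into a tool such as Notch or TouchDesigner to drive the rendered graphics.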

Mixed Reality was used in two ways during the show. The first was in using a Notch generated studio environment, which allowed the casters to be transported to a different world in-camera. The world could also be rendered from the game engine to place the casters directly into the map. The second use was to allow players and interviewers to “step into” the game world and replay the biggest moments. Using a Steadicam and rendering the game engine into both the LED and virtual worlds, the players could see themselves in the game, and describe their best moves.

“Working with the developer of the OMEN game and amateur map designers, we built a custom solution to make the real-world set, game data and content align with the digital equivalent,” explains Oliver Ellmers, Interactive Developer with Pixel Artworks. “We essentially created a virtual studio and OMEN set which, through camera, would seem as though a real person was in the game. This was achieved by using xR to enlarge and overlay content in-camera, while using MR camera tracking so the live broadcast cameras were aligned with the digital cameras in the game.”
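Keeping a live broadcast camera aligned with its digital counterpart generally means re-mapping the tracked camera pose into the game engine's coordinate system every frame. As a minimal sketch only (the axis conventions, scale factor and `Pose` type here are assumptions for illustration, not the conventions actually used on this production), this shows the kind of frame conversion involved:

```python
from dataclasses import dataclass

# Assumed conventions for this sketch: the tracking system reports
# metres in a Z-up frame, while the game engine expects centimetres
# in a Y-up frame. Real systems differ; this is illustrative only.
SCALE = 100.0  # metres -> centimetres

@dataclass
class Pose:
    x: float
    y: float
    z: float
    pan: float   # degrees
    tilt: float  # degrees

def tracked_to_virtual(p: Pose) -> Pose:
    """Re-map a tracked broadcast-camera pose into the virtual
    camera's frame so real and digital cameras stay aligned."""
    return Pose(
        x=p.x * SCALE,
        y=p.z * SCALE,   # tracking Z (up) becomes engine Y (up)
        z=p.y * SCALE,
        pan=p.pan,
        tilt=p.tilt,
    )

real = Pose(x=1.5, y=2.0, z=1.8, pan=30.0, tilt=-5.0)
virtual = tracked_to_virtual(real)
print(virtual)
```

Applied per frame, a mapping like this keeps rendered AR content locked to the physical set as the Steadicam moves.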
