The Future Group Unleashes Pixotope

Pixotope takes live virtual production to a new level, and does so with single-operator software running on Microsoft's Windows 10 and off-the-shelf equipment.

The Future Group, a creative services company based in Norway, is converging the real and virtual worlds of digital productions with a powerful new software suite called Pixotope.

Running on commodity hardware, the graphics suite promises to be a game changer. In a conference call, company executives Halvor Vislie, CEO, and Mike Grieve, CCO, described the software suite's key features.

Combining 3D, VR, AR and data technology, the software has been created as a second-generation live virtual production system.

Viewers of Super Bowl LIII saw Pixotope used extensively during the game's spectacular opening sequence. The company says components of the show's opening graphics were composited live and not pre-rendered.

The Broadcast Bridge has already examined how IMR (Immersive Mixed Reality) technology can be used. In addition to the excerpts seen in the video at the top of this story, the article, The Weather Channel Gives Us a Football Forecast in Advanced IMR, reviewed how the channel used the technology to illustrate how changing weather can affect the outcome of a major football game. The weathercaster has been producing such content with an earlier version of the software, called Frontier, which was launched two years ago.

The Weather Channel used Frontier to create a video showing how weather can affect football games.

Now, in the run-up to this year's NAB show, The Future Group is debuting the second generation of this software.

“Frontier was fairly complex to use,” Vislie said, “and we wanted Pixotope to be a system with an easier, more intuitive learning curve. Also, since it is completely software-based and light on hardware, it requires less of an investment to bring into your production chain.”

“We wanted to provide our clients with more scalability, flexibility and mobility than ever before,” Vislie said. “You can add whatever number of cameras or servers your project requires, but the whole system can still be operated from a single user interface.”

Running the Unreal Engine's rendering power natively gives clean motion-tracked VR composites.

Behind the GUI

“First, we chose the Unreal Engine from Epic Games as our rendering platform to create our photo-realistic real-time graphics,” he began. The center of the system is the Pixotope Director, which oversees user configuration.

“The Director is where you control all your licenses,” Grieve said. “If you purchase one monthly subscription, you get three licenses that can run independently and concurrently. But only one of these three licenses can connect to video I/O and camera tracking for use in a live virtual studio. The other two licenses are 'design seats', which are otherwise identical in functionality.”

Director is built on top of an extremely low-latency data bus, called Pixotope DataHub.

“The DataHub is where we connect everything from data sources to multiple camera configurations, including MOS (Media Object Server) workflows, custom control panels, DMX light control and GPI,” Grieve said.
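Pixotope's DataHub itself is proprietary, but the pattern Grieve describes — many heterogeneous sources (tracking, MOS, DMX, GPI) sharing one bus that any component can listen to — is a classic publish/subscribe design. The sketch below is purely illustrative; the class and topic names are hypothetical and are not part of any Pixotope API.

```python
from collections import defaultdict
from typing import Any, Callable

class MiniDataHub:
    """Toy publish/subscribe bus illustrating how disparate sources
    (camera tracking, DMX light levels, GPI triggers) could share one
    channel namespace that renderers and control panels subscribe to."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[Any], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, payload: Any) -> None:
        # Fan the payload out to every handler registered on this topic.
        for handler in self._subscribers[topic]:
            handler(payload)

# Example: a renderer listening for a GPI trigger and a DMX channel level.
hub = MiniDataHub()
events: list[tuple[str, Any]] = []
hub.subscribe("gpi/1", lambda v: events.append(("gpi", v)))
hub.subscribe("dmx/ch12", lambda v: events.append(("dmx", v)))
hub.publish("gpi/1", True)
hub.publish("dmx/ch12", 191)
print(events)  # [('gpi', True), ('dmx', 191)]
```

The appeal of this topology for live production is that new devices can be added without touching existing components — they simply publish or subscribe to named channels.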

Also integral to the Director is the Controller, where graphics operators design and automatically generate custom control panels that can run inside browsers on any device.

“Then, we have the Editor, which is a very special feature of our system,” said Grieve. “This WYSIWYG editor imports digital assets from all standard content creation tools like Autodesk Maya and Maxon Cinema 4D. What is special about the Editor is that it is rendering all the time as you are interactively working on a model.”

The character generator is called Text3D and uses simple slider controls to fly logos and text. Currently this is a keyed layer on top of the graphics, but at NAB 2019, Grieve says, it will be shown as a 3D text generator within a virtual set.

The system provides a best-in-class chroma keyer using a new 3D algorithm. All of the keyer’s real-time features can be controlled through a user-friendly remote GUI.
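Pixotope does not publish the details of its "3D algorithm" keyer, but the general idea behind keying in a 3D color space can be illustrated simply: each pixel's distance from the key color is measured in color space, and the matte alpha ramps from transparent to opaque between an inner and an outer radius. The function below is a minimal, hypothetical sketch of that principle, not Pixotope's implementation.

```python
import math

def chroma_alpha(pixel, key=(0, 255, 0), inner=60.0, outer=160.0):
    """Return a matte alpha in [0, 1] from the pixel's 3D Euclidean
    distance to the key colour: 0 inside the inner radius (fully keyed
    out), 1 beyond the outer radius (fully foreground), with a linear
    ramp between the two. Radii are illustrative, not calibrated."""
    d = math.dist(pixel, key)
    if d <= inner:
        return 0.0
    if d >= outer:
        return 1.0
    return (d - inner) / (outer - inner)

print(chroma_alpha((0, 255, 0)))    # 0.0 -> pure green screen is removed
print(chroma_alpha((200, 40, 30)))  # 1.0 -> talent stays fully opaque
```

Working in a 3D color volume rather than on a single chroma channel lets the soft ramp handle hair, shadows and semi-transparent edges; real keyers refine this with spill suppression and per-channel controls.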

“Then we also have something unique called the Studio Creator, which is a procedural tool,” said Grieve. “We’ve heard from broadcasters that creating a virtual set from scratch complete with lighting sources can be pretty complex.

“But if you already know the dimensions of your desired set, the lighting sources and the props, the program can give you an automated head start,” he said. “Thereafter you can go in and add the details like special lighting conditions and geometric objects.”
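"Procedural" here means the set is generated from parameters rather than modeled by hand. As a loose illustration of the concept — with entirely hypothetical structure, nothing drawn from Studio Creator itself — a generator can turn a few dimensions into a starter layout of floor, walls and lights that an artist then refines:

```python
def generate_set(width, depth, height, light_spacing=2.0):
    """Procedurally lay out a bare virtual set from its dimensions:
    a floor plane, three walls (the fourth side stays open for the
    cameras), and a regular grid of overhead lights."""
    props = [{"type": "floor", "size": (width, depth)}]
    props += [
        {"type": "wall", "size": (width, height), "at": "back"},
        {"type": "wall", "size": (depth, height), "at": "left"},
        {"type": "wall", "size": (depth, height), "at": "right"},
    ]
    # Place a light every light_spacing metres across the ceiling.
    x = light_spacing / 2
    while x < width:
        z = light_spacing / 2
        while z < depth:
            props.append({"type": "light", "pos": (x, height, z)})
            z += light_spacing
        x += light_spacing
    return props

set_props = generate_set(6.0, 4.0, 3.0)
print(len(set_props))  # 10 -> 1 floor + 3 walls + 6 lights
```

The automated head start Grieve describes is exactly this kind of parameter-driven scaffold: the broadcaster supplies dimensions, the tool emits geometry and lighting, and detail work happens afterwards.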

Mo-Sys is just one of the tracking systems Pixotope can reference.

The last module is the Tracking Server, which takes tracking data from popular camera tracking systems and distributes it so that each camera's virtual feed matches the real-world view of that particular camera and lens.
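To make the Tracking Server's job concrete: each tracking sample carries at minimum the camera's orientation and lens state, and the server routes it to the renderer driving that camera, converting lens data into the field of view the renderer actually needs. The sketch below is a hypothetical simplification — the data structures, the routing, and the 9.59 mm sensor width (a common 2/3-inch broadcast sensor) are assumptions for illustration, not Pixotope's protocol.

```python
import math
from dataclasses import dataclass

@dataclass
class TrackingSample:
    camera_id: str
    pan_deg: float
    tilt_deg: float
    focal_mm: float  # reported by the lens encoder

def horizontal_fov(focal_mm: float, sensor_width_mm: float = 9.59) -> float:
    """Convert focal length to horizontal field of view in degrees for a
    given sensor width -- a renderer needs FOV, not raw focal length."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

def route(sample: TrackingSample, renderers: dict) -> None:
    """Fan a tracking sample out to the renderer driving that camera, so
    its virtual viewpoint matches the physical camera and lens."""
    renderers[sample.camera_id].append(
        {"pan": sample.pan_deg, "tilt": sample.tilt_deg,
         "fov": horizontal_fov(sample.focal_mm)}
    )

renderers = {"cam1": [], "cam2": []}
route(TrackingSample("cam1", 10.0, -2.0, 9.59), renderers)
print(round(renderers["cam1"][0]["fov"], 2))  # 53.13
```

Because every renderer receives pose and FOV matched to its own camera, a director can cut between tracked cameras and the virtual graphics stay locked to each viewpoint.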

As displayed in the Super Bowl LIII opening, the process enables an operator to cut seamlessly between the AR cameras.

“We can have all this running on one server,” Grieve said, “or we can break it out and have the Tracking Server run on one machine, Studio Creator on a second, and the Real Time Renderer on a third.”

One Pixotope subscription can be purchased for $2,500 per month per camera.
