The Future Group Unleashes Pixotope

Pixotope takes live virtual production to a new level, and does so with single-operator software running on Microsoft's Windows 10 and off-the-shelf equipment.

The Future Group, a creative services company based in Norway, is converging the real and virtual worlds of digital productions with a powerful new software suite called Pixotope.

Running on commodity hardware, the graphics suite promises to be a game changer. In a conference call, company executives Halvor Vislie, CEO, and Mike Grieve, CCO, described the software suite's key features.

Combining 3D, VR, AR and data technology, the software has been created as a second-generation live virtual production system.

Viewers of Super Bowl LIII saw Pixotope used extensively during the game's spectacular opening sequence. The company says components of the show's opening graphics were composited live, not pre-rendered.

The Broadcast Bridge has already examined how IMR (Immersive Mixed Reality) technology can be used. In addition to the excerpts seen in the video at the top of this story, the article, The Weather Channel Gives Us a Football Forecast in Advanced IMR, reviewed how The Weather Channel used the technology to illustrate how changing weather can affect the outcome of a major football game. The weathercaster has been producing such content with an earlier version of the software, called Frontier, which was launched two years ago.

The Weather Channel used Frontier to create a video showing how weather can affect football games. Click to enlarge.

Now, in the run-up to this year's NAB show, The Future Group is debuting its second generation of this software.

“Frontier was fairly complex to use,” Vislie said, “and we wanted Pixotope to be a system with an easier, more intuitive learning curve. Also, since it is completely software-based and light on hardware, it requires less of an investment to bring into your production chain.”

“We wanted to provide our clients with more scalability, flexibility and mobility than ever before,” Vislie said. “You can add whatever number of cameras or servers your project requires, but the whole system can still be operated from a single user interface.”

Running the Unreal Engine's rendering power natively produces clean motion-tracked VR composites. Click to enlarge.

Behind the GUI

“First, we chose the Unreal Engine from Epic Games as our rendering platform to create our photo-realistic real-time graphics,” Grieve said. At the center of the system is the Pixotope Director, which oversees the user configuration of the system.

“The Director is where you control all your licenses,” Grieve said. “If you purchase one monthly subscription, you get three licenses that can be run independently and concurrently from each other. But only one of these three licenses can connect to video I/O and camera tracking for use in a live virtual studio. The other two licenses are 'design seats', which are otherwise identical in functionality.”

Director is built on top of an extremely low-latency data bus, called Pixotope DataHub.

“The DataHub is where we connect everything from data sources to multiple camera configurations, including MOS (Media Object Server) workflows, custom control panels, DMX light control and GPI,” Grieve said.
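The article does not describe the DataHub's internals, but its role, fanning messages out from many sources (tracking, MOS, DMX, GPI) to many consumers, is that of a publish/subscribe bus. The following toy sketch illustrates that general pattern only; the class and topic names are hypothetical and are not Pixotope's API.

```python
import queue
import threading
from collections import defaultdict

class DataHubSketch:
    """Toy publish/subscribe bus illustrating the role a hub like
    Pixotope's DataHub plays: fan out messages from data sources to
    any number of subscribers. All names here are hypothetical."""

    def __init__(self):
        self._subscribers = defaultdict(list)
        self._lock = threading.Lock()

    def subscribe(self, topic):
        """Register interest in a topic; returns a thread-safe queue."""
        q = queue.Queue()
        with self._lock:
            self._subscribers[topic].append(q)
        return q

    def publish(self, topic, message):
        """Deliver a message to every subscriber of the topic."""
        with self._lock:
            targets = list(self._subscribers[topic])
        for q in targets:
            q.put(message)

hub = DataHubSketch()
tracking_feed = hub.subscribe("camera/1/tracking")
hub.publish("camera/1/tracking", {"pan": 12.5, "tilt": -3.0, "zoom": 40})
print(tracking_feed.get_nowait())  # {'pan': 12.5, 'tilt': -3.0, 'zoom': 40}
```

A real low-latency bus would use shared memory or UDP multicast rather than Python queues, but the fan-out topology is the same.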

Also integral to the Director is the Controller, where graphics operators design and automatically generate custom control panels that can run inside browsers on any device.
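Generating a browser control panel from a parameter description is a simple idea to demonstrate. The sketch below is purely illustrative of that concept, turning a list of named ranges into HTML sliders, and does not reflect how the Controller actually works internally.

```python
def render_control_panel(params):
    """Hypothetical sketch of auto-generating a browser control panel:
    each (name, lo, hi, value) tuple becomes an HTML range slider.
    Illustrative only; not the Controller's real output format."""
    rows = []
    for name, lo, hi, value in params:
        rows.append(
            f'<label>{name} '
            f'<input type="range" name="{name}" '
            f'min="{lo}" max="{hi}" value="{value}"></label>'
        )
    return "<form>\n" + "\n".join(rows) + "\n</form>"

panel = render_control_panel([("logo_scale", 0, 100, 50), ("fade", 0, 255, 0)])
print(panel)
```

Because the output is plain HTML, such a panel can run inside a browser on any device, which is the property the article highlights.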

“Then, we have the Editor, which is a very special feature of our system,” said Grieve. “This WYSIWYG editor imports digital assets from all standard content creation tools like Autodesk Maya and Maxon Cinema 4D. What is special about the Editor is that it is rendering all the time as you are interactively working on a model.”

The character generator is called Text3D and uses simple slider controls to fly logos and text. Currently this is a keyed layer on top of the graphics, but at NAB 2019, Grieve says, it will be shown as a 3D text generator within a virtual set.

The system provides a best-in-class chroma keyer using a new 3D algorithm. All of the keyer’s real-time features can be controlled through a user-friendly remote GUI.
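Pixotope's actual "3D algorithm" is proprietary, but the basic idea of keying on distance in a three-dimensional color space can be sketched in a few lines. The function below is a deliberately simplified illustration, not the product's algorithm: alpha is derived from each pixel's Euclidean distance to the key color in RGB space, with a soft falloff band.

```python
def chroma_key_alpha(pixel, key_color=(0, 255, 0), tolerance=100.0, softness=60.0):
    """Simplified keyer sketch: alpha from 3D distance to the key color
    in RGB space. Returns 0.0 (fully keyed out) to 1.0 (fully opaque).
    The tolerance/softness parameters are hypothetical illustrations."""
    d = sum((p - k) ** 2 for p, k in zip(pixel, key_color)) ** 0.5
    if d <= tolerance:
        return 0.0          # close enough to the key color: transparent
    if d >= tolerance + softness:
        return 1.0          # far from the key color: fully opaque
    return (d - tolerance) / softness  # soft edge in between

# Pure green screen is keyed out; a skin tone stays opaque.
print(chroma_key_alpha((0, 255, 0)))      # 0.0
print(chroma_key_alpha((220, 180, 160)))  # 1.0
```

Production keyers typically work in a perceptually better-behaved color space than RGB and add spill suppression, which is one reason real-time GUI control over the parameters matters.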

“Then we also have something unique called the Studio Creator, which is a procedural tool,” said Grieve. “We’ve heard from broadcasters that creating a virtual set from scratch complete with lighting sources can be pretty complex.

“But if you already know the dimensions of your desired set, the lighting sources and the props, the program can give you an automated head start,” he said. “Thereafter you can go in and add the details like special lighting conditions and geometric objects.”
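The "automated head start" idea, deriving initial set geometry from a few known dimensions, is the essence of procedural generation. The sketch below is in that spirit only; the function name and output format are hypothetical and say nothing about Studio Creator's real data model.

```python
def studio_shell(width, depth, height):
    """Toy procedural head start: from three set dimensions, emit a
    floor and three walls as (name, origin_xyz, size_xyz) tuples that
    a designer could then refine with lighting and props. Hypothetical
    format, for illustration only."""
    return [
        ("floor",      (0.0, 0.0, 0.0),   (width, depth, 0.0)),
        ("back_wall",  (0.0, depth, 0.0), (width, 0.0, height)),
        ("left_wall",  (0.0, 0.0, 0.0),   (0.0, depth, height)),
        ("right_wall", (width, 0.0, 0.0), (0.0, depth, height)),
    ]

for name, origin, size in studio_shell(8.0, 6.0, 3.5):
    print(name, origin, size)
```

The point of such a generator is exactly what Grieve describes: the broad strokes come for free, and the operator's time goes into the details.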

Mo-Sys is just one of the tracking systems Pixotope can reference. Click to enlarge.

Mo-Sys is just one of the tracking systems Pixotope can reference. Click to enlarge.

The last module is the Tracking Server that takes the tracking data from popular camera tracking systems and distributes it back to all the cameras so that their virtual feed matches the pseudo real-world environment for that particular camera and lens.
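One concrete job implied here is converting raw values from a tracking system into renderer parameters that are correct for a specific camera and lens. The sketch below illustrates that mapping for a single value, zoom to field of view, using linear interpolation; real systems use measured per-lens calibration curves, and every name here is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class LensProfile:
    """Hypothetical per-lens calibration: field of view (degrees)
    at the wide and telephoto extremes of the zoom range."""
    fov_wide: float
    fov_tele: float

def virtual_camera_fov(zoom_normalized: float, lens: LensProfile) -> float:
    """Map a raw normalized zoom value (0.0 = wide, 1.0 = tele) from a
    tracking system onto the field of view the renderer should use for
    that camera's lens. Linear here for illustration only."""
    z = min(max(zoom_normalized, 0.0), 1.0)  # clamp noisy input
    return lens.fov_wide + z * (lens.fov_tele - lens.fov_wide)

box_lens = LensProfile(fov_wide=65.0, fov_tele=8.0)
print(virtual_camera_fov(0.0, box_lens))  # 65.0
print(virtual_camera_fov(1.0, box_lens))  # 8.0
```

Keeping a per-camera, per-lens profile on the tracking server is what lets the virtual feed stay matched when the director cuts between cameras.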

As displayed in the Super Bowl LIII opening, the process enables an operator to cut seamlessly between the AR cameras.

“We can have all this running on one server,” Grieve said, “or we can break it out and have the Tracking Server run on one machine, Studio Creator on a second, and the Real Time Renderer on a third.”

One Pixotope subscription can be purchased for $2,500 per month per camera.
