The Future Group Unleashes Pixotope

Pixotope takes live virtual production to a new level, and does so with single-operator software running on Microsoft's Windows 10 and off-the-shelf equipment.

The Future Group, a creative services company based in Norway, is converging the real and virtual worlds of digital productions with a powerful new software suite called Pixotope.

Running on commodity hardware, the graphics suite promises to be a game changer. In a conference call, company executives Halvor Vislie (CEO) and Mike Grieve (CCO) described the software suite's key features.

Combining 3D, VR, AR and data technologies, the software is a second-generation live virtual production system.

Viewers of Super Bowl LIII saw Pixotope used extensively during the game's spectacular opening sequence. The company says components of the show's opening graphics were composited live and not pre-rendered.

The Broadcast Bridge has already examined how IMR (Immersive Mixed Reality) technology can be used. In addition to the excerpts seen in the video at the top of this story, the article, The Weather Channel Gives Us a Football Forecast in Advanced IMR, reviewed how the channel used the technology to illustrate how changing weather can affect the outcome of a major football game. The weathercaster has been producing such content with an earlier version of the software, called Frontier, which was launched two years ago.

The Weather Channel used Frontier to create a video showing how weather can affect football games.

Now, in the run-up to this year's NAB Show, The Future Group is debuting the second generation of this software.

“Frontier was fairly complex to use,” Vislie said, “and we wanted Pixotope to be a system with an easier, more intuitive learning curve. Also, since it is completely software-based and light on hardware, it requires less of an investment to bring into your production chain.”

“We wanted to provide our clients with more scalability, flexibility and mobility than ever before,” Vislie said. “You can add whatever number of cameras or servers your project requires, but the whole system can still be operated from a single user interface.”

Running the Unreal Engine's rendering power natively... clean motion-tracked VR composites.

Behind the GUI

“First, we chose the Unreal Engine from Epic Games as our rendering platform to create our photo-realistic real-time graphics,” Grieve began. At the center of the system is Pixotope Director, which oversees user configuration of the system.

“The Director is where you control all your licenses,” Grieve said. “If you purchase one monthly subscription, you get three licenses that can be run independently of each other and concurrently. But only one of these three licenses can connect to video I/O and camera tracking for use in a live virtual studio. The other two licenses are 'design seats', which are otherwise identical in functionality.”

Director is built on top of an extremely low-latency data bus, called Pixotope DataHub.

“The DataHub is where we connect everything from data sources to multiple camera configurations, including MOS (Media Object Server) workflows, custom control panels, DMX light control and GPI,” Grieve said.
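The Future Group has not published DataHub's internal API, but functionally it acts as a low-latency publish/subscribe bus connecting producers (tracking data, MOS rundowns, DMX, GPI) to consumers. As a rough conceptual sketch only, with all names hypothetical, an in-process version of that pattern might look like:

```python
from collections import defaultdict

class DataBus:
    """Minimal in-process publish/subscribe bus, loosely modeled on the
    role DataHub plays: producers publish messages to named topics and
    any number of consumers receive them via subscribed callbacks."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self._subscribers[topic]:
            callback(message)

bus = DataBus()
received = []
bus.subscribe("camera/1/tracking", received.append)
# A tracking source publishes a pan/tilt/zoom sample; the subscriber
# (e.g. a renderer) receives it immediately.
bus.publish("camera/1/tracking", {"pan": 12.5, "tilt": -3.0, "zoom": 40})
```

A real implementation would run over the network with serialization and back-pressure handling; the sketch only illustrates the topic-based routing idea.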

Also integral to the Director is the Controller, where graphics operators design and automatically generate custom control panels that can run inside browsers on any device.
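The article does not describe how the Controller generates those panels, but the common approach is to render a declarative panel description into browser-ready markup. A hypothetical sketch of that idea (the spec format and function names are invented for illustration):

```python
def render_panel(spec):
    """Hypothetical sketch: turn a declarative control-panel spec into
    an HTML fragment that could run in any browser, mirroring the idea
    of auto-generated operator panels."""
    rows = []
    for control in spec["controls"]:
        if control["type"] == "button":
            rows.append(f'<button id="{control["id"]}">{control["label"]}</button>')
        elif control["type"] == "slider":
            rows.append(f'<input type="range" id="{control["id"]}" '
                        f'min="{control["min"]}" max="{control["max"]}">')
    return "\n".join(rows)

spec = {"controls": [
    {"type": "button", "id": "take", "label": "Take"},
    {"type": "slider", "id": "logo_scale", "min": 0, "max": 100},
]}
print(render_panel(spec))
```

Because the output is plain HTML, the same generated panel works on any device with a browser, which matches the workflow the article describes.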

“Then, we have the Editor, which is a very special feature of our system,” said Grieve. “This WYSIWYG editor imports digital assets from all standard content creation tools like Autodesk Maya and Maxon Cinema 4D. What is special about the Editor is that it is rendering all the time as you are interactively working on a model.”

The character generator is called Text3D and uses simple slider controls to fly logos and text. Currently this is a keyed layer on top of the graphics, but Grieve says that at NAB 2019 it will be shown as a 3D text generator within a virtual set.

The system provides a best-in-class chroma keyer using a new 3D algorithm. All of the keyer’s real-time features can be controlled through a user-friendly remote GUI.
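The article does not detail the new 3D algorithm, but conventional chroma keying computes, per pixel, a distance from a reference key color in a three-dimensional color space and maps that distance to transparency. A toy version of that general principle, not Pixotope's actual keyer:

```python
def chroma_key_alpha(pixel, key_color=(0, 177, 64), threshold=120.0):
    """Toy chroma keyer: derive alpha from the 3D Euclidean distance
    between a pixel's RGB value and the key (green-screen) color.
    Production keyers add spill suppression and edge refinement."""
    dist = sum((p - k) ** 2 for p, k in zip(pixel, key_color)) ** 0.5
    # Pixels near the key color become transparent (alpha 0.0);
    # pixels far from it stay opaque (alpha 1.0).
    return min(dist / threshold, 1.0)

print(chroma_key_alpha((0, 177, 64)))   # exactly key color -> 0.0
print(chroma_key_alpha((255, 0, 0)))    # red foreground -> 1.0
```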

“Then we also have something unique called the Studio Creator, which is a procedural tool,” said Grieve. “We’ve heard from broadcasters that creating a virtual set from scratch complete with lighting sources can be pretty complex.

“But if you already know the dimensions of your desired set, the lighting sources and the props, the program can give you an automated head start,” he said. “Thereafter you can go in and add the details like special lighting conditions and geometric objects.”
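The procedural approach Grieve describes, generating a baseline set from known dimensions and then refining it by hand, can be sketched abstractly. Everything below (class and function names, the corner-light rule) is invented for illustration, not Studio Creator's actual behavior:

```python
from dataclasses import dataclass, field

@dataclass
class VirtualSet:
    """Hypothetical description of a procedural virtual set: fixed
    dimensions first, then lights and props added in later passes."""
    width_m: float
    depth_m: float
    height_m: float
    lights: list = field(default_factory=list)
    props: list = field(default_factory=list)

def create_basic_set(width_m, depth_m, height_m):
    """Automated head start: given only the set's dimensions, place
    one area light at each ceiling corner. The operator then adds
    special lighting conditions and geometric objects by hand."""
    s = VirtualSet(width_m, depth_m, height_m)
    for x in (0.0, width_m):
        for z in (0.0, depth_m):
            s.lights.append({"type": "area", "pos": (x, height_m, z)})
    return s

studio = create_basic_set(8.0, 6.0, 3.5)
print(len(studio.lights))  # 4 corner lights generated automatically
```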

Mo-Sys is just one of the tracking systems Pixotope can reference.

The last module is the Tracking Server, which takes tracking data from popular camera tracking systems and distributes it back to all the cameras so that each virtual feed matches the real-world environment for that particular camera and lens.
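How the Tracking Server routes data is not specified in the article, but the essential job, matching each tracking sample to the right camera and applying that camera's lens calibration, can be sketched. All names and the linear zoom-to-FOV mapping below are hypothetical simplifications:

```python
def distribute_tracking(raw_samples, cameras):
    """Hypothetical sketch of the Tracking Server's role: route each
    raw pan/tilt/zoom sample to the matching camera's render queue,
    converting zoom to a field of view via that camera's lens model
    so the virtual feed lines up with the real one."""
    for sample in raw_samples:
        cam = cameras[sample["camera_id"]]
        fov = cam["lens_fov_at_zoom"](sample["zoom"])
        cam["renderer"].append(
            {"pan": sample["pan"], "tilt": sample["tilt"], "fov": fov})

cameras = {
    1: {"lens_fov_at_zoom": lambda z: 70.0 - z * 0.5, "renderer": []},
    2: {"lens_fov_at_zoom": lambda z: 65.0 - z * 0.4, "renderer": []},
}
distribute_tracking(
    [{"camera_id": 1, "pan": 10.0, "tilt": 2.0, "zoom": 20.0},
     {"camera_id": 2, "pan": -5.0, "tilt": 0.0, "zoom": 10.0}],
    cameras,
)
print(cameras[1]["renderer"][0]["fov"])  # 60.0
```

Because every camera's renderer receives calibrated data for its own lens, a director can cut between tracked cameras and the virtual elements stay locked in place.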

As displayed in the Super Bowl LIII opening, the process enables an operator to cut seamlessly between the AR cameras.

“We can have all this running on one server,” Grieve said, “or we can break it out and have the Tracking Server run on one machine, Studio Creator on a second, and the Real Time Renderer on a third.”

One Pixotope subscription can be purchased for $2,500 per month per camera.
