Blurring the real and the virtual live

The capture of depth information in scenes is an increasingly rich field of development, but it has so far remained on the fringes of TV production because of the need either to attach physical markers to objects or to render the output in post. Start-up company Zinemath aims to change that with a technology called zLense, which it claims is the first real-time 3D depth-mapping tool for broadcast.

The zLense houses six sensors in a matte box-style unit attached to the front of a camera, collecting depth information for every pixel in a frame without the need for external tracking sensors or markers. Software then computes the depth (or z axis) alongside camera tracking data and fuses it with the images from the camera. The output is rendered in real time on an adjacent workstation, using multiple GPU cards and SDI I/O cards with 4K support, to produce the final composite video. The box is light enough to be used handheld or on a Steadicam.
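The core principle at work is depth-based compositing: with a depth value for every pixel, a virtual element can be merged into live footage by a per-pixel depth test rather than a simple layer stack. The sketch below illustrates the idea in NumPy; the function name, array shapes and toy scene are illustrative assumptions, not Zinemath's actual GPU pipeline.

```python
import numpy as np

def depth_composite(live_rgb, live_depth, cg_rgb, cg_depth):
    """Merge a rendered CG layer into live footage with a per-pixel depth test.

    live_rgb, cg_rgb     : (H, W, 3) float arrays, colour frames
    live_depth, cg_depth : (H, W) float arrays, distance from camera in metres
                           (use np.inf where the CG layer is empty)
    """
    cg_in_front = cg_depth < live_depth              # boolean (H, W) mask
    return np.where(cg_in_front[..., None], cg_rgb, live_rgb)

# Toy scene: a presenter 1.5 m away fills the left half of frame,
# the studio wall is at 5 m, and a virtual graphic sits at 2 m.
H, W = 4, 6
live = np.full((H, W, 3), 0.5)                       # grey live frame
live_z = np.full((H, W), 5.0)
live_z[:, : W // 2] = 1.5                            # presenter on the left
cg = np.zeros((H, W, 3)); cg[..., 2] = 1.0           # blue graphic
cg_z = np.full((H, W), 2.0)
frame = depth_composite(live, live_z, cg, cg_z)
# The graphic shows in the right half (in front of the wall) but is
# occluded by the presenter on the left.
```

Because occlusion is decided pixel by pixel, a single graphic can pass both behind and in front of a presenter in the same shot, which is exactly the kind of effect described below.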

A studio-based product will launch in early 2015 with a starting price of $100,000. A second version, targeting the feature film and location shooting market, is also being developed for release next year.

“With this device the gap between the VFX options that a broadcaster has available and those of a film producer will get much smaller,” says Bruno Gyorgy, president and CEO of Zinemath. “TV producers will have the ability to introduce certain VFX in realtime which are not possible without [zLense].”

Examples he cites include the ability to animate a 3D graphic, such as a dolphin, behind and in front of a presenter or object using just a green screen, as well as lighting effects such as volumetric lighting, global illumination, shadow casting and reflections.

“Current VR technologies are layer based, and cannot provide accurate 3D compositing of the real and virtual worlds,” he says. “Also the lack of full 3D scene information does not allow perfect visual interaction between the virtual and real world, like reflections, shadows and artificial lighting. Also to make real time interactions between the talent and the virtual world, an external motion tracking solution is required.”

Interactions between a presenter and a virtual object can also be achieved without visible markers or a separate tracking system. So a sports presenter could be on the field of play at half time analysing set pieces with virtual graphics rather than being in the studio.

“Installing the rendering engine in an OB or SNG truck can provide real-time augmented reality enhanced broadcast from the field,” he says. “We believe this allows creatives to get certain visual values on the screen which was not done before.”

He suggests that unscripted programming, such as elements of game shows, could benefit from making freer use of graphics to enhance production. Product placement could also be made easier, he adds.

“You could shoot the show and then get a deal on product placement, or easily insert different branded products into scenes for sale of the programme in different territories.”

While the studio product has a sensing range of 12 metres, the film version, which is primarily intended for pre-visualization, will be optimised for longer ranges.

“This technology will have a major impact on how films are made by making the creation of certain VFX accessible to smaller budget movies,” claims Gyorgy. “It will allow directors and actors to see in realtime what they are doing in a virtual scene.”

The basic package comes with a Unity3D rendering engine for “cost-effective studio production”, but for channels looking for higher performance, integration with rendering engines from VizRT and Orad is also facilitated.

According to inventor Norbert Komenczi, the depth of field and focus of a shot can also be altered in post production. “If you have shot footage with a great depth of field you can shorten it as much as you want and put the focus point anywhere.”
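What Komenczi describes is depth-guided synthetic refocus: each pixel is blurred according to how far its depth lies from a chosen focal plane. Below is a minimal sketch assuming a simple thin-lens-style circle-of-confusion model and a handful of quantised blur levels; the function and its parameters are illustrative, not Zinemath's implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def synthetic_refocus(image, depth, focus_depth, strength=3.0, levels=6):
    """Simulate a shallower depth of field on an all-in-focus frame.

    image       : (H, W, 3) float array, the sharp source frame
    depth       : (H, W) float array, per-pixel distance in metres
    focus_depth : distance (m) to keep sharp
    strength    : maximum blur sigma (a stand-in for aperture)
    levels      : number of discrete blur levels to quantise into
    """
    # Defocus grows with distance from the focal plane, roughly in
    # inverse depth, as a thin-lens circle of confusion does.
    coc = np.abs(1.0 / depth - 1.0 / focus_depth) * focus_depth * strength
    coc = np.clip(coc, 0.0, strength)

    # Quantise the blur into a few levels and pick per pixel.
    sigmas = np.linspace(0.0, strength, levels)
    idx = np.clip(np.searchsorted(sigmas, coc), 0, levels - 1)

    out = image.copy()
    for i in range(1, levels):
        mask = idx == i
        if mask.any():
            blurred = gaussian_filter(image, sigma=(sigmas[i], sigmas[i], 0))
            out[mask] = blurred[mask]
    return out
```

Quantising the blur into a few pre-filtered levels keeps the cost close to a handful of full-frame Gaussian passes, which is what makes per-pixel refocus of this kind plausible at video rates.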

In depth

The zLense unit contains two small RGB cameras, an inertial measurement unit to record the acceleration of camera movement, and three Time-of-Flight depth sensors. The latter, developed by Bluetechnix, fire infrared light into a scene to measure the distance between the lens and the objects in front of it. According to Komenczi, the Time-of-Flight sensors interfere neither with the actual lighting setup nor with the light hitting the camera sensor. The depth information is recorded by the system in parallel with the image, which is recorded – in HD, 2K, 4K or beyond – by the camera as normal. The real-time 3D composite can be viewed by a director on a monitor.
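Continuous-wave Time-of-Flight sensors of this kind typically recover depth from the phase shift between the emitted and returned modulated infrared light. The sketch below shows the standard four-sample demodulation, d = c·Δφ / (4π·f_mod); the 20 MHz modulation frequency and the sample names are illustrative assumptions, not Bluetechnix specifications.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def tof_depth(a0, a1, a2, a3, f_mod=20e6):
    """Per-pixel depth from a four-phase continuous-wave ToF measurement.

    a0..a3 : (H, W) arrays of correlation samples taken at 0, 90, 180
             and 270 degrees of the modulation signal
    f_mod  : modulation frequency in Hz (20 MHz assumed here; the
             sensor's real frequency sets the unambiguous range)
    """
    phase = np.arctan2(a3 - a1, a0 - a2)        # phase shift of the return
    phase = np.mod(phase, 2 * np.pi)            # wrap to 0..2*pi
    return C * phase / (4 * np.pi * f_mod)      # depth in metres

# At 20 MHz the unambiguous range is c / (2 * f_mod), about 7.5 m; longer
# working ranges are usually resolved by combining two modulation frequencies.
```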

Zinemath has its R&D centre in Budapest. Komenczi, a former TV director, initiated development in 2010 and gained investment from Luxembourg's Docler Group in 2012.

Gyorgy is also COO at Docler Entertainment and a former EVP of production services at Budapest’s Korda Studios.

“Depth sensing cameras may well support everyday news and sports broadcasting as a result of their additional dimensions and the simplicity of their use since they do not require a studio environment or any complex calibration,” he added.

“They may allow informative and spectacular presentations as well as a set of graphics applications that bring flat diagrams and explanatory illustrations to life. The border between the real and generated content may be blurred.”
