Applied Technology: Shotoku Free-d² & Broadcast Graphics

The use of on-screen graphics to add value and content to a broadcast is a longstanding production technique. The integration of interactive graphics into a live broadcast, although a more recent addition, is now a staple of news and sports productions worldwide.

In this live environment, for the interaction between a presenter and their ‘augmented reality’ to be both convincing and effective, the challenge has always been to track and capture the broadcast camera’s positional data accurately and reliably, so that rendered graphical content can be composited seamlessly onto the video feed. Breakdowns in this process during live playout, or accumulated errors in positional data, quickly become evident to even an untrained viewer. Besides being professionally unacceptable, such failures call into question the value of the technology in helping broadcasters tell their story. A tracking system must also be flexible and user friendly, and must not unnecessarily impede the operation of the camera.

Free-d² is the second generation of ‘absolute’ tracking technology that Shotoku Broadcast Systems is developing in partnership with BBC R&D in the UK. Building on the original system with refreshed, technologically advanced hardware, Free-d² calculates the camera’s absolute position from externally mounted fixed markers and communicates this to a graphics engine in real time. Because it is absolute, the system is immune to positional ‘drift’, can be deployed on any form of camera support - tripod, pedestal, jib or, indeed, handheld - and has virtually no impact on the operation of the camera chain.

The spotter camera points upward, its LEDs illuminating the reference location disks mounted on the studio ceiling. From these, the system calculates not only the camera’s X, Y and height positions, but also pan, tilt and roll, without the need for supplementary data from accelerometers or encoders.

The system works by viewing the reflected light from a series of markers mounted at two distinct heights in the studio ceiling space, using an upward-facing video camera. The ceiling markers, each with a unique ID, are made from a highly retro-reflective material. This ensures that light from the spotter camera’s LEDs is reflected efficiently straight back toward the camera, eliminating any impact from other bright light sources in the studio.
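To illustrate the first stage of such a pipeline, the sketch below shows how bright retro-reflective returns might be isolated and located in a spotter-camera frame. It is a minimal illustration using OpenCV in Python; the threshold and blob-size values are assumptions chosen for clarity, not Shotoku’s actual parameters or implementation.

```python
import cv2

def find_marker_centroids(frame_gray, min_area=20):
    """Locate bright retro-reflective marker blobs in a spotter-camera frame.

    Returns a list of (x, y) centroids in image coordinates. Threshold and
    area values are illustrative assumptions only.
    """
    # Retro-reflected LED light is far brighter than the studio background,
    # so a simple fixed threshold isolates the markers.
    _, mask = cv2.threshold(frame_gray, 200, 255, cv2.THRESH_BINARY)

    # Connected components give one blob per visible ceiling marker (OpenCV 4).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    centroids = []
    for c in contours:
        if cv2.contourArea(c) < min_area:
            continue  # reject noise and small specular glints
        m = cv2.moments(c)
        centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids
```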

During installation, a studio calibration file is created containing the precise X, Y and height coordinates of each marker. During operation, the video feed from the spotter camera is analysed against this calibration file and, from it, the precise position of the broadcast camera in 3D space is calculated with a minimal delay of just one to two frames. This is combined with zoom and focus data from the lens and then distributed to the graphics rendering engine via industry-standard protocols, timed to the video output.
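The geometry behind that calculation can be sketched as a Perspective-n-Point problem: given the markers identified in the spotter image and their known 3D coordinates from the calibration file, recover the camera’s rotation and translation. The example below uses OpenCV’s generic solver in Python as an illustrative stand-in, with a hypothetical JSON calibration file; the real system’s solver, file format and tolerances are not published here.

```python
import json
import cv2
import numpy as np

def solve_spotter_pose(matched_ids, image_points, calib_path, K, dist):
    """Estimate the spotter camera's position and orientation in studio space.

    matched_ids  -- marker IDs recognised in this frame
    image_points -- their (x, y) centroids in the spotter image
    calib_path   -- hypothetical JSON file: {"17": [x, y, z], ...} in metres
    K, dist      -- intrinsics of the spotter camera (assumed pre-calibrated)
    """
    with open(calib_path) as f:
        studio = json.load(f)

    object_points = np.array([studio[str(i)] for i in matched_ids], dtype=np.float64)
    image_points = np.array(image_points, dtype=np.float64)

    # Perspective-n-Point: recover the rotation and translation that map the
    # known 3D marker coordinates onto their observed 2D image positions.
    ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
    if not ok:
        raise RuntimeError("pose could not be recovered from visible markers")

    R, _ = cv2.Rodrigues(rvec)
    # Camera position in studio coordinates (X, Y, height).
    camera_position = (-R.T @ tvec).ravel()
    return camera_position, R
```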

A key benefit of the Free-d² system is that because the calibration file describes the studio, not an individual camera, the same file can be deployed across all current and any future camera channels. The file can remain valid for the duration of the installation, which makes the system extremely low maintenance. A new calibration is only required if a significant number of marker positions change.

Because each marker has a unique ID, and the markers are installed on two planes, the positional calculations accurately and reliably report not just the camera’s X, Y and height positions but also pan, tilt and roll, without the need for supplementary data from accelerometers or encoders. No intervention or operational input is required from the broadcast camera operator, which keeps integration into workflows very simple.
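For illustration, the rotation recovered in the earlier sketch can be decomposed into pan, tilt and roll angles. The snippet below assumes a Z-up studio frame and a pan-tilt-roll (yaw-pitch-roll) axis ordering; the actual Free-d² axis and sign conventions may differ.

```python
import numpy as np

def rotation_to_pan_tilt_roll(R):
    """Decompose a rotation matrix into pan, tilt and roll in degrees.

    Assumes R = Rz(pan) @ Ry(tilt) @ Rx(roll) with a Z-up studio frame;
    real systems may define axes and signs differently.
    """
    pan = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
    tilt = np.degrees(np.arcsin(-R[2, 0]))
    roll = np.degrees(np.arctan2(R[2, 1], R[2, 2]))
    return pan, tilt, roll
```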

A Free-d² equipped studio affords broadcasters the opportunity to enrich programming with live, real-time graphical content. This could be in a full green screen environment, a format popular for data-intensive, one-off broadcasts such as election night specials. A full green screen studio with reliable tracking can also be used to produce multiple, individually branded shows from a single location, with set changes available at the push of a button.

More common recently, and perhaps a tougher test of any camera tracking system, is the option to augment a traditional set with graphical content played out into defined areas of the physical set when relevant - for example, a virtual video screen. This technique has become more common as sports productions, and more recently light entertainment shows, have begun to explore the benefits of the Virtual Studio. For this to add value to a broadcast - and not detract from it - graphical content must be placed accurately and remain stable during on-air camera moves.

Through the use of modern technologies, Shotoku Broadcast’s Free-d² seeks to build on the original system’s reputation as one of the most consistently accurate and reliable tracking systems available, and as such provides the perfect platform for this type of programming. Its accurate positional reporting allows full and credible engagement with virtual set elements. Because it relies on external, fixed positional markers, the positional data will not drift, ensuring dependable, repeatable performance time after time.

Finally, by removing the need for regular recalibration, the system minimises maintenance demands. And because the spotter camera is operationally independent of the broadcast chain, the tracking system is both unobtrusive and undemanding on the camera operator, allowing them to focus fully on making the shots.

David Shepherd, Sales Manager, Manual & Virtual Studio, Shotoku
