Television production these days is tricky enough without adding virtual elements like augmented reality (AR) graphics, but that’s exactly what Taipei-based production company Getop did for the live telecast of the 2020 Golden Melody Awards (GMA). The highly rated annual televised awards ceremony - considered the “Asian Grammys” by many - celebrates top musical talent from across Southeast Asia.
Originally scheduled for June 27th, the ceremony was rescheduled to October 3rd, due to the pandemic, and moved to the outdoor Taipei Music Center. It was broadcast to high ratings on the Taiwan Television Channel and streamed live, with Taiwanese talent taking home many of the top prizes.
AR Takes Center Stage
The event featured a surprise performance by Taiwanese artist Hebe Tien, supported by a large videowall backdrop and a number of AR effects floating across the stage and around the star. The complex AR segment presented a number of technical challenges in making the virtual effects work for viewers at home. Beyond the immense logistics - coordinating hundreds of people from different companies and departments - the AR portion of the production took a lot of technical coordination and finessing to get right. Paul Chou, CEO of Getop, said his team began preparing a month and a half before the event and spent much of that time rehearsing to ensure everything worked as desired.
He said the key to making the AR effect believable and seamless during Tien’s performance was synchronizing the live feeds from each camera angle in real time using a combination of software and hardware. The timing had to be perfect or the stunning effect would be lost on viewers, which made for intense pressure during production.
“The high risk of live television made this production a high-pressure working environment because it’s either rated 0 or 100 at the end of the day,” said Chou. “AR brings these richly produced images to the production and opens up more opportunities for creativity. It satisfies the needs of the new generation that is used to seeing content from the internet that contains a lot of effects.”
The Getop team used a variety of 12G equipment from AJA Video Systems in tandem with gear from other vendors to make it happen. For the event they built a flexible live AR production system that included a Dell 7920 workstation, an AJA Corvid 88 card for audio and video I/O, Pixotope - a software-based AR graphics solution running on COTS hardware - and Epic Games’ Unreal Engine for real-time rendering and compositing. An AJA GEN10 sync generator handled synchronization of the camera feeds, while an HD10CEA HD/SD-SDI converter split the feed in two to generate additional signals for the effect.
Pre-production testing revealed time delay issues in the Unreal Engine workflow, so an AJA Ki Pro Ultra 12G multichannel HD recorder, which supports the Apple ProRes codec, was brought in to compensate. Images were composited in real time and output to an OB truck parked on site. With the Ki Pro Ultra 12G’s multichannel HD recording capabilities, the crew was able to capture the output of the AR rendering engine alongside the other live production feeds coming from the OB truck and the cameras. This was critical, since each signal carried a different delay that had to be adjusted for the AR effect to work.
Chou said these signal delays had several causes, including the continuous heavy data processing Unreal Engine performs to position high-fidelity AR effects precisely. But the most noticeable delay, he said, came from the signal passed through the broadcast systems and captured with the AJA recorder. Because the live camera outputs often carried longer delays than the PGM output, a noticeable lag appeared when the AR elements had to stay in sync with the musical performance.
“When we received the signal from the truck to monitor the program feed, it contained some delay problems,” said Chou. “For all of these signals, where we were getting different delay times, the Ki Pro Ultra 12G made it easy for us to record all the signals in sync, without having to use genlock as a reference. This helped reduce a lot of time spent managing the image files.”
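The alignment problem Chou describes can be pictured in a few lines. The sketch below is purely illustrative, not Getop’s actual pipeline: it assumes each feed is a sequence of frames and that the relative delay of each source (in frames) has been measured, for example from recordings captured in sync. Aligning the feeds then amounts to trimming each source’s stale leading frames:

```python
# Hypothetical sketch: align feeds that carry different fixed delays.
# Feeds are lists of frames; delays[name] is how many frames that feed
# lags behind real time (values here are illustrative, not measured).

def align_feeds(feeds, delays):
    """Trim each feed so index i shows the same moment in every feed."""
    # Normalize so the least-delayed feed drops zero frames.
    base = min(delays.values())
    trimmed = {name: frames[delays[name] - base:]
               for name, frames in feeds.items()}
    # Truncate all feeds to the shortest aligned length.
    n = min(len(frames) for frames in trimmed.values())
    return {name: frames[:n] for name, frames in trimmed.items()}

# A camera feed with no delay and a PGM return delayed by two frames:
cam = [0, 1, 2, 3, 4, 5]        # frame content = event number
pgm = [None, None, 0, 1, 2, 3]  # same events, arriving two frames late
aligned = align_feeds({"cam": cam, "pgm": pgm}, {"cam": 0, "pgm": 2})
# aligned["cam"] == aligned["pgm"] == [0, 1, 2, 3]
```

In a real broadcast chain the delays would be measured per source and compensated in hardware or in the compositing software, but the bookkeeping is the same: every path into the composite must be shifted to a common timebase.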
For routing, Getop chose an AJA KUMO 3232-12G router, which handled all of the various audio and video sources and automatically distributed them to the appropriate destinations within the venue and outside, using a number of preset parameters. The Getop crew also used a host of AJA Mini-Converters to convert the SDI signals to HDMI for the inputs of the large monitors on stage behind Tien as she sang. The AR elements appeared to flow out of the 2D screens as 3D graphics and land lightly on the stage before disappearing. These effects were only visible to viewers at home and on smaller monitors mounted throughout the venue.
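Preset-driven routing of this kind can be thought of as a stored lookup from destination to source that is recalled in one step. The sketch below is only illustrative: the port names and presets are invented, and a real router such as the KUMO is configured through its own web UI or control protocol, not like this.

```python
# Illustrative preset routing tables: destination -> source, per preset.
# All names are invented for this sketch, not the actual GMA setup.
PRESETS = {
    "show": {
        "videowall_left": "ar_composite",
        "videowall_right": "ar_composite",
        "ob_truck_in_1": "camera_1",
        "ob_truck_in_2": "camera_2",
    },
    "rehearsal": {
        "videowall_left": "test_pattern",
        "videowall_right": "test_pattern",
        "ob_truck_in_1": "camera_1",
        "ob_truck_in_2": "camera_2",
    },
}

def apply_preset(name):
    """Return the full routing state a preset would establish."""
    return dict(PRESETS[name])

routes = apply_preset("show")
# routes["videowall_left"] == "ar_composite"
```

The value of presets in live production is exactly this atomicity: every destination is repatched in a single recall, rather than one crosspoint at a time under time pressure.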
Grading For Rec.709 Color Space
Tien’s AR performance was recorded on a Ki Pro Ultra 12G for archival purposes and monitored on an EIZO CG319X Grade One monitor working in the SDR Rec.709 color space. The reference monitor was used extensively to check for correct color rendition.
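For reference, SDR monitoring in Rec.709 assumes the ITU-R BT.709 opto-electronic transfer function, which maps scene-linear light to the encoded signal value. A minimal sketch of that standard curve:

```python
def rec709_oetf(linear):
    """ITU-R BT.709 OETF: scene-linear light (0.0-1.0) to encoded signal.

    Linear segment below 0.018 avoids infinite gain near black;
    above it, a 0.45 power curve applies.
    """
    if linear < 0.018:
        return 4.5 * linear
    return 1.099 * linear ** 0.45 - 0.099

# rec709_oetf(0.0)  -> 0.0
# rec709_oetf(1.0)  -> ~1.0 (since 1.099 - 0.099 == 1.0)
# rec709_oetf(0.01) -> ~0.045 (linear segment)
```

Checking rendered AR graphics on a reference monitor in this space matters because Unreal Engine works internally in linear light; any mismatch in the transfer function shows up as crushed blacks or washed-out highlights on air.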
At the end of the day, Chou said, using technology that is certified to work together makes networking and production workflows run more smoothly because the equipment all speaks the same protocol language. That interoperability saves setup time on-site during a live production, but experience and familiarity with how the technology operates count most.
“Virtual production is a mix of the real world and the virtual world, so you must master the techniques from both and accumulate a big amount of experience in order to get it right,” said Chou, adding that deploying a comprehensive workflow built using technology from AJA Video Systems for this project helped ensure that everything worked together on the same local area network. “At the end of the day, you’ll need a high level of experience to present the art that has been created here and a good technology partner like AJA.”