Asia’s Golden Melody Awards Feature Stunning AR-Supported Performance

Television production is tricky enough without adding virtual elements like augmented reality (AR) graphics, but that’s exactly what Taipei-based production company Getop did for the live telecast of the 2020 Golden Melody Awards (GMA). The highly rated annual televised awards ceremony, considered the “Asian Grammys” by many, celebrates top musical talent from across Southeast Asia.

Originally scheduled for June 27th, the ceremony was rescheduled to October 3rd due to the pandemic and moved to the outdoor Taipei Music Center. It was broadcast to high ratings on the Taiwan Television channel and streamed live, with Taiwanese talent taking home many of the top prizes.

AR Takes Center Stage

The event featured a surprise performance by Taiwanese artist Hebe Tien, supported by a large videowall backdrop and AR effects floating across the stage and around the star. The complex AR segment presented a number of technical challenges in making the virtual effects work for viewers at home. Beyond the immense logistics of working with hundreds of people from different companies and departments, the AR portion of the production took a lot of technical coordination and finessing to get right. Paul Chou, CEO of Getop, said the team began preparing a month and a half before the event and spent much of that time rehearsing to ensure everything worked as desired.

KUMO 3232-12G router.


He said the key to making the AR effect believable and seamless during Tien’s performance was synchronizing the live feeds from each camera angle in real time using a combination of software and hardware. The timing had to be perfect or the stunning effect would be lost on viewers, which made for intense pressure during production.

“The high risk of live television made this production a high-pressure working environment because it’s either rated 0 or 100 at the end of the day,” said Chou. “AR brings these richly produced images to the production and opens up more opportunities for creativity. It satisfies the needs of the new generation that is used to seeing content from the internet that contains a lot of effects.”

The Getop team used a variety of 12G equipment from AJA Video Systems in tandem with gear from other vendors to make it happen. They developed a flexible live AR production system for the event that included a Dell 7920 workstation, an AJA Corvid 88 card for audio and video I/O, Pixotope software (a software-based solution running on COTS hardware, used to create the AR graphics), and Epic Games’ Unreal Engine for real-time rendering and compositing. An AJA GEN10 sync generator handled synchronization of the camera feeds, while an HD10CEA HD/SD-SDI converter split the feed in two to generate additional signals for the effect.

HD10CEA HD/SD-SDI converter.


Pre-Production Challenges

During pre-production there were time delay issues in the Unreal Engine workflow, and an AJA Ki Pro Ultra 12G multichannel HD recorder, which supports the Apple ProRes codec, was used to solve them. Images were composited in real time and output to an OB truck parked on site. With the Ki Pro Ultra 12G’s multichannel HD recording capabilities, the crew was able to capture the output of the AR rendering engine alongside the other live production feeds coming from the OB truck and the cameras. This was critically important, since the delay of each signal differed and had to be adjusted for the AR effect to work.

Overcoming Delay

Chou said these signal delays had several causes, including the continuous heavy data processing Unreal Engine performs to precisely position high-fidelity AR effects. The most noticeable delay, he said, came from the signal passed through the broadcast systems and captured with the AJA recorder. Because the live camera outputs often carried longer delays than the PGM output, a noticeable lag appeared when the AR elements had to be produced in sync with the musical performance.

“When we received the signal from the truck to monitor the program feed, it contained some delay problems,” said Chou. “For all of these signals, where we were getting different delay times, the Ki Pro Ultra 12G made it easy for us to record all the signals in sync, without having to use genlock as a reference. This helped reduce a lot of time spent managing the image files.”
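The alignment problem Chou describes can be sketched in a few lines. This is a hypothetical illustration, not Getop’s actual tooling: assume each feed’s latency has been measured in frames during rehearsal, and all feeds are recorded in sync; the feed with the most delay captured the oldest content first, so it needs the most leading frames trimmed before the files line up.

```python
# Illustrative per-feed latencies in frames; the feed names and values
# are assumptions for the example, not figures from the production.
measured_delay_frames = {
    "camera_1": 4,   # live camera path through the broadcast chain
    "camera_2": 4,
    "ar_engine": 7,  # Unreal Engine rendering adds processing latency
    "pgm": 2,        # program output from the OB truck
}

def trim_offsets(delays):
    """Return how many leading frames to drop from each recorded feed
    so that frame 0 of every file shows the same real-world instant.
    A feed with more delay recorded older content first, so it must
    drop more frames relative to the fastest feed."""
    fastest = min(delays.values())
    return {name: d - fastest for name, d in delays.items()}

offsets = trim_offsets(measured_delay_frames)
# e.g. the PGM feed (least delayed) keeps all its frames, while the
# AR engine output drops the difference between its delay and PGM's.
```

Recording everything on one multichannel device, as the crew did, is what makes this simple: a single start point means only the relative delays matter, not absolute timecode.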

For routing, Getop chose an AJA KUMO 3232-12G router, which handled all of the various audio and video sources and automatically distributed them to the appropriate destinations, within the venue and outside, using a number of preset parameters. The Getop crew also used a host of Mini-Converters to convert the SDI signals to HDMI for the inputs of the large monitors on stage behind Tien as she sang. The 3D AR elements appeared to flow out of the 2D screens and land lightly on the stage before disappearing, effects visible only to viewers at home and on the smaller monitors mounted throughout the venue.
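Preset-based routing of the kind described above amounts to storing whole crosspoint maps and recalling them in one step. The sketch below is a generic illustration of that idea, with made-up port numbers and preset names; the real KUMO router is configured through AJA’s own control interfaces, not this code.

```python
# Hypothetical presets: each maps a source port to the destination
# ports it should feed. Port numbers and names are illustrative only.
presets = {
    "performance": {1: [5, 6], 2: [7], 3: [8, 9]},
    "rehearsal":   {1: [5], 4: [7, 8]},
}

def recall_preset(name, presets):
    """Flatten a preset into a destination -> source crosspoint map.
    On a router, each destination listens to exactly one source, so
    the result is keyed by destination."""
    routes = {}
    for source, dests in presets[name].items():
        for dest in dests:
            routes[dest] = source
    return routes
```

Recalling a preset then becomes one operation during the show, rather than a series of manual patches under time pressure.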

Grading For REC.709 Color Space

Tien’s AR performance was recorded on a Ki Pro Ultra 12G recorder for archival purposes and monitored on an EIZO CG319X Grade One monitor working in the SDR REC.709 color space. The reference monitor was used extensively to check for correct color rendition.
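Monitoring in the REC.709 color space means the signal has been encoded with the BT.709 opto-electronic transfer function before display. As a small aside on what that standard defines, the curve itself is short enough to write out (this is the published ITU-R BT.709 formula, not anything specific to this production):

```python
def rec709_oetf(L: float) -> float:
    """ITU-R BT.709 opto-electronic transfer function: map linear scene
    light L in [0.0, 1.0] to the non-linear signal value shown on a
    Rec.709 reference monitor. Linear segment below 0.018, power-law
    segment (exponent 0.45) above it."""
    if L < 0.018:
        return 4.5 * L
    return 1.099 * (L ** 0.45) - 0.099
```

A Grade One reference monitor such as the CG319X applies the matching display-side behavior, which is why it can be trusted to confirm that what viewers at home will see matches the graded intent.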

At the end of the day, Chou said, using technology that is pro-certified to work together makes networking and production workflows run smoother because the equipment all speaks the same protocol language. That interoperability saves setup time on site during a live production and helps ensure success, but experience and familiarity with how the technology operates count most.

Ki Pro Ultra 12G multichannel HD recorder.


“Virtual production is a mix of the real world and the virtual world, so you must master the techniques from both and accumulate a big amount of experience in order to get it right,” said Chou, adding that deploying a comprehensive workflow built using technology from AJA Video Systems for this project helped ensure that everything worked together on the same local area network. “At the end of the day, you’ll need a high level of experience to present the art that has been created here and a good technology partner like AJA.”

