Esports Expands Audiences Using Broadcast IP Production & Distribution – Part 2 – The IP Technology
Esports viewership worldwide is on a steep upward trajectory and will soon begin to challenge traditional sports broadcast audience figures. As the esports and traditional sports communities converge, what can traditional broadcasters learn from the remote production workflows being pioneered by one of esports’ major game developers? In part 2 of this two-part series, we look at distribution platforms and infrastructure.
The 2018 LoL Worlds were broadcast in 19 different languages across 30 platforms and television channels, including Twitch, YouTube, ESPN+, OGN, Facebook, Syfy and TNT Sports. How Riot typically produces and distributes its tournaments is potentially a signpost to the future of sports television broadcasting.
Riot Games is headquartered on a 20-acre campus in West Los Angeles, where one of its buildings houses a fully equipped broadcast facility; the company also maintains 23 other offices worldwide. Riot has been employing REMI or at-home remote audio and video production workflows for several years, producing broadcasts of the League of Legends Championship Series (LCS) in North America and Europe, the international Mid-Season Invitational (MSI) and the League of Legends World Championship from its L.A. facility.
Production of European competition broadcasts has been transitioned to the LoL European Championship (LEC) studio, which was built out at Riot’s location in Berlin, one of Europe’s major esports hubs, to which the company moved in 2014. The facility features a Calrec Artemis console in the broadcast audio production room while a Calrec Brio acts as a monitor console for the players, with a CEDAR system eliminating distracting crowd noise from their headsets.
League of Legends is a free-to-play GaaS or game-as-a-service, with approximately 120 million monthly active players and around 25 million playing daily. The largest share of players, roughly 25 percent, is in western Europe, according to server traffic, with Korea only fractionally behind that figure. Nordic and eastern European players combined account for about 13 percent.
The GaaS model enables the developer to keep its fans engaged through constant improvements and updates, charging only for premium content. To support that cloud-based model, the company has built out a worldwide private network, the Riot Direct WAN, which interconnects its servers and offices with a broadband pipe. Part of that network’s bandwidth is reserved for production traffic, which is what allows Riot to remotely produce and broadcast events at venues around the world from its L.A. facility.
With all aspects of Riot’s broadcast production, the name of the game is constant evolution and iteration in the service of efficiency and resilience. “We’ll do the same show 15 times and never once do it the same way,” says Matthew Donovan, broadcast engineering manager at Riot’s West L.A. production facility. “There’s always a way to improve. There’s always a change that’s going to deliver more value.”
Just as broadcast producers have had to find unique workflows and presentation solutions for individual traditional sports, so too must video game developers find a way to best present their specific competitions. “It’s unique to each game. You have to know the game, the capabilities and the limitations, and be embedded with the company that makes that game, to be able to visually represent that game in an engaging way for its fans,” says Donovan. “The technical obstacles you need to overcome will be different given different circumstances. The critical part is having a team embedded with the individuality of the games and experiences they’re trying to recreate and working with those challenges to create a good technical solution.” Happily, he says, solutions manufacturers are making the necessary technologies more accessible. “Things that were more challenging are now a lot easier to accomplish,” he says.
The organization is less conservative than traditional broadcast operations, not least because the personnel, like the game’s players, are generally younger and, with a few engineering exceptions, are not from a television broadcast culture. “We don’t want to keep doing the same thing. That’s not winning; that’s failure to us,” says Donovan. “If we’re not delivering more, pushing things, going for something that’s more engaging, then we’re failing. This outlook is great when everything goes to plan but we recognize the potential to faceplant. We take failover testing very seriously and it’s a critical part of our tech rehearsals on these international shows. Failure is okay if it’s on a path to progression, but it’s better to fail gracefully and not negatively impact the viewer experience.”
Like the competitions themselves, Riot’s LoL broadcast production is a team sport, and one of the key players is the IT department. When an international event is planned for Vietnam and Taipei, for instance, as in the case of the May 2019 MSI competition, the company’s network engineers create a direct pathway between a POP (point of presence) at a nearby Riot Games office — there is one in Ho Chi Minh City and another in Shanghai, for example — and the Los Angeles POP over the enterprise WAN. Riot’s network engineers handle all the planning, routing, switching, InfoSec data security, monitoring and management of the infrastructure for each event. They also contract with a local or regional service provider to connect over the last mile to the venue.
“We always try to have redundant lines, even two different vendors, where possible,” says Donovan. “We try to add as much resiliency into the signal path as we can.” Should both paths fail—if internet connectivity is completely lost—a bonded cellular path is typically in place as a failsafe.
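The path hierarchy Donovan describes — two fiber lines from different vendors, with bonded cellular as a last-ditch fallback — can be sketched as a simple priority list. This is an illustrative sketch, not Riot's actual routing logic; the path names are assumptions.

```python
# Ordered failover list: primary vendor line, secondary vendor line,
# then bonded cellular as the failsafe of last resort.
FAILOVER_ORDER = ["primary_fiber", "secondary_fiber", "bonded_cellular"]

def select_path(link_up):
    """link_up maps path name -> bool (is the link healthy?).
    Return the first healthy path in priority order."""
    for path in FAILOVER_ORDER:
        if link_up.get(path):
            return path
    raise RuntimeError("all transport paths are down")

# Both fiber lines lost, cellular still alive:
print(select_path({"primary_fiber": False,
                   "secondary_fiber": False,
                   "bonded_cellular": True}))   # bonded_cellular
```

In practice this decision would be made by routing protocols or an SD-WAN controller rather than application code, but the priority ordering is the same.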
At-Home Production Efficiencies
Remote trucks have essentially been eliminated from Riot’s production workflow, a transition to an at-home paradigm that has been accelerated by the recent introduction of various solutions from a handful of manufacturers. That transition has also been driven by Riot’s simple philosophy: “We try to keep as many people home as possible so they can sleep in their beds,” says audio engineer Dave Talavera, a veteran of the broadcast industry who worked for NFL Films during the early 2000s.
Consequently, the audio production mix is handled from Los Angeles. On-site cameras can be controlled, switched and shaded from Riot’s facility, and the director calls the show from L.A. Video packages created by Riot’s L.A. producers are played back during the show from the facility. While players watch the game on their individual screens, three observers at Riot L.A. can position themselves anywhere within the game, following along and selecting action for replay in the broadcast from within the system, without disturbing the competitors, as well as feeding segments to EVS for replay.
For previous championship events, a mix-minus world feed was generated in L.A. from the English-language version of the show and fed to distribution partners to add commentary in their respective languages, either in-venue or at their own facilities. But for the South Korea Worlds, the international feed was generated using a different switcher and a second audio room at the L.A. facility and distributed to every international partner, including the main audio studio in the same building. Riot’s English-language announcers called the action from a TV studio just down the hallway rather than from the venue.
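A mix-minus feed, as used for those earlier championship events, is simply the program mix with one stem — here the English commentary — left out, so each partner can layer in its own language. A minimal sketch of the idea, with toy integer sample values and illustrative stem names:

```python
def mix_minus(sources, exclude):
    """Sum all stems sample-by-sample, leaving out the excluded one."""
    length = len(next(iter(sources.values())))
    return [sum(s[i] for name, s in sources.items() if name != exclude)
            for i in range(length)]

# Toy two-sample stems (integer levels keep the arithmetic exact):
stems = {
    "game":    [2, 2],   # in-game sound
    "crowd":   [1, 1],   # audience mics
    "english": [5, 0],   # English-language commentary
}

world_feed = mix_minus(stems, exclude="english")
print(world_feed)   # [3, 3] — everything except the English commentary
```

A real console generates one such bus per destination; the principle is identical, just at broadcast sample rates and channel counts.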
Certain functions are still handled locally at the venues. For the time being, an engineer at the venue mixes IFBs for the talent and competitors, who are on Dante-enabled wired packs from Studio Technologies. That mixer also adds redundancy, with the ability to generate and distribute a feed from his desk should all else fail. But Riot has also used new REMI audio products such as Calrec’s RP1 remote production unit, controlled from a Calrec Artemis mixing console at the L.A. facility, for a couple of LoL championship shows in Europe. The RP1 includes DSP that allows latency-free IFBs to be generated at the remote site. Riot Games has installed a Calrec Artemis and a Calrec Brio at its Berlin esports complex in Germany.
Any company interested in reducing expenses related to its remote productions is likely considering at-home workflows. “I don’t want to take away jobs but there are some things that make sense,” says Talavera. “We now do roughly 12 remote large-arena shows a year. If you account for airfare, hotel, food, wages and equipment rental, that one IFB mix position is easily $50,000 to $60,000.”
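Talavera's $50,000–$60,000 figure for a single on-site IFB position is easy to sanity-check. The line items below are illustrative assumptions, not Riot's actual costs; only the show count and the total range come from the quote above.

```python
# Back-of-envelope check: one travelling IFB mix position across a season.
shows_per_year = 12          # "roughly 12 remote large-arena shows a year"
per_show = {                 # assumed per-show line items (illustrative)
    "airfare": 1200,
    "hotel": 1400,           # several nights near the venue
    "per_diem": 400,         # food and incidentals
    "wages": 1200,
    "equipment_rental": 500,
}

annual = shows_per_year * sum(per_show.values())
print(f"~${annual:,} per year")   # ~$56,400 — inside the quoted $50k-$60k range
```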
Signals and Transports
A typical international event includes 30 or more inbound live 1080p60 video signals to Los Angeles plus 10 to 15 outbound, together with an average of 40 inputs of audio from the remote site. Ten cameras are dedicated to the two competing five-person teams. During finals, which include opening ceremonies featuring live entertainment and musical performances, audio will fill a 64-channel MADI stream. “We typically put out eight to 10 audience mics and we have camera mics—the same stuff you would see in a traditional sports broadcast,” says Talavera.
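A 64-channel MADI stream is a substantial chunk of data on its own. Using the standard AES10 MADI parameters (64 channels, 48 kHz, 24-bit), the raw audio payload works out as follows; the figures are textbook MADI values, not measurements from Riot's plant.

```python
channels = 64          # one full MADI stream
sample_rate = 48_000   # Hz, typical broadcast sample rate
bit_depth = 24         # bits per sample

payload_bps = channels * sample_rate * bit_depth
print(f"{payload_bps / 1e6:.1f} Mb/s of raw audio payload")   # 73.7 Mb/s
```

The MADI link itself runs at a fixed 125 Mb/s regardless of how many channels carry signal, which is part of why uncompressed tielines eat bandwidth so quickly compared with compressed contribution feeds.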
Fig 2 – control center.
Until two years ago, as the Berlin regional operation began to come online, the L.A. facility would handle productions for North America and Europe from Wednesday through Sunday every week for 40 weeks a year. During regional playoffs, there could be arena shows, each up to six hours long, on two continents back to back. “It was so much fun,” says Talavera. “When the first show is over, toss to commercial, three and a half minutes later, press a couple of buttons and it’s switched over. Another crew would walk in during the break. It was like a Swiss watch.”
Previously, signal distribution was handled by an Evertz ATP (Advanced Optical Transport Platform) over SONET (Synchronous Optical NETwork) pipes. That system could handle six MADI streams, says Talavera, including primary and backup streams for audio and for comms. A primary and a backup MADI stream from the second venue enabled the crew to prep that show’s broadcast before switching over, he says. Since the L.A. facility now has an in-house Riedel comms system, productions use VoIP to communicate with the remote location rather than sending MADI tielines.
“To attach ourselves to a manufacturer that had to have bigger pipes limited us,” he says, so Riot has iterated its at-home production infrastructure, at the same time adopting a Haivision transport that significantly reduces bandwidth requirements. Instead of taking up 8.7 Gb/s over SONET lines using JPEG 2000 compression, the bandwidth has been reduced to 1.3 Gb/s, says Talavera, while handling additional paths and encoders. “At Worlds in South Korea we did upwards of 40 encoders in under 1.5 gigs. It gives the production way more flexibility.”
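The numbers Talavera quotes imply both the overall compression gain and a rough per-encoder budget. A quick check of the arithmetic, taking the figures above at face value:

```python
old_bandwidth_gbps = 8.7   # JPEG 2000 contribution over SONET
new_bandwidth_gbps = 1.3   # Haivision-based transport
encoders = 40              # "upwards of 40 encoders"
budget_gbps = 1.5          # "in under 1.5 gigs"

reduction = old_bandwidth_gbps / new_bandwidth_gbps
per_encoder_mbps = budget_gbps * 1000 / encoders
print(f"~{reduction:.1f}x bandwidth reduction")        # ~6.7x
print(f"~{per_encoder_mbps:.1f} Mb/s per encoder")     # ~37.5 Mb/s
```

Around 37 Mb/s per 1080p60 feed is in the range typical of high-quality H.264 contribution encoding, which squares with the codec discussion that follows.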
IP and Codec Issues
At the venue, audio is sent through a router and into a Nevion multi-format contribution codec, which encapsulates the MADI stream. In L.A., the incoming signals are routed to the Nevion decoders then into the mixing console. Video signals are converted to baseband and fed into the switcher. “It’s no different than having a truck,” says Talavera.
Processing time through the H.264 video encoders is longer than through the audio equivalents, so Talavera and the Riot engineers use the delay capabilities of the Nevion decoders in combination with Lawo’s V__remote4 IP remote production processors to resynchronize sound and picture on a per-channel basis or across an entire stream. The Nevion has a 200 ms buffer, with the Lawo offering an additional 320 ms. On one recent remote, Riot’s engineers had to introduce nearly 400 ms of delay in the worst case, Talavera says.
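The lip-sync math here is straightforward: delay the faster audio path by the difference between the video and audio latencies, and check the result fits inside the available buffers. A sketch under assumed latency figures (only the 200 ms and 320 ms buffer sizes and the ~400 ms worst case come from the text):

```python
NEVION_BUFFER_MS = 200   # delay available in the Nevion decoder
LAWO_BUFFER_MS = 320     # additional delay available in the Lawo V__remote4

def audio_delay(video_latency_ms, audio_latency_ms):
    """Delay to add to audio so it lands in step with the slower video.
    Returns (nevion_ms, lawo_ms); uses the Nevion buffer first."""
    needed = video_latency_ms - audio_latency_ms
    if needed > NEVION_BUFFER_MS + LAWO_BUFFER_MS:
        raise ValueError("not enough buffer to resync audio to video")
    nevion = min(needed, NEVION_BUFFER_MS)
    lawo = needed - nevion
    return nevion, lawo

# Assumed worst case: 450 ms video path vs 50 ms audio path = 400 ms offset.
print(audio_delay(video_latency_ms=450, audio_latency_ms=50))   # (200, 200)
```

With 520 ms of combined buffer against a ~400 ms worst case, the setup Talavera describes still has headroom, but not a great deal of it.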
“We’ve been playing around with RAVENNA and other ways to get audio in via IP but it’s not consistent, whereas Nevion is very consistent,” says Talavera. The L.A. facility produced the 2017 North America LCS summer finals from Boston’s TD Garden arena using RAVENNA, he says. “But that was a dedicated pipe, 10-gig SONET lines.”
In addition to latency issues, working with IP transports also requires the production to carefully manage firewalls and potential IP address conflicts. When two back-to-back shows are leapfrogging, says Talavera, relevant production components must be on a separate address for each show. “We may have a 10.22-whatever network,” he says, referring to the class A block of IP space reserved for private networks worldwide. But the two productions can’t be tied together over the same address, so for the second show the paths need to be assigned to different addresses. “That’s time consuming,” he says, and while Riot’s network team handles the overall IP plan, it’s often up to the audio engineers to handle their own switchovers – until they find an automated solution, anyway.
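Keeping the two leapfrogging shows on non-overlapping subnets is the kind of bookkeeping Python's standard `ipaddress` module handles directly. A sketch using the 10.22.x.x space Talavera mentions; the subnet sizes and show assignments are illustrative assumptions.

```python
import ipaddress

# Carve non-overlapping /24s for each show out of a 10.22.0.0/16 slice of
# the 10.0.0.0/8 private (RFC 1918) block referenced above.
production_block = ipaddress.ip_network("10.22.0.0/16")

subnets = production_block.subnets(new_prefix=24)
show_a = next(subnets)   # 10.22.0.0/24 — first show's production gear
show_b = next(subnets)   # 10.22.1.0/24 — second show's production gear

assert not show_a.overlaps(show_b)   # no address conflict between shows
print(show_a, show_b)                # 10.22.0.0/24 10.22.1.0/24
```

An automated switchover of the kind the audio engineers are hoping for would essentially be this allocation plus pushing the resulting addresses to the devices, which is exactly what makes it time-consuming by hand.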
IT Takes a Different Mindset
Of the 18 or so distribution partners, only the four with the largest audiences can de-embed the multiple stems, isos and other sources they need to build their own shows. Although surround formats are supported by Twitch and YouTube, for the moment events are produced in stereo. Most of the partners can only take a 2-channel mix in any case, says Talavera. “They take it off YouTube. It’s crazy, but they do it and it works, and people watch it” on all sorts of devices, including phones. For anyone coming to esports from a television broadcast culture, it requires a change of mindset, he says. “It’s so not what we’re used to.”
Of course, it may be that Riot’s production workflows are simply not applicable to traditional broadcast. “What we’re doing is a little unique for our industry, but what we do works for us,” says Donovan. “It may not work for the larger broadcasters.”
And while automation and at-home workflows will likely bring about a reduction in production personnel, that’s not necessarily a bad thing, just different, Donovan says, and ultimately should make for a better final product. For those personnel concerned about the impact of esports and the broadcast workflows associated with it, he says, “It’s breaking down the walls of traditional sports. You can have more shows, more games. There are more outlets, more eyeballs. So, if you’re willing to learn and grow, that work gets spread around.”