Streaming Video: Live, But Not Quite

More than 52% of survey participants report latency of three seconds or more in their live streaming broadcasts.

The best attribute of analog broadcasting was its imperceptible latency. Many people used to watch network NFL games on TV while listening to the local radio game broadcast because the sound and video were in sync and they liked their home-team announcer's bias. The goal of streaming media is to eliminate perceivable latency.

Low-latency video delivery has always been the holy grail of streaming. But the need to replicate real-life exchanges on a global scale has never been so crucial as it has in the past two years. It started in early 2020 when everything went virtual. Conferences, shopping centers, casinos, and doctors’ offices became digital places accessed via the internet. From there, many in-person experiences continued to transform into hybrid environments connecting both remote and on-site participants.

Content distributors in every industry were forced to adapt. Manufacturers looked to tele-welding. Commercial real estate firms began leveraging live 360° streaming. And live event producers experimented with every option for increasing engagement via interactivity. For many, low-latency streaming has moved from ‘nice-to-have’ to a key requirement.

The Wowza Media Systems Video Streaming Latency Report 2021 gathered data from more than 200 broadcasters around the world, in industries ranging from media and entertainment to other video streaming sectors, to examine the impact these changes have had on the industry and the state of video streaming latency in 2021. Information and graphics for this story are courtesy of Wowza.

Q: What type of content are you streaming?

Industries from healthcare and funerals to live sports and underwater exploration are digitally transforming at an unprecedented rate by leveraging video technology. Live sports events remain the leading type of broadcast, and the demand for low latency keeps growing. Nobody wants to learn the final score on their Twitter feed while the stream they’re watching lags behind.

Conferences, tradeshows, and conventions have also become major players because the pandemic has forced traditionally in-person events to adopt virtual and hybrid models. Interactive technologies are fundamental to bringing these broadcasts to life, which is why low latency video delivery remains a top priority.

Q: What is the most important user experience (UX) factor for your use?

High-quality streaming is a must, but low-latency delivery runs a close second, and real-time interactivity itself requires low latency. It stands to reason that minimizing delay ranks among the most critical capabilities for many of today’s live broadcasters. High-quality video leads at 41.71%, followed by low end-to-end latency at 32.16%, real-time interactivity at 14.57%, and the agility to scale at 11.56%.

Graph of current latencies shows sub-3-seconds as the most prominent, but more than 52% of survey participants experience delays in excess of 3 seconds.

Q: How little latency do you hope to achieve in the future?

More than 57% hope to achieve sub-one-second delivery. Although real-time delivery sounds like a good goal, Wowza recommends being open to some lag when your use case allows. Why? Because configuring streams for speedy delivery can introduce complexity and costs that aren’t necessary. Most passive broadcasts should be safe in the sub-ten-second range, which also allows for a smoother, more reliable stream.

On the flip side, for content distributors who simply can’t let the seconds pile up, such as those deploying interactive experiences or mission-critical applications, real-time delivery becomes non-negotiable. This is easier said than done, though, as indicated by the disparity between current and desired latencies in the graph below.

The gap between actual and desired latencies in the sub-1 second and above-3 seconds categories illustrates where streaming video technology is headed.

Q: What is your audience size?

The majority of streaming video broadcasters surveyed aim for real-time delivery to audiences of more than 300 viewers. The requirements of low latency and scale have traditionally been at odds with one another, because the only format that ensures sub-one-second streaming, Web Real-Time Communications (WebRTC), was designed for small video chat environments rather than one-to-many broadcasts.

Q: Are you currently using low-latency streaming services? 

Approximately 54% yes, 46% no. Top reasons for “No” were “Don’t need it”, “Still testing”, “Lack of vendor support”, “Too expensive or difficult to scale”, and “Challenges with reliability and stability.”

Q: What problems would real-time streaming solve for you? 

The most common answer was, "Interactivity with online audience."

RTMP is widely supported on the ingest side and RTMP-based workflows are well defined. But that’s not to say it will remain this way.

Q: Which streaming formats are you currently using for ingest?

The Real-Time Messaging Protocol (RTMP) remains the number one format for ingest, with more than 76% of broadcasters indicating they use it. WebRTC comes in second, and its usage is expected to continue growing. For one, it’s the fastest technology of the bunch and can also be used from end to end. Adoption of WebRTC on the delivery side is also ramping up.

Q: Which streaming formats are you currently using for delivery?

Predictably, more than 70% of respondents deliver their live streams using Apple’s HTTP Live Streaming (HLS) protocol. From there, MPEG-DASH and WebRTC are neck and neck. Smooth Streaming and HDS are dying a slow death, with vendor support waning.

Today's most popular streaming delivery format is HLS.

It’s worth pointing out that the last two graphs add up to more than 100%. That’s because multi-protocol delivery and hybrid workflows are becoming the norm. Many respondents indicated using a handful of formats on each end of the streaming workflow, with RTMP in and HLS out popping up most often.

Funny sidebar: When Wowza initially launched this survey, it was attacked by bots. The first indicator that the data coming in was suspect was that Smooth Streaming was leading the pack in delivery formats. Once Wowza scrubbed the data and added bot-blocking measures, the more accurate picture depicted here revealed itself.

Q: Where do you experience the largest delay in your workflow?

Given that HLS came in as the number one delivery format, those who marked player buffer as their most significant source of delay were right on the money. But for the 23% of respondents who weren’t sure where their delay stems from, let’s look at where latency creeps in from capture to playback.

Encoding: Bitrate, resolution, which codec you use, and even segment size impact the speed of video encoding. The higher the bitrate and resolution, the longer encoding will take. It’s a good idea to use a video transcoding solution provider to ensure efficiency at the encoder.

First-Mile Upload: Contribution delays often result from transmitting data over suboptimal networks and synchronizing multiple video sources. Choosing a protocol designed for low-latency content acquisition, such as SRT, can largely avoid this in remote locations. Otherwise, the connection type is key.

Transcoding and Packaging: Traditional streaming protocols such as RTSP and RTMP support low-latency streaming, but they aren’t supported by many players. Many broadcasters choose to transport live streams to their media server using RTMP and then transcode them for multi-device delivery. The process itself injects latency, as do common delivery protocols like HLS.

Last Mile: The farther your viewers are from the edge server, the longer it’ll take to distribute a stream. This part of the workflow is largely outside of your control. The end-users’ proximity to the CDN edge and their network conditions will influence last-mile delivery. That said, some protocols like WebRTC weren’t designed for large-scale delivery via a typical CDN, meaning you’ll need a custom solution like a Real-Time Streaming at Scale feature to keep this step up to speed.

Player Buffer: Many specifications require a certain number of segments to be loaded before playback can begin. This buffer is intended to improve the viewer experience. When real-time delivery is essential, you’ll want to swap out a traditional HTTP-based format for WebRTC.
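To put rough numbers on why the player buffer tends to dominate, here is a minimal back-of-envelope sketch in Python. Every per-stage value is an illustrative assumption, not a figure from the Wowza survey; the point is simply that buffering a few segments swamps the other stages.

```python
# Back-of-envelope latency estimate for a segmented (HLS-style) workflow.
# All numbers are illustrative assumptions, not measurements from the survey.

def estimate_glass_to_glass_latency(
    encode_s: float = 1.0,       # encoder lookahead / GOP buffering
    first_mile_s: float = 0.5,   # contribution upload to the media server
    package_s: float = 0.5,      # transcode and segment packaging
    last_mile_s: float = 0.5,    # CDN edge delivery to the viewer
    segment_duration_s: float = 6.0,
    segments_buffered: int = 3,  # players commonly buffer a few segments before starting
) -> float:
    """Sum the per-stage delays; the player buffer usually dominates."""
    player_buffer_s = segment_duration_s * segments_buffered
    return encode_s + first_mile_s + package_s + last_mile_s + player_buffer_s


if __name__ == "__main__":
    for seg in (6.0, 2.0, 1.0):
        total = estimate_glass_to_glass_latency(segment_duration_s=seg)
        print(f"{seg:.0f}s segments -> roughly {total:.1f}s glass-to-glass")
```

Under these assumed values, even one-second segments leave the estimate above three seconds, which is why the shift toward purpose-built low-latency protocols discussed below matters.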

Streaming broadcasters reveal their future low-latency plans.

Q: How are you currently reducing latency?

As latency requirements become more aggressive, broadcasters are transitioning from the tried-and-true approach of decreasing segment length (42%) to implementing technologies designed with speed in mind (a.k.a. low-latency protocols, as indicated by 46% of respondents).

Low-latency protocols are the only way to get streams into the sub-three-second range, but they require vendor support across the CDN and player.
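For reference, the tried-and-true "decrease segment length" approach mentioned above usually comes down to packager settings like the following. This is a minimal sketch assuming ffmpeg with libx264 is available; the ingest URL, output path, and the specific values (two-second segments, a 24 fps GOP) are placeholders for illustration, not recommendations drawn from the survey.

```python
# Sketch: shortening HLS segments with ffmpeg to trim player-buffer latency.
# The ffmpeg options are standard, but the ingest URL, output path, and the
# chosen values are illustrative placeholders, not survey recommendations.

import subprocess

INPUT_URL = "rtmp://example.com/live/stream"   # hypothetical RTMP ingest
OUTPUT_PLAYLIST = "/var/www/hls/stream.m3u8"   # hypothetical web root

cmd = [
    "ffmpeg",
    "-i", INPUT_URL,
    "-c:v", "libx264",
    "-preset", "veryfast",        # faster encode reduces encoder-side delay
    "-tune", "zerolatency",       # disables lookahead buffering in x264
    "-g", "48",                   # keyframe every 2 s at 24 fps, matching the segment length
    "-c:a", "aac",
    "-f", "hls",
    "-hls_time", "2",             # 2-second segments instead of the once-common 6
    "-hls_list_size", "5",        # keep a short live playlist window
    "-hls_flags", "delete_segments",
    OUTPUT_PLAYLIST,
]

subprocess.run(cmd, check=True)
```

As the earlier estimate suggests, shorter segments help but rarely break the three-second barrier on their own, which is where the technologies in the next question come in.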

Q: Which low-latency technologies do you plan to use in the future?

WebRTC provides real-time interactivity without a plugin but is difficult to scale without a streaming platform. Apple Low-Latency HLS provides sub-three-second streaming with first-party support from Apple, but large-scale deployments are not yet commonplace.

Low-Latency CMAF for DASH is a standard HTTP format that can be leveraged for low latency, but the introduction of Low-Latency HLS has stifled momentum. 

