AWS Completes Streaming Protocol Set With SRT Support

AWS (Amazon Web Services) has joined the SRT Alliance and added native support for the SRT protocol in its AWS Elemental MediaConnect package.

This completes the low latency protocol set for Elemental MediaConnect, which already supported video transport using protocols including Zixi, Reliable Internet Stream Transport (RIST), and Real-time Transport Protocol (RTP) with forward error correction (FEC).

Although SRT was the last of these protocols to be added, AWS indicated that customer demand had built up, and it said demand determines which protocols it supports and which technologies it adopts or builds. The inclusion of SRT in the portfolio also extends to AWS Elemental Live, an on-premises appliance and software-based live video encoder, whose users gain access to the protocol in the latest software release. Elemental Live can now receive streams over SRT, taking as an input a secure, reliable, low-latency video source with protection against packet loss. SRT also suits applications that require security, since video can be protected end-to-end with 128-bit or 256-bit AES encryption built into the protocol.

“SRT has shown that it’s an important transport protocol, and it provides secure and reliable transport of live video to and from the AWS Cloud,” said Dan Gehred, solutions marketing manager for AWS. “With SRT protocol input and output in AWS Elemental MediaConnect, along with input in AWS Elemental Live appliances and software, AWS customers have more options when it comes to building scalable, reliable, and secure live video workflows.”

This comes at a time of growing momentum behind SRT, a month after Sony joined the Alliance, whose other scalps include Microsoft, Alibaba Cloud and Tata Communications among more than 450 members in total.

However, SRT still has a close rival in the RIST (Reliable Internet Stream Transport) protocol proposed by the Video Services Forum, which is similar in capabilities and approach. Both address latency by avoiding the delay the TCP protocol incurs when it retransmits lost IP packets after receiving requests from the receiving end of a network link. Both instead run over UDP (User Datagram Protocol) and layer on top of it a selective retransmission mechanism called Automatic Repeat reQuest (ARQ). If the receiver identifies a gap in a stream resulting from missing IP packets, it requests those to be sent again via a negative acknowledgment (NAK) packet. The advantage is that only the missing packets are retransmitted, and recovery is faster because sender and receiver remain in conversation throughout the transmission.
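The gap-detection step can be sketched in a few lines. This is a toy illustration of NAK-based ARQ, not SRT's or RIST's actual wire format: real implementations carry sequence numbers in protocol-specific packet headers, and the class and field names here are invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class ArqReceiver:
    """Toy receiver illustrating NAK-based ARQ (hypothetical, not SRT's real API)."""
    expected: int = 0                       # next sequence number we expect
    buffer: dict = field(default_factory=dict)
    naks: list = field(default_factory=list)

    def on_packet(self, seq: int, payload: bytes) -> None:
        # A gap between `expected` and `seq` means packets were lost in
        # transit: request only the missing ones via NAKs, rather than
        # stalling the whole stream as TCP retransmission would.
        if seq > self.expected:
            self.naks.extend(range(self.expected, seq))
        self.buffer[seq] = payload
        self.expected = max(self.expected, seq + 1)

rx = ArqReceiver()
rx.on_packet(0, b"a")
rx.on_packet(1, b"b")
rx.on_packet(4, b"e")   # packets 2 and 3 were dropped
print(rx.naks)          # → [2, 3]  (only the missing packets are re-requested)
```

Because the receiver names exactly the packets it is missing, bandwidth is spent only on genuine losses, which is what keeps ARQ cheaper than blanket retransmission.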

SRT is sometimes described as combining the speed of plain UDP with the reliability of TCP. But in practice ARQ does add some latency, and its impact depends on the retransmission or round-trip time between sender and receiver, which is largely a function of distance.

This distance-related delay includes signal latency, the time taken to traverse the physical medium, but the bigger contributor is the time spent buffering IP packets while waiting for possible retransmissions. The farther apart the nodes, the higher the latency: buffers have to be larger to accommodate the added delay in transmission, and so take longer to fill and empty.
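The relationship between distance and buffering can be made concrete with rough arithmetic. The function and the multiplier below are illustrative assumptions, not figures from the SRT or RIST specifications: the receive buffer must hold packets at least long enough for a loss to be detected, a NAK to travel back, and the retransmission to arrive, i.e. on the order of one round-trip time (RTT) per recovery attempt.

```python
def min_receive_buffer_ms(rtt_ms: float, retransmit_attempts: int = 2) -> float:
    """Rough lower bound on receive-buffer delay for ARQ transport.

    Each recovery attempt costs roughly one RTT (NAK out, packet back).
    The default of two attempts is an illustrative assumption.
    """
    return rtt_ms * retransmit_attempts

# Assumed RTT figures for illustration; real paths add router and queueing delay.
for route, rtt in [("London-New York", 75), ("near-antipodal path", 250)]:
    print(f"{route}: RTT ~ {rtt} ms, buffer >= {min_receive_buffer_ms(rtt)} ms")
```

Under these assumptions a 75 ms transatlantic RTT implies roughly 150 ms of buffering, and a 250 ms near-antipodal RTT implies roughly 500 ms, which is consistent with the figures quoted below.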

Between, say, London and New York that total delay would be about 150 milliseconds, rising to as much as 500 ms if the source and destination are almost diametrically opposite on the globe. That may still be acceptable for one-way video streaming, but not for some two-way interactive applications such as video conferencing and gaming. For those, the WebRTC protocol is usually preferred because it involves no packet retransmission at all. Instead, Forward Error Correction (FEC) caters for dropped IP packets up to a certain level by incorporating some redundant information in the stream. This allows some packet recovery while imposing very little additional latency, but there is a bandwidth overhead, and FEC cannot be guaranteed to deliver high definition streams at sufficient quality over the internet.
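The core idea of FEC can be shown with the simplest possible scheme: a single XOR parity packet per group, which can rebuild any one lost packet without a retransmission round trip. This is a minimal sketch for illustration only; production FEC schemes (such as the Reed-Solomon-based SMPTE 2022-1, or the codes WebRTC uses) are considerably more sophisticated.

```python
def xor_parity(packets: list) -> bytes:
    """Compute one XOR parity packet over a group of equal-length media packets."""
    parity = bytearray(len(packets[0]))
    for pkt in packets:
        for i, byte in enumerate(pkt):
            parity[i] ^= byte
    return bytes(parity)

def recover_missing(survivors: list, parity: bytes) -> bytes:
    """Rebuild the single lost packet: XOR of the parity with all survivors."""
    result = bytearray(parity)
    for pkt in survivors:
        for i, byte in enumerate(pkt):
            result[i] ^= byte
    return bytes(result)

group = [b"abcd", b"efgh", b"ijkl"]   # one FEC group of three media packets
parity = xor_parity(group)            # redundant data sent alongside the stream
survivors = [group[0], group[2]]      # packet 1 was dropped in transit
print(recover_missing(survivors, parity))  # → b'efgh'
```

The bandwidth overhead is visible directly: one extra parity packet per group of three is a 33% overhead, yet it only protects against a single loss per group, which is why FEC alone cannot guarantee quality on lossy internet paths.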

SRT was developed by Haivision around 2012 to meet the challenges of low latency video streaming by cutting down on those TCP error correcting delays. Wowza Media Systems later joined the party and together with Haivision launched the SRT Alliance in April 2017, while making the protocol available open source to encourage adoption.
