AWS Completes Streaming Protocol Set With SRT Support

AWS (Amazon Web Services) has joined the SRT Alliance and added native support for the SRT protocol in its AWS Elemental MediaConnect package.

This completes the low-latency protocol set for Elemental MediaConnect, which already supported video transport over protocols including Zixi, Reliable Internet Stream Transport (RIST), and Real-time Transport Protocol (RTP) with forward error correction (FEC).

Although SRT is the last of these protocols to be added, AWS indicated that customer demand for it had built up, and said such demand determines which protocols it supports and which technologies it adopts or builds. The inclusion of SRT in the portfolio also extends to AWS Elemental Live, an on-premises appliance and software-based live video encoder, whose users gain access to the protocol in the latest software release. Elemental Live can now receive streams using SRT, taking as an input a secure, reliable, low-latency video source with protection against packet loss. SRT also suits applications that require security, since the protocol can secure video end-to-end with 128-bit or 256-bit AES encryption.
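For illustration only, the following Python sketch shows how a contribution encoder might push an AES-encrypted SRT stream towards a listener such as a MediaConnect flow, by driving ffmpeg (built with libsrt) from a script. The host, port, passphrase and input file are placeholders, and this is not an AWS tool or API; the equivalent settings on the AWS side are configured in MediaConnect or Elemental Live themselves.

```python
# Minimal sketch of sending an SRT stream with AES encryption enabled, by
# driving ffmpeg (built with libsrt) from Python. Host, port, passphrase
# and input.ts are placeholders; this is not an AWS API, just the sending
# side of an encrypted SRT contribution feed.

import subprocess
from urllib.parse import urlencode

def srt_output_url(host: str, port: int, passphrase: str) -> str:
    """Build an srt:// URL for an encrypted caller-mode output."""
    params = {
        "mode": "caller",          # connect out to a listener (e.g. a MediaConnect flow)
        "passphrase": passphrase,  # enables AES encryption of the stream
        "pbkeylen": 32,            # 32-byte key => AES-256 (16 would give AES-128)
    }
    return f"srt://{host}:{port}?{urlencode(params)}"

url = srt_output_url("198.51.100.10", 5000, "correct-horse-battery-staple")

# Re-wrap an existing transport stream and send it over encrypted SRT.
subprocess.run([
    "ffmpeg", "-re", "-i", "input.ts",
    "-c", "copy", "-f", "mpegts", url,
], check=True)
```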

“SRT has shown that it’s an important transport protocol, and it provides secure and reliable transport of live video to and from the AWS Cloud,” said Dan Gehred, solutions marketing manager for AWS. “With SRT protocol input and output in AWS Elemental MediaConnect, along with input in AWS Elemental Live appliances and software, AWS customers have more options when it comes to building scalable, reliable, and secure live video workflows.”

This comes at a time of growing momentum behind SRT, a month after Sony joined the Alliance, whose more than 450 members also include Microsoft, Alibaba Cloud and Tata Communications.

However, SRT still has a close rival in the RIST (Reliable Internet Stream Transport) protocol proposed by the Video Services Forum, which is similar in capabilities and approach. Both address latency by cutting the delay TCP incurs when it retransmits lost IP packets at the request of the receiving end of a network link. Both run over UDP (User Datagram Protocol) and add a retransmission mechanism called Automatic Repeat reQuest (ARQ): if the receiver identifies a gap in the stream resulting from missing IP packets, it asks for those packets to be sent again via a negative acknowledgment (NAK) packet. The advantage is that only the missing packets are retransmitted, and the process is faster because sender and receiver remain in conversation throughout the transmission.
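The NAK mechanism can be illustrated with a minimal Python sketch. This is not the actual SRT or RIST implementation: packet formats, sequence numbering and the transport are simplified assumptions, and only the gap-detection and selective-retransmission logic is shown.

```python
# Minimal sketch of NAK-based (negative acknowledgment) retransmission,
# illustrating the ARQ idea used by SRT and RIST. NOT the real protocols:
# packet format, sequencing and transport are simplified.

from dataclasses import dataclass, field

@dataclass
class Receiver:
    expected_seq: int = 0                        # next sequence number we expect
    buffer: dict = field(default_factory=dict)   # out-of-order packets held while gaps remain

    def on_packet(self, seq: int, payload: bytes) -> list[int]:
        """Accept a packet and return the sequence numbers still missing (the NAK)."""
        self.buffer[seq] = payload
        missing = [s for s in range(self.expected_seq, seq) if s not in self.buffer]
        # Advance past any contiguous run we now hold.
        while self.expected_seq in self.buffer:
            self.expected_seq += 1
        return missing                           # sender retransmits only these

@dataclass
class Sender:
    history: dict = field(default_factory=dict)  # sent packets kept for possible retransmit

    def send(self, seq: int, payload: bytes) -> tuple[int, bytes]:
        self.history[seq] = payload
        return seq, payload

    def on_nak(self, missing: list[int]) -> list[tuple[int, bytes]]:
        """Retransmit only the packets the receiver reported missing."""
        return [(s, self.history[s]) for s in missing if s in self.history]

# Usage: packet 1 is "lost" in transit, so the receiver NAKs it.
sender, receiver = Sender(), Receiver()
packets = [sender.send(i, f"frame-{i}".encode()) for i in range(4)]
naks = []
for seq, payload in packets:
    if seq == 1:
        continue                                 # simulate loss of packet 1
    naks = receiver.on_packet(seq, payload)
print("NAK for:", naks)                          # -> NAK for: [1]
for seq, payload in sender.on_nak(naks):
    receiver.on_packet(seq, payload)             # gap filled by retransmission only
```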

SRT is sometimes described as combining the speed of raw UDP with the reliability of TCP. But in practice ARQ does add some latency, and the impact depends on the retransmission or round-trip time between sender and receiver, which is largely a function of distance.

This distance-related delay includes signal latency, the time taken to traverse the physical medium, but the bigger contributor is the time spent buffering IP packets while waiting for possible retransmissions. The farther apart the nodes, the higher the latency: buffers have to be larger to accommodate the added transmission delay, and so take longer to fill and empty.
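The figures quoted in the next paragraph can be roughly reproduced with a back-of-the-envelope model. The Python sketch below assumes propagation through fibre at about 200 km per millisecond and a receive buffer of around two round trips; both are illustrative assumptions rather than how SRT actually sizes its buffers.

```python
# Back-of-the-envelope model of why SRT/RIST latency grows with distance.
# Assumptions (for illustration, not from the article): light travels
# through fibre at roughly 200 km per millisecond, and the receive buffer
# is sized at about two round trips so a NAK and the retransmitted packet
# have time to arrive before the gap is played out.

FIBRE_KM_PER_MS = 200.0   # roughly two-thirds of the speed of light in a vacuum
BUFFER_RTT_MULTIPLE = 2   # buffering allowance, in round trips (assumed)

def end_to_end_latency_ms(distance_km: float) -> float:
    """Propagation delay plus retransmission buffering for a given path length."""
    one_way_ms = distance_km / FIBRE_KM_PER_MS   # signal latency
    rtt_ms = 2 * one_way_ms                      # round-trip time
    buffer_ms = BUFFER_RTT_MULTIPLE * rtt_ms     # waiting for possible retransmits
    return one_way_ms + buffer_ms

for route, km in [("London to New York", 5_600), ("near-antipodal path", 20_000)]:
    print(f"{route}: ~{end_to_end_latency_ms(km):.0f} ms")
# London to New York: ~140 ms   (the article quotes about 150 ms)
# near-antipodal path: ~500 ms  (the article quotes up to 500 ms)
```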

Between, say, London and New York, that total delay would be about 150 milliseconds, rising to as much as 500 ms if the source and destination are almost diametrically opposite on the globe. That may still be acceptable for video streaming, but not for some two-way interactive applications such as video conferencing and gaming. For those applications, the WebRTC protocol will usually be preferred because it involves no packet retransmission at all. Instead, Forward Error Correction (FEC) is used to cater for dropped IP packets up to a certain level, by incorporating some redundant information in the stream. This allows some packet recovery while imposing very little additional latency, but there is a bandwidth overhead, and FEC cannot be guaranteed to deliver high-definition streams at sufficient quality over the internet.
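As a simple illustration of the FEC principle, the toy Python sketch below uses a single XOR parity packet per group of data packets, allowing any one lost packet in the group to be rebuilt at the receiver without a retransmission. This is not the FEC scheme actually used in WebRTC or RTP; it only shows the underlying redundancy-for-latency trade-off.

```python
# Toy XOR-parity forward error correction: one parity packet protects a
# group of equal-sized data packets, so any single loss within the group
# can be recovered without asking the sender to retransmit. Real FEC
# schemes are more sophisticated; this only illustrates the idea.

from functools import reduce
from typing import Optional

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def add_parity(group: list[bytes]) -> bytes:
    """Parity packet = XOR of all packets in the group (equal length assumed)."""
    return reduce(xor_bytes, group)

def recover(received: list[Optional[bytes]], parity: bytes) -> list[bytes]:
    """Rebuild at most one missing packet in the group from the parity packet."""
    missing = [i for i, pkt in enumerate(received) if pkt is None]
    if len(missing) > 1:
        raise ValueError("XOR parity can only recover a single loss per group")
    if missing:
        present = [pkt for pkt in received if pkt is not None]
        received[missing[0]] = reduce(xor_bytes, present + [parity])
    return received

# Usage: send 4 packets plus 1 parity packet (25% bandwidth overhead);
# packet 2 is lost in transit but is rebuilt immediately at the receiver.
group = [b"pkt0", b"pkt1", b"pkt2", b"pkt3"]
parity = add_parity(group)
arrived = [group[0], group[1], None, group[3]]
print(recover(arrived, parity))   # -> [b'pkt0', b'pkt1', b'pkt2', b'pkt3']
```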

SRT was developed by Haivision around 2012 to meet the challenges of low-latency video streaming by cutting down on those TCP error-correction delays. Wowza Media Systems later joined the effort and, together with Haivision, launched the SRT Alliance in April 2017, while making the protocol available as open source to encourage adoption.
