ATEME And Broadpeak Launch Machine Learning Encoder

French video technology vendors Broadpeak and ATEME have jointly launched a low-latency live video streaming package supporting the emerging DVB-I standard ecosystem.

Demonstrated at IBC2019, the package combines machine learning, multicast ABR, low-latency CMAF and HTTP chunked transfer encoding (CTE) in a bid to reduce the latency of live stream encoding and delivery. The two vendors, Broadpeak, best known for its Content Delivery Network (CDN) products, and ATEME, which specializes in transcoding, claim the package enables pay TV operators going multiscreen to deliver live streams with one-second end-to-end latency.
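To put that one-second figure in context, the rough budget below is purely illustrative (every number is an assumption, not a vendor measurement). It shows why sub-second CMAF chunks, rather than whole multi-second segments, make such a glass-to-glass target arithmetically plausible:

```python
# Illustrative only: all figures are assumptions, not measurements.
def end_to_end_latency(unit_s, encode_s, network_s, player_buffer_units):
    # Each stage forwards one delivery unit; the player buffers a few more.
    return encode_s + unit_s + network_s + player_buffer_units * unit_s

for label, unit_s, buffered_units in [
    ("6 s segments", 6.0, 3),        # classic segment-based HLS/DASH delivery
    ("0.2 s CMAF chunks", 0.2, 2),   # low-latency CMAF with chunked transfer
]:
    total = end_to_end_latency(unit_s, encode_s=0.3, network_s=0.1,
                               player_buffer_units=buffered_units)
    print(f"{label:>17}: ~{total:.1f} s glass-to-glass")
```

Under these assumed figures the segment-based path lands in the tens of seconds while the chunked path lands near one second, which is the gap the package is aiming to close.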

“Latency is a major issue for our customers. By leveraging our continuous efforts towards innovation, we have implemented patented machine learning-based compression, minimizing any video pipeline buffer,” said Michel Artières, ATEME’s CEO.

The package combines ATEME’s Content Adaptive Encoding tools within its TITAN range with Broadpeak’s BkS350 Origin Server and Packager, BkE200 Transcaster server, and nanoCDN multicast ABR technology. Broadpeak claims its nanoCDN is the only multicast ABR technology on the market today that combines multicast delivery with CMAF and HTTP CTE, allowing continuous, on-the-fly video delivery. This brings stability closer to that of walled-garden IPTV services while keeping buffering to an acceptable level.
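As a minimal sketch of the delivery mechanism described above (not Broadpeak's implementation; the chunk source and payloads are placeholders), the following Python server pushes CMAF-style chunks over HTTP chunked transfer encoding as soon as they are produced, rather than waiting for a complete segment:

```python
# Sketch only: serve a live CMAF segment over HTTP chunked transfer encoding,
# forwarding each sub-second chunk as the encoder emits it. The chunk
# generator below is a hypothetical stand-in for a real live packager.
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

def cmaf_chunks():
    # Placeholder for CMAF (moof+mdat) chunks arriving from a live encoder.
    for i in range(10):
        time.sleep(0.2)                        # segment still being produced
        yield f"moof+mdat chunk {i}".encode()  # placeholder payload

class ChunkedHandler(BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"              # chunked TE requires HTTP/1.1

    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "video/mp4")
        self.send_header("Transfer-Encoding", "chunked")
        self.end_headers()
        for chunk in cmaf_chunks():
            # Chunked framing: hex length, CRLF, payload, CRLF.
            self.wfile.write(f"{len(chunk):X}\r\n".encode() + chunk + b"\r\n")
            self.wfile.flush()
        self.wfile.write(b"0\r\n\r\n")          # zero-length chunk ends the body

if __name__ == "__main__":
    HTTPServer(("", 8080), ChunkedHandler).serve_forever()
```

The point of the technique is that the total segment length never needs to be known in advance, so origin, CDN and player can all forward data as it arrives instead of holding back whole segments.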

TITAN’s Content Adaptive Encoding incorporates machine learning algorithms that adapt to the content and reduce bitrate further than the underlying codec alone, saving up to 50 percent of bandwidth for a given video quality. However, the actual saving depends on content complexity; for scenes that are fast moving or have a lot of color variation, it will be substantially less than 50 percent.
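ATEME's patented model is not public, so the sketch below only mimics the effect described above with hypothetical numbers: a per-scene target bitrate that backs off sharply for easy content and approaches the codec-level bitrate for fast-moving scenes:

```python
# Illustrative only: not ATEME's algorithm. All figures are hypothetical.
def adaptive_target_bitrate(codec_bitrate_kbps: float, complexity: float) -> float:
    """complexity in [0, 1]: 0 = static talking head, 1 = fast sport/confetti."""
    max_saving = 0.50                           # headline figure quoted above
    saving = max_saving * (1.0 - complexity)    # complex scenes save far less
    return codec_bitrate_kbps * (1.0 - saving)

for scene, complexity in [("news studio", 0.1), ("drama", 0.4), ("live football", 0.9)]:
    kbps = adaptive_target_bitrate(8000, complexity)
    print(f"{scene:>14}: {kbps:,.0f} kbps (vs 8,000 kbps fixed)")
```

The shape of the curve is the point: easy scenes sit well below the fixed ladder while demanding scenes stay close to it, which is why the 50 percent figure is a ceiling rather than a typical result.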

Broadpeak and ATEME are both DVB members and have been involved in the DVB-I and multicast ABR standardization process. DVB-I is being developed by the DVB Project in recognition that broadcasters will move beyond hybrid services towards IP-only delivery. The aim is to ensure standalone TV services can be delivered over the internet with the same quality, scalability, reliability and user-friendliness as traditional linear broadcast.
