IRT and EBU collaborate over online subtitling for IBC

As online video comes of age, it must match legacy broadcast services on traditional features such as consistent access to subtitles. This will be a focus of separate demonstrations at IBC 2015 from the EBU (European Broadcasting Union) and the Institut für Rundfunktechnik GmbH (IRT), the research arm of the German broadcasters ARD, ZDF and Deutschlandradio (DLR), along with Austria’s ORF and Swiss public broadcaster SRG SSR.

IRT will be showing how the latest HbbTV 2.0 standard enables smart TVs to access subtitles in a uniform way, independent of the manufacturer. It will demonstrate a prototype service developed by IRT in cooperation with the Innovation Projects team of ARD broadcaster Rundfunk Berlin-Brandenburg (rbb) and with Samsung, combining live streaming over MPEG-DASH with subtitles based on the EBU-TT-D specification. The work was carried out under HBB4ALL, a European co-funded project on media accessibility in a connected TV environment. For the last two years, IRT has been one of the leading contributors to both HbbTV 2.0 and EBU-TT-D, which together aim to unite the broadcast and broadband worlds.
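
The EBU-TT-D payload itself is a small, constrained TTML document. As a rough illustration of what such a document contains, here is a minimal sketch built with Python's standard library; the namespaces follow TTML/EBU-TT conventions, but the region geometry, language and cue values are invented, and the mandatory styling of the real specification is omitted.

```python
import xml.etree.ElementTree as ET

TT  = "http://www.w3.org/ns/ttml"
TTP = "http://www.w3.org/ns/ttml#parameter"
TTS = "http://www.w3.org/ns/ttml#styling"
XML = "http://www.w3.org/XML/1998/namespace"

for prefix, uri in (("tt", TT), ("ttp", TTP), ("tts", TTS)):
    ET.register_namespace(prefix, uri)

def make_ebu_tt_d(cues, lang="de"):
    """Build a minimal EBU-TT-D-style document from (begin, end, text) cues."""
    root = ET.Element(f"{{{TT}}}tt", {
        f"{{{TTP}}}timeBase": "media",          # EBU-TT-D uses the media time base
        f"{{{XML}}}lang": lang,
    })
    head = ET.SubElement(root, f"{{{TT}}}head")
    layout = ET.SubElement(head, f"{{{TT}}}layout")
    ET.SubElement(layout, f"{{{TT}}}region", {
        f"{{{XML}}}id": "bottom",
        f"{{{TTS}}}origin": "10% 80%",          # illustrative region geometry
        f"{{{TTS}}}extent": "80% 15%",
    })
    div = ET.SubElement(ET.SubElement(root, f"{{{TT}}}body"), f"{{{TT}}}div")
    for begin, end, text in cues:
        p = ET.SubElement(div, f"{{{TT}}}p",
                          {"begin": begin, "end": end, "region": "bottom"})
        p.text = text
    return ET.tostring(root, encoding="unicode")

print(make_ebu_tt_d([("00:00:01.000", "00:00:03.500", "Guten Abend.")]))
```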

Subtitles have been part of traditional TV services for decades, but on the Internet live streaming has been held back by the lack of a standardized subtitle solution and by timing issues. In combination with MPEG-DASH, EBU-TT-D addresses this, enabling a common subtitle format to be used for video-on-demand services such as catch-up TV as well as for live streaming.
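
On the distribution side, the subtitle track is announced in the MPEG-DASH manifest (MPD) alongside the audio and video. The snippet below, again using Python's ElementTree, sketches that idea only: the ids, codec strings and single-file ttml+xml representation are placeholders, and a live deployment would typically carry EBU-TT-D inside ISOBMFF segments as profiled by DASH and DVB-DASH.

```python
import xml.etree.ElementTree as ET

MPD_NS = "urn:mpeg:dash:schema:mpd:2011"
ET.register_namespace("", MPD_NS)

def q(tag):
    """Qualified MPD tag name."""
    return f"{{{MPD_NS}}}{tag}"

# A skeletal MPD with one video and one subtitle adaptation set (values illustrative).
mpd = ET.Element(q("MPD"), {"type": "dynamic",
                            "profiles": "urn:mpeg:dash:profile:isoff-live:2011"})
period = ET.SubElement(mpd, q("Period"))

video = ET.SubElement(period, q("AdaptationSet"),
                      {"contentType": "video", "mimeType": "video/mp4"})
ET.SubElement(video, q("Representation"),
              {"id": "v1", "bandwidth": "3000000", "codecs": "avc1.64001f"})

subs = ET.SubElement(period, q("AdaptationSet"),
                     {"contentType": "text", "mimeType": "application/ttml+xml",
                      "lang": "de"})
ET.SubElement(subs, q("Representation"), {"id": "sub-de", "bandwidth": "1000"})

print(ET.tostring(mpd, encoding="unicode"))
```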

Meanwhile the EBU will be giving a series of EBU-TT related presentations at its IBC stand, 10.F.20. EBU-TT, or EBU Timed Text, is the follow-up to the widely used EBU STL format and comprises three specifications. The base specification is EBU-TT Part 1, which defines an easy-to-use XML structure for the interchange and archiving of subtitles. It builds on the W3C Timed Text Markup Language (TTML) 1.0, which was designed for exchanging textual information synchronized to timing signals over the Internet, and was initially developed for transcoding or exchanging timed-text information among the legacy distribution formats already in use for subtitling and captioning of online video.
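
Because EBU-TT Part 1 is plain XML built on TTML, downstream systems can read it with ordinary XML tooling. The fragment below is a simplified sketch that extracts text and timing from the p elements of a document; the file name is hypothetical, and real Part 1 files also carry styling, metadata and optional SMPTE time codes that this reader ignores.

```python
import xml.etree.ElementTree as ET

TT = "http://www.w3.org/ns/ttml"

def list_subtitles(path):
    """Yield (begin, end, text) for every <p> element in an EBU-TT / TTML file."""
    tree = ET.parse(path)
    for p in tree.iter(f"{{{TT}}}p"):
        text = "".join(p.itertext()).strip()
        yield p.get("begin"), p.get("end"), text

# Hypothetical usage:
# for begin, end, text in list_subtitles("episode_de.xml"):
#     print(begin, end, text)
```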

The second EBU-TT specification covers STL mapping, defining how legacy EBU STL files are converted to the new EBU-TT format. As such it is a key requirement for supporting subtitling in hybrid services.
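
To give a flavour of what such a mapping involves, the sketch below reads the fixed-size TTI blocks of a legacy STL file and converts their in and out time codes into the HH:MM:SS.mmm media times an EBU-TT document would carry. It assumes the standard 1024-byte GSI header and 128-byte TTI blocks of EBU Tech 3264; character-set handling, control codes and extension blocks are deliberately ignored, so this is an illustration rather than a converter.

```python
GSI_SIZE, TTI_SIZE = 1024, 128   # GSI header block, then fixed-size TTI blocks

def tc_to_media_time(hh, mm, ss, ff, fps=25):
    """Convert an STL HH:MM:SS:FF time code to an EBU-TT media time string."""
    ms = round(ff * 1000 / fps)
    return f"{hh:02d}:{mm:02d}:{ss:02d}.{ms:03d}"

def read_stl_cues(path, fps=25):
    # fps really comes from the GSI "DFC" field; 25 is assumed here.
    with open(path, "rb") as f:
        f.read(GSI_SIZE)                         # skip the GSI header
        while True:
            block = f.read(TTI_SIZE)
            if len(block) < TTI_SIZE:
                break
            # TTI layout: SGN, SN(2), EBN, CS, TCI(4), TCO(4), VP, JC, CF, TF(112)
            tci, tco = block[5:9], block[9:13]
            text = block[16:].split(b"\x8f")[0]  # 0x8F pads unused text bytes
            yield (tc_to_media_time(*tci, fps=fps),
                   tc_to_media_time(*tco, fps=fps),
                   text.decode("latin-1", errors="replace"))  # simplified decoding
```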

The third specification, EBU-TT Part 3, was published in June 2015 and covers the authoring and contribution of live subtitles. It also introduces the concept of processing 'nodes' that can help improve subtitling quality in production and reduce the effort involved.
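
The specification's sequencing and node semantics go beyond a short example, but the general idea of chainable processing steps can be illustrated. In the hypothetical sketch below, each 'node' is simply a Python generator that receives a stream of live subtitle documents, transforms or inspects them, and passes them on; the delay-compensation and logging nodes are invented for illustration and are not taken from Part 3.

```python
from dataclasses import dataclass, replace
from typing import Iterable, Iterator

@dataclass(frozen=True)
class SubtitleDoc:
    sequence_number: int   # position within the live sequence
    begin_s: float         # active interval, in seconds of media time
    end_s: float
    text: str

def delay_node(docs: Iterable[SubtitleDoc], offset_s: float) -> Iterator[SubtitleDoc]:
    """Shift each document's timing, e.g. to compensate for encoding delay."""
    for doc in docs:
        yield replace(doc, begin_s=doc.begin_s + offset_s, end_s=doc.end_s + offset_s)

def logging_node(docs: Iterable[SubtitleDoc]) -> Iterator[SubtitleDoc]:
    """Pass documents through unchanged while recording what was seen."""
    for doc in docs:
        print(f"#{doc.sequence_number}: {doc.begin_s:.1f}-{doc.end_s:.1f} {doc.text!r}")
        yield doc

# Nodes chain like filters: source -> delay -> logging -> distribution.
live_feed = [SubtitleDoc(1, 10.0, 12.5, "Good evening."),
             SubtitleDoc(2, 12.5, 15.0, "Welcome to the news.")]
for doc in logging_node(delay_node(live_feed, offset_s=2.0)):
    pass
```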
