IRT and EBU collaborate over online subtitling for IBC

As online video comes of age, it must match legacy broadcast services on traditional features such as consistent access to subtitles. This will be the focus of separate demonstrations at IBC 2015 from the EBU (European Broadcasting Union) and the Institut für Rundfunktechnik GmbH (IRT), the research arm of the German broadcasters ARD, ZDF and Deutschlandradio, along with Austria's ORF and Swiss public broadcaster SRG SSR.

IRT will be showing how the latest HbbTV 2.0 standard enables smart TVs to access subtitles in a uniform, manufacturer-independent way. It will demonstrate a prototype service developed by IRT in cooperation with the Innovation Projects unit of ARD broadcaster Rundfunk Berlin-Brandenburg (rbb) and Samsung. The service combines live streaming using MPEG-DASH with subtitles based on the EBU-TT-D specification. The work was carried out under HBB4ALL, an EU co-funded project on media accessibility in a connected TV environment. For the last two years, IRT has been one of the leading contributors to both HbbTV 2.0 and EBU-TT-D, which together aim to unite the broadcast and broadband worlds.

Subtitles have been part of traditional TV services for decades, but on the Internet their use in live streaming has been held back by the lack of a standardized solution and by timing issues. Combined with MPEG-DASH, EBU-TT-D addresses this, providing a common subtitle format that can be used both for video-on-demand services such as catch-up TV and for live streaming.
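In practice, the subtitles are carried as a separate text track alongside the audio and video in the DASH manifest (MPD). As a rough illustration only (the element names follow ISO/IEC 23009-1, but the identifiers, URLs and segment naming here are invented for the example), a subtitle AdaptationSet referencing EBU-TT-D documents might look like this:

```xml
<!-- Hypothetical excerpt from a DASH MPD: a text AdaptationSet
     delivering EBU-TT-D (TTML) subtitle segments alongside A/V. -->
<AdaptationSet contentType="text"
               mimeType="application/ttml+xml"
               lang="de"
               segmentAlignment="true">
  <!-- Marks this track as subtitles for player selection -->
  <Role schemeIdUri="urn:mpeg:dash:role:2011" value="subtitle"/>
  <Representation id="subs_de" bandwidth="1000">
    <!-- Segment names are placeholders for this sketch -->
    <SegmentTemplate media="subs_de_$Number$.ttml"
                     startNumber="1" duration="4" timescale="1"/>
  </Representation>
</AdaptationSet>
```

Because each subtitle segment is timed against the same media timeline as the audio and video segments, the player can keep live subtitles in sync, which is exactly the timing problem earlier ad-hoc approaches struggled with.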

Meanwhile the EBU will be giving a series of EBU-TT related presentations on its booth at IBC, 10.F.20. EBU-TT, or EBU Timed Text, is the follow-up to the widely used EBU STL format and comprises three specifications. The base specification is EBU-TT Part 1, defining an easy-to-use XML structure for the interchange and archiving of subtitles. It builds on the W3C Timed Text Markup Language (TTML) 1.0, which was designed for the exchange of textual information synchronized to timing signals over the Internet, and in particular for transcoding and exchanging timed text among the legacy distribution formats already in use for subtitling and captioning in online video.
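To give a feel for that XML structure, the sketch below shows a minimal EBU-TT-D style document: styles and screen regions are declared in the head, and each timed paragraph in the body references them. This is a simplified illustration, not a conformant production file; the style, region and timing values are invented for the example.

```xml
<!-- Minimal, hypothetical EBU-TT-D style subtitle document -->
<tt xmlns="http://www.w3.org/ns/ttml"
    xmlns:ttp="http://www.w3.org/ns/ttml#parameter"
    xmlns:tts="http://www.w3.org/ns/ttml#styling"
    ttp:timeBase="media" xml:lang="de">
  <head>
    <styling>
      <!-- White text on a black background band -->
      <style xml:id="s1" tts:color="#FFFFFF"
             tts:backgroundColor="#000000"/>
    </styling>
    <layout>
      <!-- Region near the bottom of the screen -->
      <region xml:id="r1" tts:origin="10% 80%" tts:extent="80% 15%"/>
    </layout>
  </head>
  <body>
    <div>
      <!-- One subtitle, shown from 1.0s to 3.5s of media time -->
      <p region="r1" style="s1"
         begin="00:00:01.000" end="00:00:03.500">Guten Abend</p>
    </div>
  </body>
</tt>
```

The same document structure serves both on-demand and live use, which is what allows the format to bridge catch-up TV and live streaming.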

The second EBU-TT specification is STL mapping, designed to map legacy EBU STL files to the new EBU-TT format. As such this is a key requirement for support of subtitling in hybrid services.

The third specification, EBU-TT Part 3, published in June 2015, covers the authoring and contribution of live subtitles. It also introduces the concept of processing 'nodes' that can help improve subtitling quality in production and reduce the effort involved.
