ATSC 3.0 Details Explained, Part 4

“We will control the horizontal. We will control the vertical,” said The Outer Limits ‘Control Voice’ in the opening of the ABC-TV series from 1963-65. Broadcast TV is about to go ‘One Step Beyond’ all that with streams of content and commercials containing a variety of signaling and announcements designed to personalize the television experience for the viewer. Metadata will turn one-to-many broadcasting into one-to-one personalcasting.

Part 4 of this ATSC 3.0 series passes the virtual mic back to Mark Corl, Senior VP of Emergent Technology Development at Triveni. The first webinar was produced by Triveni and was covered in Part 1 of ATSC Details Explained. ATSC Details Explained Part 2 and Part 3 were from a webinar produced by SMPTE, ATSC and NAB. This story covers a third webinar, presented by Triveni on 20 June and focused on ATSC 3.0 Signaling & Announcement. The following is a mixture of quotes and paraphrases of Mark Corl’s words, edited for this story.

ATSC 3.0 is a large standard. It is described in 19 documents, only 4 of which discuss RF. The other 15 documents relate to metadata, also known as signaling. What was learned over the last 20 years is now encompassed by ATSC 3.0, which is designed for future extension. Signaling and Announcement are part of what makes it special.

ATSC 1.0 consists of two core standards. There are approximately 24 standards defined on atsc.org divided among 1.0, Datacasting, DASE, M/H and ATSC 2.0. ATSC 3.0 spans all that functionality and much more.

The bottom four layers shown in the lead graphic of the ATSC 3.0 Protocol Stack were covered in earlier parts of ATSC Details Explained. To review, PHY Layer Signaling contains the Bootstrap, Preamble and Transmitter ID (TxID). ATSC Link-layer Protocol (ALP) signaling hides the various transport types from the PHY; the ATSC 3.0 PHY carries only ALP packets.

Multicast defines a range of IP destination addresses from 224.0.0.0 to 239.255.255.255. ATSC 3.0 defines specific addresses and ports for various purposes. ATSC 3.0 Elementary Streams are identified by destination address and port instead of PID.
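To make the PID-versus-IP distinction concrete, here is a minimal sketch (Python, standard library only) of a receiver joining a single multicast stream. The address and port are hypothetical placeholders for illustration, not values taken from the standard.

```python
import socket
import struct

MCAST_GROUP = "239.255.10.2"   # hypothetical destination address
MCAST_PORT = 5000              # hypothetical destination port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", MCAST_PORT))

# Join the group: the stream is selected by (destination address, port),
# the ATSC 3.0 analogue of selecting a PID in an MPEG-2 transport stream.
mreq = struct.pack("4sl", socket.inet_aton(MCAST_GROUP), socket.INADDR_ANY)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

packet, addr = sock.recvfrom(65535)
print(f"received {len(packet)} bytes from {addr}")
```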

Part 4 starts with ROUTE and works its way up the protocol stack. ROUTE is Real-time Object delivery over Unidirectional Transport. It allows for a source stream and a repair stream in the ROUTE environment. Typically, the FEC repair stream is used for the lower-level segments, but it could be used in higher-level ROUTE streams. If ROUTE is chosen, all data streams are sent over it, including essence streams, NRT, ESG and others.

Figure 2. Different transports are best suited for different functions but not necessarily tethered to those functions.


MPEG Media Transport Protocol (MMTP) is an MPEG standard. It consists of Media Processing Units (MPUs) that wrap ISO Base Media File Format (BMFF) files with metadata for broadcast delivery. It carries only essence streams (audio, video and captioning); all other data, such as Non-Real-Time (NRT) content and other signaling, must be sent with ROUTE. HTTP or HTTPS, the familiar Hypertext Transfer Protocol, is used to pull data over broadband using TCP/IP.

ISO BMFF is carried either in MMTP as an MPU or in a DASH segment. It is a general file structure for time-based media, similar to MP4; in fact, MP4 uses the same BMFF file format. BMFF files break the essence into manageable segments plus presentation metadata. Each file is essentially a collection of object-oriented boxes, each of which has a type and a length. Boxes may be, and typically are, nested.
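As an illustration of that size/type box structure, the following sketch walks the top-level boxes of a BMFF file. The file name is hypothetical, and only the common header variants are handled, not every corner of the specification.

```python
import struct

def iter_boxes(data, offset=0, end=None):
    """Yield (type, payload_start, box_end) for each box at this level."""
    end = len(data) if end is None else end
    while offset + 8 <= end:
        # Each box begins with a 4-byte big-endian size and a 4-byte type.
        size, box_type = struct.unpack_from(">I4s", data, offset)
        header = 8
        if size == 1:   # a 64-bit "largesize" follows the type field
            size = struct.unpack_from(">Q", data, offset + 8)[0]
            header = 16
        elif size == 0:  # box extends to the end of the enclosing scope
            size = end - offset
        yield box_type.decode("ascii", "replace"), offset + header, offset + size
        offset += size

with open("segment.mp4", "rb") as f:  # hypothetical DASH media segment
    data = f.read()
for box_type, body_start, box_end in iter_boxes(data):
    print(box_type, box_end - body_start, "byte payload")
```

Because boxes nest, the same function can be called again on a box payload (for example, the children of a 'moov' box) to descend a level.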

The Media Source Extensions (MSE) and Encrypted Media Extensions (EME) specify definitions for Initialization Segments, Media Segments and Random Access Points within the boxes in the file format. This is how video, audio, captioning and most of the essence are carried.

The next level up the stack is NRT File Delivery. NRT is pushed via ROUTE over broadcast, and/or pulled via HTTP over broadband. Any content can be delivered. NRT is signaled like any other service and referenced via a Uniform Resource Identifier (URI) from other metadata. The Announcement (or Promotion) standards are still under discussion, but will likely be similar to what is now done live.

Figure 3. Other functions of ATSC 3.0 Metadata.


Watermarking

Audio and video watermarking is embedded in the audio and/or video itself and is intended to give cable and satellite viewers the same experience as over-the-air viewers. It only works when an HDMI connection is available. Among other things, it provides an inaudible “pilot” that tells a set-top box to establish an internet connection with the station’s server for frame-accurate supplemental data.

The recovery metadata is all the signaling and other information that the other stacks would normally have carried if the broadcast had been received over-the-air. On cable and satellite, that data is pulled in via broadband and presented to the viewer as if it were live.

App Signaling and Triggers describes, among other things, how a companion device such as an iPad or phone could connect to an ATSC 3.0 television.

Personalization describes information that can be collected about the user. Personalization is an opt-in system, and the standard describes how that data is used up through the system and perhaps sent back to the broadcaster. Personalization is what makes the ATSC 3.0 experience feel more like the internet.

Service Usage Reporting tells what services have been watched, how long they were watched, and a variety of other information about the receiver and what it is doing during the broadcast.

Figure 4. Comparing roles between PSIP and ATSC 3.0 S&A.


ATSC 1.0 PSIP has Packet Identifiers (PIDs) described by the Virtual Channel Table (VCT) with the Service Location Descriptor (SLD); this is called Service Discovery. In the ATSC 3.0 standard, services are identified by IP addresses, no longer PIDs, and described by the Service List Table (SLT).
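As a rough illustration of SLT-based service discovery, the sketch below parses a simplified SLT-style XML table. The element and attribute names follow the general shape of the A/331 SLT, but the snippet is illustrative (namespaces and many attributes are omitted) rather than authoritative.

```python
import xml.etree.ElementTree as ET

# Simplified, illustrative SLT: real tables carry more attributes
# and an XML namespace.
SLT_XML = """
<SLT bsid="50">
  <Service serviceId="1001" shortServiceName="KXYZ">
    <BroadcastSvcSignaling slsProtocol="1"
        slsDestinationIpAddress="239.255.10.2"
        slsDestinationUdpPort="5000"/>
  </Service>
</SLT>
"""

root = ET.fromstring(SLT_XML)
for svc in root.findall("Service"):
    sig = svc.find("BroadcastSvcSignaling")
    # Service discovery resolves to an IP address and port,
    # where PSIP would have pointed at a PID.
    print(svc.get("serviceId"), svc.get("shortServiceName"),
          sig.get("slsDestinationIpAddress"),
          sig.get("slsDestinationUdpPort"))
```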

The next layer up is Service Description. In PSIP, the Extended Text Table (ETT) provides the ability to carry extended channel descriptions. ATSC 3.0 has service descriptions and also allows service icons to be defined, allowing stations to promote the service.

In PSIP, titles and descriptions of content and program promotion are limited amounts of text. ATSC 3.0 carries program title and description text as well, and adds a mechanism to include program icons, preview clips and program genre. That allows the Electronic Program Guide (EPG) to be much more interesting and provides many new opportunities for promotion.

Figure 5. Bootstrapping in ROUTE is based on a pointer in the SLT that points to the ROUTE SLS. The SLS points to all the Service Components, including NRT, all carried over ROUTE.


Bootstrapping

In MPEG Media Transport (MMT), all of the MPUs are essence only. The MMTP pointer points to the MMT Signaling Components, which in turn point to the Streaming Service Components in the MPUs and to the NRT Service Components.

As shown at the left side of Figure 5, the SLT is where to find the SLS signaling. The SLT lists two services, Service 1 and Service 2. Service 1 points to Physical Layer Pipe (PLP) 1; the pointer gives the source IP address (sIP), destination IP address (dIP) and destination port number (dPort).

The PLP 1 service signaling data structures essentially describe all the bits, and where to find the video and audio segments. The Media Presentation Description (MPD), shown at the upper right of Figure 5, comes from the MPEG DASH standard and tells the receiver how to get that information so it can start decoding. The MPD is an XML file, similar to a playlist, that points to the various components to be played back by the local receiver.
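To show the playlist character of the MPD, here is a minimal sketch that lists the Representations in a toy MPD. The snippet is illustrative, not captured from a broadcast, though the XML namespace is the standard DASH one.

```python
import xml.etree.ElementTree as ET

MPD_XML = """
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="dynamic">
  <Period id="1">
    <AdaptationSet contentType="video">
      <Representation id="v1" bandwidth="5000000" codecs="hev1.1.6.L120"/>
    </AdaptationSet>
    <AdaptationSet contentType="audio">
      <Representation id="a1" bandwidth="128000" codecs="ac-4"/>
    </AdaptationSet>
  </Period>
</MPD>
"""

NS = {"dash": "urn:mpeg:dash:schema:mpd:2011"}
root = ET.fromstring(MPD_XML)
for period in root.findall("dash:Period", NS):
    for aset in period.findall("dash:AdaptationSet", NS):
        for rep in aset.findall("dash:Representation", NS):
            # Each Representation is one playable component the
            # receiver can fetch and decode.
            print(aset.get("contentType"), rep.get("id"),
                  rep.get("bandwidth"), rep.get("codecs"))
```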

Similarly, in MMT, the SLS in Service 2 (Figure 5) points to the User Service Bundle Description (USBD)/User Service Description (USD), which points to the MPU components so playback of video and audio can start. The USBD/USD is the ATSC 3.0 version of the 3GPP Multimedia Broadcast/Multicast Service (MBMS) fragment. For MMT, it includes only the component location information.

The Layered Coding Transport (LCT) channels in PLP 1, for example, carry segmented pieces of the audio, video and service signaling.

The SLT also carries the Broadcast Stream ID (BSID), similar to a transport stream ID. The Service-based Transport Session Instance Description (S-TSID) describes the ROUTE sessions: the SID, PLP and IP addresses. From these the receiver pulls the TSID, which is part of ROUTE, and the LCT channel.

Figure 6. Announcement consists of OMA BCAST, which is similar to the announcement done in the Mobile/Handheld (ATSC-M/H) standard.


Announcements are basically XML fragments with some extensions for ATSC 3.0. They are carried in Service Guide Delivery Units (SGDUs), each a binary header wrapped around individual Service Guide XML fragments. The Service Guide Delivery Descriptor (SGDD) is an XML document that contains an exhaustive list of the SGDUs in the service guide. Think of the Delivery Descriptor as a pointer or playlist to all of the other components and information in the rest of the service guide.

For broadcast delivery, the SGDDs are delivered on a single ROUTE LCT channel referred to as the Service Guide Announcement Channel so the receiver always knows how to find it. The SGDUs may be delivered on one or more LCT channels. They are essentially just files being delivered and the SGDD tells the system where to find them. Generally they will all be delivered in one place, but they could be delivered anywhere, depending on the sources and where they are coming from.

In broadband delivery the SGDDs and SGDUs are delivered via HTTP. The SGDUs are referenced by the SGDD. Icons and preview data are referenced by URLs and may be delivered via NRT (ROUTE) or broadband.
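Here is a hypothetical data-structure sketch of the SGDD-to-SGDU relationship just described: the descriptor acts as a playlist that tells the receiver which delivery units exist and where to fetch them, whether over a ROUTE LCT channel or broadband. The class names, URIs and fragment labels are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class SGDU:
    uri: str              # where the unit is delivered (LCT channel or URL)
    fragments: list[str]  # the XML fragments wrapped by the binary header

@dataclass
class SGDD:
    units: list[SGDU] = field(default_factory=list)

    def all_fragments(self):
        # Walk the guide the way a receiver would: descriptor first,
        # then every delivery unit it points to.
        for unit in self.units:
            yield from unit.fragments

guide = SGDD(units=[
    SGDU("route://plp1/lct2/sg_0.sgdu", ["Service", "Schedule"]),
    SGDU("https://example.com/sg_1.sgdu", ["Content", "Content"]),
])
print(list(guide.all_fragments()))
```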

Figure 7. The SGDD is general guide information listing all the SGDUs.


Each SGDU contains a Schedule Fragment, which provides a presentation window and a set of reference IDs to Content Fragments. Each Content Fragment has preview URLs, icon URLs and the components that will be used during the presentation times. It also references the Service Fragment, which describes any icons, audio language, genre and other information about the services.
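As a sketch of how a receiver might join a Schedule Fragment to the Content Fragments it references when building the guide, the following uses hypothetical dataclasses; the field names, reference IDs and NRT-style URLs are invented for illustration.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class ContentFragment:
    content_id: str
    title: str
    icon_url: str
    preview_url: str

@dataclass
class ScheduleEntry:
    content_ref: str   # reference ID pointing into the content fragments
    start: datetime    # start of the presentation window
    duration: timedelta

contents = {
    "c1": ContentFragment("c1", "Evening News", "nrt://icons/news.png",
                          "nrt://previews/news.mp4"),
}
schedule = [ScheduleEntry("c1", datetime(2017, 6, 20, 18, 0),
                          timedelta(minutes=30))]

# Resolve each schedule entry to its content fragment, as an EPG would.
for entry in schedule:
    item = contents[entry.content_ref]
    print(item.title, entry.start.isoformat(), entry.duration,
          item.icon_url, item.preview_url)
```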

Questions?

Why is ATSC 3.0 IPv4 and not IPv6? Answer: Supporting IPv4 and IPv6 simultaneously would have been significantly more work, and the desire to get ATSC 3.0 approved and to market kept the focus on IPv4. The fact that the system is generally self-contained also diminished the need for IPv6, but it will likely appear in ATSC 3.1 or 3.2.

This concludes the ATSC 3.0 Details Explained series but is in no way the final word. Many of the details and nuances of ATSC 3.0 are still being debated. Candidate standards deadlines have been extended from 31 July into the fall.

All graphics courtesy of Triveni Digital.
