Transmission Encoding & MUX Global Viewpoint – June 2016

ATSC 3.0 Mysteries Explained, Part 1

The proposed ATSC 3.0 TV transmission standard will change television much as POTS telephones were changed by wireless and the internet. Once it is adopted, current practices in every department of every TV station, and viewers' watching habits, will become as outdated as videotape.

As ATSC 3.0 matures, more of it is being formally defined. Activity is ramping up and word of the new standards is spreading. Last week, two ATSC 3.0 webinars presented a great deal of information to participants, one produced by Triveni, the other by SMPTE. Part 1 of this story highlights the information presented by Triveni on 14 June. Part 2 will cover the latest ATSC 3.0 information presented in the 16 June SMPTE/ATSC webinar.

Next gen TV skips a generation

On one hand, more people are watching more content on the smallest screens in the history of TV. On the other hand, some pundits are saying 4K is not enough. ATSC 3.0 can handle UHD and 4K, but that’s not the point of ATSC 3.0.

A new ATSC 3.0 receiver will receive and display ATSC 3.0 pictures, but there’s more to 3.0 than traditional linear content. In many ways, ATSC 3.0 is designed to mimic and aggregate the internet through a box called the home gateway. The LG Home Gateway, recently on display at the 2016 NAB ATSC Pavilion, is a good example.

The LG Home Gateway receives ATSC 3.0 signals and redistributes via Wi-Fi.

Much like a PC, a Home Gateway does memory caching and reporting. That means forwarding some content, such as ads and logos, to local storage, and backhauling viewer reporting data via the internet. It's the kind of data Arthur C. Nielsen presumably dreamed of.
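To make that caching-and-reporting role concrete, here is a minimal TypeScript sketch of a hypothetical gateway that stores pushed assets locally and queues viewing reports for backhaul. Every type and method name below is invented for illustration; none of it comes from an ATSC or LG specification.

```typescript
// Minimal sketch of a home-gateway cache-and-report loop.
// All types and names are hypothetical, not from any ATSC 3.0 document.

interface CachedAsset {
  url: string;        // URL the asset will later be referenced by
  bytes: Uint8Array;  // payload pushed over the broadcast (NRT delivery)
  expiresAt: number;  // epoch ms after which the asset may be evicted
}

interface UsageReport {
  channel: string;    // service the viewer was watching
  startedAt: number;  // when viewing began (epoch ms)
  durationSec: number;
}

class HomeGateway {
  private cache = new Map<string, CachedAsset>();
  private pendingReports: UsageReport[] = [];

  // Called when an NRT file (ad, logo, app package) arrives off-air.
  storeAsset(asset: CachedAsset): void {
    this.cache.set(asset.url, asset);
  }

  // Devices on the home network ask the gateway for content by URL;
  // the gateway serves from cache when it can, otherwise returns nothing
  // and the device falls back to the internet.
  fetchAsset(url: string): Uint8Array | undefined {
    const hit = this.cache.get(url);
    return hit && hit.expiresAt > Date.now() ? hit.bytes : undefined;
  }

  // Viewing data is queued locally and backhauled in batches.
  recordUsage(report: UsageReport): void {
    this.pendingReports.push(report);
  }

  flushReports(send: (batch: UsageReport[]) => void): void {
    if (this.pendingReports.length > 0) {
      send(this.pendingReports.splice(0));
    }
  }
}
```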

The 14 June Triveni webinar, “ATSC 3.0 Beyond RF: Now What?” was presented by Mark Corl, the company’s senior vice president of emergent technology development. ATSC 3.0 will be transmitted using OFDM modulation, but that was not the topic of the webcast. “Now What” indicates the perspective of “Now that we’re on the air with it, what are we going to do with it?” The following paraphrases Mark Corl's comments.

What ATSC 3.0 is

The 3.0 standard carries a significant amount of information and capabilities not found in the 1.0 standard. ATSC 3.0 is a large, layered system. It is defined in roughly 19 standard documents, 15 of which are above the RF layer. Those 15 documents talk about capabilities and functions that give broadcasters a significant number of possibilities, some possibly beyond the scope of what most people will want to do today.

Extensibility is a core requirement of all of the standards being produced for 3.0. The goal is to make it future-proof. One of the key aspects of 3.0 is that it leaps the industry's proprietary technology barriers found in 1.0, such as MPEG-2 Transport, ASI, and SMPTE 310M. 3.0 leverages web commodity technologies such as IP, HTML5, JavaScript, and off-the-shelf IP gear such as routers and switches. Only the Physical and Data Link layers are restricted to the broadcast domain, and even those layers are programmable. The 3.0 standard allows the use of commodity equipment that couldn't previously be used for broadcast work without conversions between proprietary technologies and IP.

1.0/2.0 protocol stacks

The ATSC 2.0 standards addressed interactivity, but they didn’t get much traction because 3.0 superseded them so rapidly. The broadcast stack at the top of Figure 1 shows the layers in 1.0/2.0. PSI (Program Specific Information) comprises the PAT (Program Association Table) and PMT (Program Map Table). 2.0 added everything shown in blue at the top of Figure 1.
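For readers who have not worked with PSI directly, this simplified sketch shows the relationships those tables carry: the PAT maps program numbers to PMT PIDs, and each PMT lists the elementary stream PIDs for one program. The real tables are binary MPEG-2 sections with version numbers and CRCs; the interfaces below are only an illustration.

```typescript
// Simplified view of MPEG-2 PSI as used in ATSC 1.0/2.0.
// Real tables are binary sections; these interfaces just show the relationships.

interface ProgramAssociationTable {
  transportStreamId: number;
  // program_number -> PID of that program's PMT
  programs: Map<number, number>;
}

interface ElementaryStream {
  pid: number;
  streamType: number; // e.g. 0x02 = MPEG-2 video, 0x81 = AC-3 audio
}

interface ProgramMapTable {
  programNumber: number;
  pcrPid: number;              // PID carrying the program clock reference
  streams: ElementaryStream[]; // audio, video and data PIDs for the program
}

// A receiver tunes by reading the PAT, finding the PMT PID for the chosen
// program, then reading that PMT to learn which PIDs to decode.
function pidsForProgram(pat: ProgramAssociationTable,
                        pmts: Map<number, ProgramMapTable>,
                        programNumber: number): number[] {
  const pmtPid = pat.programs.get(programNumber);
  if (pmtPid === undefined) return [];
  const pmt = pmts.get(pmtPid);
  return pmt ? pmt.streams.map(s => s.pid) : [];
}
```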

Figure 1. The IP layer is outlined in red because it is growing and shrinking. Image Triveni

The ATSC 3.0 stack at the bottom of Figure 1 is obviously more complex. On the Broadcast side (left), the ATSC Link-Layer Protocol (ALP) wraps all the content above the ALP layer in a well-known standard envelope, allowing that content to be carried and treated independently, without the lower layers needing to understand the layers above.
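The value of that wrapping is easiest to see in code. The sketch below is a generic link-layer encapsulation in the spirit of ALP, not the actual ALP bit syntax: a short header records only a payload type and length, so everything below that layer can carry the packet without understanding IP, signaling, or anything else inside it.

```typescript
// Generic illustration of link-layer encapsulation in the spirit of ALP.
// The real ALP header is a compact bit-level syntax defined by ATSC; this
// sketch only shows the idea: type + length, payload treated as opaque bytes.

enum PayloadType {
  IPv4 = 0,
  Signaling = 1,
  Extension = 2,
}

interface LinkLayerPacket {
  type: PayloadType;
  payload: Uint8Array;
}

function encapsulate(type: PayloadType, payload: Uint8Array): Uint8Array {
  const header = new Uint8Array(3);
  header[0] = type;                         // what kind of payload follows
  header[1] = (payload.length >> 8) & 0xff; // 16-bit length, big-endian
  header[2] = payload.length & 0xff;
  const packet = new Uint8Array(header.length + payload.length);
  packet.set(header, 0);
  packet.set(payload, header.length);
  return packet;
}

function decapsulate(packet: Uint8Array): LinkLayerPacket {
  const type = packet[0] as PayloadType;
  const length = (packet[1] << 8) | packet[2];
  return { type, payload: packet.slice(3, 3 + length) };
}
```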

Figure 1 bottom right shows the equivalent layers on the Broadband side of 3.0. It shows the familiar HTTP stack with TCP/IP and UDP, with most of the content transmitted in UDP. This layer also carries the low-level signaling, a set of tables wrapped in a binary shell.

MMTP (MPEG Media Transport Protocol) and ROUTE (Real-Time Object Delivery over Unidirectional Transport) are the two transport protocols. ROUTE is a carrier for IP broadcast data. Everything delivered to this level is pushed, then turned around through an HTTP proxy mechanism so it is ready to be read. The files are MPEG-DASH (Dynamic Adaptive Streaming over HTTP) files in the ISO Base Media File Format (MP4).
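That "push it, then turn it around through an HTTP proxy" idea can be pictured as a small object store keyed by URL: ROUTE delivers files one way over the air, the receiver files them away, and the DASH player then requests them exactly as it would from a web server. The code below is a hypothetical illustration of the proxy behavior, not an implementation of ROUTE or MMTP.

```typescript
// Hypothetical sketch of the receiver-side HTTP proxy idea: objects arrive
// one-way over ROUTE, are stored by URL, and are then served to a DASH
// player exactly as if they had come from an ordinary web server.

type Fetch = (url: string) => Promise<Uint8Array>;

class BroadcastHttpProxy {
  private objects = new Map<string, Uint8Array>();

  // Called by the ROUTE receiver each time a complete object (an MPD,
  // an initialization segment, a media segment) has been reassembled.
  onObjectDelivered(url: string, data: Uint8Array): void {
    this.objects.set(url, data);
  }

  // Called by the DASH player. Broadcast-delivered objects are served from
  // local storage; anything else falls through to broadband, if available.
  async handleRequest(url: string, broadband?: Fetch): Promise<Uint8Array> {
    const local = this.objects.get(url);
    if (local) return local;
    if (broadband) return broadband(url);
    throw new Error(`Object not yet delivered over broadcast: ${url}`);
  }
}
```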

Above the HTTP Proxy are the player/decoders: MMTP content is played by an MPU (Media Processing Unit) player/decoder, while ROUTE content is played by a DASH player/decoder. At this layer there are also a variety of signaling mechanisms along with Announcement and NRT (Non-Real-Time) file delivery. This layer is where the Broadcast and Broadband stacks merge to form the hybrid that runs the Applications.

Shifting focus

3.0 shifts the focus of broadcast technology from wires and components to IP-based software and services. Adding new functions has moved from changing boxes and wires to checking boxes for new licenses, features and capabilities. Dynamic reconfiguration of how content is delivered can be accomplished with no changes to the infrastructure.

The problem with such digital complexity is that management tasks grow exponentially as more functions and paths requiring integrated control are added. Devices at each end of the chain need to know what those at the other end are doing.

Modules can be applied to deploy different business missions simultaneously in vertical markets, and be adjusted as desired. Centralized configuration and control will be essential for efficiency.

Figure 2. Comparing the simplicity of ATSC 1.0 with internet workflow with the complexity of ATSC 3.0. Image Triveni

Figure 2 shows another contrast between 1.0 and 3.0. The top part shows the familiar flow of 1.0. When the internet was added, variable IP costs and variable IP bandwidth were usually managed as a separate entity. Distribution by CDN requires several formats and resolutions, with AV encoders for each, but it gave TV stations the ability to talk to tablets and phones.

ATSC 3.0 (Figure 2 bottom) includes the fixed cost of the transmitter. By changing to HEVC encoders and Dolby AC-4 audio codecs, stations can bring in UHD and HD and begin repurposing and working with the content cache and transmitted data.

At the same time, stations can begin dynamic provisioning, with dynamic control of the forward error correction of each of the four physical layer pipes (PLPs) carried in their ATSC 3.0 broadcast signal. The Signaling Only pipe has the maximum RF reach, operating at a near-zero or even negative S/N ratio. It could be used, for example, to transmit SD channels to locations with the worst RF reception in a station's market.
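A provisioning system might model each pipe roughly as shown below, trading capacity against robustness, with the most robust pipe reaching deepest into the market. The modulation, code-rate and S/N numbers are illustrative placeholders, not figures from the physical layer standard.

```typescript
// Illustrative model of dynamic PLP provisioning. The specific modulation,
// code-rate and SNR numbers are placeholders, not values from the standard.

interface PlpConfig {
  id: number;              // 0..3 in the initial four-pipe profile
  purpose: string;         // e.g. "signaling", "robust SD", "HD", "UHD"
  modulation: "QPSK" | "16QAM" | "64QAM" | "256QAM";
  codeRate: string;        // LDPC code rate, e.g. "2/15" ... "13/15"
  targetSnrDb: number;     // approximate threshold the mode is assumed to need
}

// More robust pipes (low-order modulation, low code rate) reach receivers
// at lower S/N, at the cost of throughput. A provisioning controller can
// reassign services between pipes without touching the RF hardware.
const examplePipes: PlpConfig[] = [
  { id: 0, purpose: "signaling", modulation: "QPSK",   codeRate: "2/15",  targetSnrDb: -5 },
  { id: 1, purpose: "robust SD", modulation: "QPSK",   codeRate: "5/15",  targetSnrDb: 0 },
  { id: 2, purpose: "HD",        modulation: "64QAM",  codeRate: "10/15", targetSnrDb: 15 },
  { id: 3, purpose: "UHD",       modulation: "256QAM", codeRate: "12/15", targetSnrDb: 22 },
];

// Pick the most capable pipe a given receiving condition can still decode.
function pipeForSnr(pipes: PlpConfig[], receiverSnrDb: number): PlpConfig | undefined {
  return pipes
    .filter(p => p.targetSnrDb <= receiverSnrDb)
    .sort((a, b) => b.targetSnrDb - a.targetSnrDb)[0];
}
```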

Up to 64 PLPs are possible, but receiver manufacturers have asked ATSC to initially use only four. Dynamic provisioning allows broadcasters to determine the penetration for a given set of data at any given time. It also allows pre-positioning of content to receivers with content caches, or to a Gateway with a receiver chip that stores content to be accessed by every device on its network. Think of it as a home CDN.

The 3.0 standard also allows broadcasters to offer interactive applications to receivers and devices. Broadcasters can create an HTML5/JavaScript component (think of it as a web page with lots of content under it) and use it to control the display, access the cloud, and merge the two together. This requires NRT data delivery and app packaging to work together.
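In practice such a component is simply web code running in the receiver's application environment. The hypothetical sketch below gives the flavor: an overlay drawn over the broadcast video, populated partly from NRT files delivered off-air and partly from the broadcaster's cloud. The URLs and function names are invented and do not correspond to the actual ATSC application APIs.

```typescript
// Flavor of a broadcaster application: an HTML overlay on top of the
// broadcast video, fed by both NRT-delivered files and cloud data.
// All names and URLs here are hypothetical.

async function startBroadcasterApp(): Promise<void> {
  // Content pre-positioned over the air is exposed to the app by URL,
  // served through the receiver's local HTTP proxy (see earlier sketch).
  const localMenu = await fetch("http://localhost:8080/nrt/menu.json")
    .then(r => r.json())
    .catch(() => ({ items: [] }));   // tolerate the file not having arrived yet

  // Broadband enrichment, if the receiver is connected.
  const cloudExtras = await fetch("https://example-broadcaster.test/extras.json")
    .then(r => r.json())
    .catch(() => null);

  // Merge the two sources and draw an overlay above the video plane.
  const overlay = document.createElement("div");
  overlay.style.cssText = "position:absolute;bottom:0;width:100%;color:white;";
  overlay.textContent = [
    ...localMenu.items,
    ...(cloudExtras ? cloudExtras.items : []),
  ].join("  |  ");
  document.body.appendChild(overlay);
}

startBroadcasterApp();
```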

The Watermark Sync tells a receiver in an MVPD environment where the information is and how to get it. Watermarks are embedded in both the audio and video, and the watermark information tells the receiver where to pull the replacement data off the internet.
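Conceptually, the watermark carries just enough data to rebuild a URL and a position in the content, so a receiver that only sees decoded audio and video over HDMI can still recover the full signaling from the internet. The sketch below illustrates that recovery step with invented field names and URLs; the real watermark payloads are defined in separate ATSC documents.

```typescript
// Illustration of watermark-driven recovery for receivers fed only by HDMI.
// Field names and the URL scheme are invented for this sketch.

interface WatermarkPayload {
  serverCode: number;   // identifies which recovery server to contact
  intervalCode: number; // identifies where we are in the content timeline
}

// Map the compact watermark payload to a recovery request on the internet.
function recoveryUrl(wm: WatermarkPayload): string {
  // A real deployment resolves serverCode through a registry; this sketch
  // assumes a hypothetical lookup table.
  const servers: Record<number, string> = {
    1: "https://recovery.example-broadcaster.test",
  };
  const base = servers[wm.serverCode];
  if (!base) throw new Error(`Unknown watermark server code ${wm.serverCode}`);
  return `${base}/signaling?interval=${wm.intervalCode}`;
}

// The receiver then fetches the same signaling and interactive content it
// would otherwise have taken directly from the broadcast.
async function fetchReplacementSignaling(wm: WatermarkPayload): Promise<unknown> {
  const response = await fetch(recoveryUrl(wm));
  return response.json();
}
```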

ATSC 3.0 also describes usage metrics, which begin when a viewer locks onto a channel. Receivers report back to the broadcaster what viewers have seen, how long they stayed on the channel, and which activities on the channel were used and how. This feedback loop helps recognize and redirect information through provisioning.
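A usage report might look something like the structure below: which service was watched, for how long, and which interactive features were touched. The field names are illustrative assumptions, not the normative reporting schema.

```typescript
// Illustrative shape of a service usage report sent back to the broadcaster.
// Field names are invented; the normative schema lives in the 3.0 standards.

interface ServiceUsageReport {
  deviceId: string;          // opaque, privacy-preserving identifier
  serviceId: string;         // the channel/service that was watched
  sessionStart: string;      // ISO 8601 timestamp when the viewer locked in
  durationSec: number;       // how long the viewer stayed on the service
  appEvents: string[];       // interactive features used, e.g. "ad-replaced"
}

// Reports are batched on the receiver or gateway and posted over broadband.
async function sendUsageReports(endpoint: string,
                                reports: ServiceUsageReport[]): Promise<void> {
  await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(reports),
  });
}
```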

Figure 3. Certain Broadcast Services Management Platform vertical service management features are enabled by licensing. Image Triveni

To manage the avalanche of data, Triveni developed the Broadcast Services Management Platform. The left column of Figure 3 lists the vertical service management activities; the service management functions are shown on the right. As the colored lines indicate, a great deal of communication takes place between the two columns because everything is tied together.

The Core Signaling and EPG Management vertical, for example, requires many of the management functions. They all need to be made known so they can be sent information. EPG management is tied to all of the service management functions, as are the Interactive Services vertical management activities.

Interactive Services are broadcaster applications.

Addressable Content Management is essentially ad replacement. Data Distribution Management is the mechanism to send out NRT data. Each is available by licensing. Core and metadata management functions are required for most ATSC 3.0 features. Each vertical service uses many core functions in varying degrees.

Ad Replacement

Addressable Content Management is essentially addressable content and ad replacement based on user preferences. Ad content can be delivered either over NRT in the broadcast or over broadband, as desired.

Because content is referenced by URL, it can be pulled from the broadcast stream or over-the-top. It's managed by a broadcaster-supplied interactive application that lets the broadcaster pick and choose how Addressable Content Management works. Each station must determine the business rules it will use to decide which particular ads to replace, and for which viewers.
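Because every asset is addressed by URL, the selection logic reduces to choosing a source: the local cache if the ad was pre-positioned, over-the-top if the receiver is connected, otherwise whatever is in the broadcast carousel. The sketch below shows one hypothetical way a broadcaster-supplied application might express such business rules.

```typescript
// Hypothetical source-selection logic for addressable ad replacement.
// The business rules themselves are entirely up to the broadcaster.

interface AdChoice {
  adUrl: string;               // asset addressed by URL, wherever it lives
  source: "pre-positioned" | "broadband" | "carousel";
}

interface ViewerProfile {
  segments: string[];          // audience segments chosen by the station
}

function chooseAd(profile: ViewerProfile,
                  prePositioned: Map<string, string>,  // segment -> cached URL
                  hasBroadband: boolean,
                  carouselDefaultUrl: string): AdChoice {
  // Prefer an ad already pushed over NRT and sitting in local storage.
  for (const segment of profile.segments) {
    const cached = prePositioned.get(segment);
    if (cached) return { adUrl: cached, source: "pre-positioned" };
  }
  // Otherwise go over the top if the receiver is connected.
  if (hasBroadband) {
    return {
      adUrl: "https://ads.example-broadcaster.test/targeted",
      source: "broadband",
    };
  }
  // Fall back to the default spot carried in the broadcast carousel.
  return { adUrl: carouselDefaultUrl, source: "carousel" };
}
```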




">

Figure 4. The Broadcast Studio workflow diagram shows how the Broadcast Services Management Platform works. Image Triveni



The Broadcast Studio illustrated in Figure 4 shows the workflow. The system must manage all possible cases: receivers connected via an antenna, via the internet, or via an MVPD-only STB connection over HDMI.

The Traffic system places an ad (1). The Automation system knows when it’s time to get ready (2), and it tells the Timing and Scheduling system (3) to start working on that particular ad insertion point; the rest of the system is notified. Dynamic Provisioning (4) starts allocating space to download the applications and the ad data itself. It also prepares to do the ad splice and opens a data pipe for the ads and applications to be delivered. From that point, the flow can be followed by the numbers in Figure 4.
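Reduced to code, the head end's side of that sequence might look like the hypothetical orchestration below, with each call corresponding to one of the numbered steps in Figure 4; every interface is invented for illustration.

```typescript
// Hypothetical head-end orchestration following the numbered steps in
// Figure 4. Every interface here is invented for illustration.

interface AdInsertionOrder {
  adId: string;
  airTime: Date;        // scheduled insertion point from the traffic system
  durationSec: number;
}

interface HeadEndSystems {
  automation: { prepare(order: AdInsertionOrder): void };   // (2)
  scheduler: { schedule(order: AdInsertionOrder): void };   // (3)
  provisioning: {                                           // (4)
    allocateNrtBandwidth(order: AdInsertionOrder): void;
    openDataPipe(order: AdInsertionOrder): void;
  };
}

// (1) The traffic system places the ad and kicks off everything downstream.
function placeAd(order: AdInsertionOrder, systems: HeadEndSystems): void {
  systems.automation.prepare(order);                // (2) get ready
  systems.scheduler.schedule(order);                // (3) work on this insertion point
  systems.provisioning.allocateNrtBandwidth(order); // (4) space for app + ad data
  systems.provisioning.openDataPipe(order);         //     pipe for NRT delivery
}
```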

After provisioning occurs, the NRT data delivery system pushes the content from the broadcast ad server through the data delivery system, out to the broadcast and to the pre-positioned ad in the home. Or, it could be placed in the on-line ad server at the CDN for those TVs without local storage.

At the same time, the AV encoders start pushing out their signaling to indicate it's time to insert an ad. Additional signaling is created to indicate what time the ad starts and where it is. That information is pushed through the broadcast so the receiver can find it.

Finally, the watermarks can be turned on to convey the timing and to let receivers that don't have a connection to the broadcast know that the signaling information can be acquired from the cloud. Those receivers pick up the information and pull it in from the cloud instead of the broadcast.

When the receiver application that’s been loaded determines it is time to replace an ad, it uses its personalization data to determine which ad to show and pulls it from the local pre-positioned ad server, the on-line ad server, or the carousel data carried in the broadcast. The system then reports back that the insertion occurred, with a report sent to the management system and to traffic for billing. This environment will only get more complex as time goes on.
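Closing the loop, the insertion and the report back to the management and traffic systems, might be expressed as in the sketch below; again, every name and URL is hypothetical.

```typescript
// Hypothetical receiver-side closing of the loop: perform the replacement,
// then report the insertion for billing and audience measurement.

interface InsertionResult {
  adId: string;
  serviceId: string;
  insertedAt: string;   // ISO 8601 timestamp of the splice
  source: "pre-positioned" | "broadband" | "carousel";
}

async function reportInsertion(result: InsertionResult): Promise<void> {
  // Posted to the broadcaster's management system, which forwards the
  // record to traffic for billing.
  await fetch("https://reporting.example-broadcaster.test/insertions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(result),
  });
}
```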

Part 2 of this story will report on the 16 June SMPTE and ATSC webinar perspective of ATSC 3.0 broadcasting. Stay tuned.

Editor's note: All Figure graphics courtesy Triveni Digital.
