How D2C Streamers Are Evolving The OTT Viewing Experience
What are D2C streamers planning for our future viewing experience?
Looking to the future of devices starts with resolution and moves towards immersive viewing experiences. Along this multi-year continuum sit two key questions: how to define the optimal customer experience of content discovery and consumption, and how to ensure the cost of producing and delivering the content results in a profitable business.
Resolution is the current focal point for TV manufacturers. But how much resolution can we really experience and benefit from? What can our eyeballs handle? And is the extra delivery cost worth it? These questions are central to the D2C Streamer’s plans.
4K TVs are already commonplace in the market but 4K content is not. We know broadband networks can deliver 4K streams more effectively than traditional broadcast infrastructure, but the CDNs must also handle the extra workload and the content needs to be produced in this format. 4K cameras are becoming more commonplace as well, yet the vast majority of production is still in 1080P and the upgrade to 4K is likely to take many years.
As for 8K TVs and content, this is even further down the road. Plus, in what situations will an 8K TV make sense? Viewing tests already show that, at normal living-room viewing distances, our eyes cannot resolve the extra detail 8K provides. So, what will happen with ever-increasing resolutions?
First, perhaps resolutions will not continuously increase. D2C streamers express a preference for higher frame rates and HDR over higher resolutions. A higher frame rate improves perceived picture quality at a comparable bitrate: 100 frames per second (FPS) instead of 50/60 FPS would be a big benefit. HDR gives greater contrast and richer, more accurate colours. Both ultimately increase delivery costs, but not on the scale of the 3x jump from 1080P at 12 Mbps up to 4K UHD at 36 Mbps.
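To put rough numbers on that trade-off, here is a small sketch comparing delivery bitrates. The 12 Mbps (1080P) and 36 Mbps (4K UHD) figures are the ones quoted above; the uplift factors assumed for doubling the frame rate and adding HDR are purely illustrative, not measured encoder results.

```kotlin
// Rough delivery-bitrate comparison. The 12 Mbps (1080P) and 36 Mbps (4K UHD)
// figures come from the article; the ~1.3x uplift for doubling frame rate and
// ~1.2x uplift for HDR are illustrative assumptions, not measured encoder results.
data class Profile(val name: String, val bitrateMbps: Double)

fun main() {
    val baseline = Profile("1080P @ 50 FPS SDR", 12.0)
    val profiles = listOf(
        baseline,
        Profile("1080P @ 100 FPS SDR", baseline.bitrateMbps * 1.3), // assumed HFR uplift
        Profile("1080P @ 50 FPS HDR", baseline.bitrateMbps * 1.2),  // assumed HDR uplift
        Profile("4K UHD @ 50 FPS SDR", 36.0)
    )
    profiles.forEach { p ->
        val multiple = p.bitrateMbps / baseline.bitrateMbps
        println("%-22s %5.1f Mbps (%.1fx baseline)".format(p.name, p.bitrateMbps, multiple))
    }
}
```

Even with generous assumptions, the frame rate and HDR uplifts stay well below the jump that the move to 4K UHD imposes on the delivery bill.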
At the same time, enhanced resolutions are a revenue opportunity, because people will pay for the ultimate experience on their 4K TV, and probably on their 8K TV as well. The price for the service, whether continuous or event-based, just needs to be acceptable to the consumer and cover the D2C streamer's extra cost of delivery.
Where advanced resolutions could bring big value is in a multi-viewer experience. Imagine an 8K-ready TV with three tiles on it, one showing 4K and two showing 2K. Some D2C streamers already support a multi-viewer experience on devices that support it, like Apple TV. Many Samsung TVs support multiple inputs on the same device, which is clunky but still doable.
The main issue for multi-viewer experiences is that the platform must support them by offering multiple video player areas on the device's screen. Traditionally, TVs have had just one player area. Android TV offers more flexibility, but the device must then decode multiple streams simultaneously. Ironically, a smartphone is about 10x more powerful than a TV, and yet it is the TV that needs this multi-viewer ability.
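As a minimal sketch of what that platform support involves, the snippet below creates two independent players and binds them to two on-screen tiles, assuming the ExoPlayer library on Android TV. The layout IDs and stream URLs are placeholders, and a real implementation would also need to manage decoder limits, audio focus and adaptive bitrate across tiles.

```kotlin
// Minimal multi-viewer sketch: two independent players, two on-screen tiles.
// Assumes the ExoPlayer 2 library on Android TV; layout IDs and URLs are placeholders.
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity
import com.google.android.exoplayer2.ExoPlayer
import com.google.android.exoplayer2.MediaItem
import com.google.android.exoplayer2.ui.PlayerView

class MultiViewerActivity : AppCompatActivity() {

    private val players = mutableListOf<ExoPlayer>()

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_multi_viewer)  // layout containing two PlayerViews

        // Each tile needs its own player, and therefore its own decode session.
        startTile(findViewById(R.id.main_tile), "https://example.com/main.m3u8")
        startTile(findViewById(R.id.side_tile), "https://example.com/side.m3u8")
    }

    private fun startTile(view: PlayerView, url: String) {
        val player = ExoPlayer.Builder(this).build()
        view.player = player
        player.setMediaItem(MediaItem.fromUri(url))
        player.prepare()
        player.playWhenReady = true
        players.add(player)
    }

    override fun onDestroy() {
        players.forEach { it.release() }  // free the decoders when the screen closes
        super.onDestroy()
    }
}
```

The practical constraint is hardware decode: each tile typically occupies its own decoder session, and a TV chipset generally offers fewer simultaneous sessions than a modern smartphone.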
But a simple multi-viewer experience is just the first step. What about giving customers the choice to configure their own display? That could include multiple tiles for video, gaming, social, statistics and advertising.
A common conversation for years has been about the multi-screen experience (rather than the multi-viewer experience). Can we link the experience between the mobile device and the big-screen TV in some way? D2C streamers know that the mobile form factor is used predominantly for content snacking. It is not good enough for the full-on live video viewing experience (especially sports), which is often social and at the very least needs to be a sit-back, immersive experience. Foldable devices could create an opportunity for a 6-inch screen to become a 20-inch screen, but that is not the same as the 80-inch screen experience. And will it really work as well as a fixed big screen for the majority of viewing scenarios? The answer is probably not. So, interest in the multi-screen experience is moving towards a joined-up mobile and living room experience, using the mobile device to search and select content that can then be moved onto the big screen. This could conceivably extend to 2D/3D augmented reality projections, including the use of headsets, but service development is in the early stages.
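One way to picture that joined-up experience is a simple handoff message from the phone app to the TV app: the viewer browses and selects on the mobile device, and the selection plus the current playback position is pushed to the big screen. The payload and the sendToTv() transport below are entirely hypothetical; a real service would use its own casting or companion-device protocol.

```kotlin
// Hypothetical handoff payload for a joined-up mobile / living-room experience.
// The field names and the sendToTv() transport are illustrative only.
data class HandoffRequest(
    val contentId: String,        // the asset chosen on the mobile device
    val resumePositionMs: Long,   // carry the playback position across screens
    val preferredProfile: String  // e.g. "4K-HDR" if the TV and plan allow it
)

// Placeholder transport: in practice this would be a local-network or cloud message.
fun sendToTv(request: HandoffRequest) {
    println("Handing off ${request.contentId} at ${request.resumePositionMs} ms " +
            "as ${request.preferredProfile}")
}

fun main() {
    // Viewer finishes browsing on the phone and taps "Watch on TV".
    sendToTv(HandoffRequest("match-20481", resumePositionMs = 312_000, preferredProfile = "4K-HDR"))
}
```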
Devices focus our minds on the subject of content delivery, but there is a production choice in the first place that ties in with the device world. Most leading broadcasters use 1080P as the highest resolution for standard production (e.g., live sporting events). Many broadcasters use 1080i and up-convert to 1080P or 4K. Standardising on native production in 4K, and even more futuristically in 8K, is many years away. For now, it is realistic to say that when content is all produced in 4K and contribution networks are standardised for 4K, then it may be time to truly leverage 4K for onward distribution. As is normally the case in the media industry, the two sets of glass (the camera lens and the display) can deliver something that the interconnecting infrastructure cannot, at least not at scale or cost-effectively. But as always there are pioneers and early adopters who blaze the trail.
The production choice is not just shaped by the infrastructure's capability and cost, but also by whether the result can be consumed by the customer. Virtual reality (VR) is considered a good use case for 8K: two 8K streams are needed to supply the feeds to the two eyes in a VR headset, and this level of quality is required to avoid an uncomfortable sensation for the user. But this needs a solution that operates in the Gbps range, so it cannot reach the mass market until the bandwidth is large enough and the content processing is fast enough. Again, this could take years to reach a cost-effective, mass-market solution. But it could be the point at which the immersive viewing experience starts to replace the flat-screen experience for many people.
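A quick back-of-envelope calculation shows why this lands near the Gbps range. Only the two-streams-of-8K premise comes from the scenario above; the refresh rate, sampling format and compression ratio are illustrative assumptions.

```kotlin
// Back-of-envelope bandwidth estimate for the two-8K-streams VR case.
// Refresh rate, bit depth/sampling and compression ratio are assumptions.
fun main() {
    val pixelsPerFrame = 7_680L * 4_320L       // one 8K frame
    val fps = 90                                // assumed VR refresh rate
    val bitsPerPixel = 15.0                     // assumed 10-bit 4:2:0 sampling
    val compressionRatio = 100.0                // assumed codec efficiency

    val rawPerEyeGbps = pixelsPerFrame * fps * bitsPerPixel / 1e9
    val compressedPerEyeMbps = rawPerEyeGbps * 1_000 / compressionRatio
    val bothEyesMbps = compressedPerEyeMbps * 2 // one stream per eye

    println("Uncompressed per eye: %.1f Gbps".format(rawPerEyeGbps))
    println("Compressed per eye:   %.0f Mbps".format(compressedPerEyeMbps))
    println("Both eyes:            %.0f Mbps, approaching the Gbps range".format(bothEyesMbps))
}
```

Even with an optimistic compression ratio, serving both eyes pushes towards a gigabit per second per viewer, which is why mass-market delivery of this experience is still some years away.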
This brings us to the expected future of devices and the video streaming experience. Today, the technology keeps us focused on the flat-screen experience, where the use of resolution and colour, and the move to a multi-viewer format, will give us a different and enhanced viewing experience.
The natural next step, already underway with leading-edge technology, is to create more immersive viewing experiences. Sports and other live interactive events are a perfect use case for this. In many ways, the real-world stadium or theatre experience is too fixed in what it offers. There is no doubt that the human interaction, the in-person atmosphere and the multi-sensory experience of being there in person cannot be matched. But the actual ability to watch the sport or the event is severely limited by the position of our seat and the position of the activity at the venue. VR is how we can improve on this to create immersive experiences. Real-time motion capture is leading the way, already being used in trials to record the movement of athletes, from which video can be rendered that places the viewer inside the game, match, fight or race from any number of angles and perspectives.
Devices are more central to our viewing experiences than ever before, and will be even more so in the future. From the traditional big-screen TV in new formats, to our mobile devices for convenience, and now to a brave new world of immersive virtual reality experiences, the future of D2C streaming is an exciting place. And it is our OTT mode of delivery, tying in with internet-based norms and connecting with the world of gaming, that is pushing new boundaries. Devices dictate how we view content. But they also open up new levels of creativity for content production that will continue to drive the world of OTT forwards.