Monitoring & Compliance In Broadcast: Monitoring QoS & QoE To Power Monetization

Measuring Quality of Experience (QoE) as perceived by viewers has become critical to monetization, both through targeted advertising and through direct content consumption.

Measuring Quality of Experience (QoE) has become increasingly critical for revenue generation as broadcast and video services fragment across multiple platforms, devices and delivery infrastructures. At the same time, such measurement has remained elusive, despite efforts to establish clear surrogates of end-user perception that can be monitored and analyzed cost-effectively, as well as some actual consumer feedback gathered in near real time via data return paths.

The situation has been compounded both by the increasing complexity of regulatory compliance over privacy associated with ad or content targeting, and by the growing immersivity of the video itself. All of these must be addressed within the delivery infrastructure, which for international services has to take account of differences both in user responses and compliance regulations between countries.

Even catering for the evolution of video quality is challenging, and this is fundamental to gauging the user experience. First is the 2D picture itself. On one hand there has been the progression from HD through 2K, 4K and 8K resolutions at the full image level, making the picture sharper. Then there is the advance in pixel quality with High Dynamic Range (HDR) and Wide Color Gamut (WCG), boosting contrast and color depth. Finally, on the 2D front there is High Frame Rate (HFR), making fast-moving action smoother and more pleasing to view.

While these facets can be quite readily accommodated, Extended Reality (XR), introducing a 3D aspect on top of high-resolution audio, and potentially holographic projection as well as haptic feedback, introduces complexities that defy traditional manual methods of QoE assessment, such as A/B testing. They are effectively beyond the reach of online surrogate measures of QoE as well.

Some service providers have been experimenting with instant feedback from users, in some cases offering incentives. This has the merit of reintroducing direct human response, with the potential to adapt aspects of QoS that might correlate with QoE in near real time. A number of projects exploit this potential, at least for internet-connected streaming services.

But the downside is that the feedback may not be representative of overall user experience, being subject to sample bias. As a result, the foundations of perceptual Video Quality Assessment (VQA) remain much as they have been for two decades or more, even as VQA models and algorithms have diverged into versions optimized for VoD streaming, User Generated Content (UGC) as perfected by the likes of YouTube, XR including virtual and augmented reality (VR and AR), and cloud gaming, as well as for 4K, HDR and HFR.

XR and cloud gaming introduce additional variables which can be subjective, especially those associated with the wearing of headsets, which can induce nausea and headaches, particularly with prolonged use, even as they enable the experience.

A major challenge still lies in mapping perceptual VQA onto suitable surrogates, which have expanded beyond the basic KPIs (Key Performance Indicators) such as startup time, rebuffering, latency and average bit rate that used to dominate discussions of QoE measurement. Taken at face value these correspond poorly with real user QoE, and so they have been combined with other variables, including the form factor of the user device when available, viewing environment, content type, and any other information that can be gleaned, such as the profile of the consumer.
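As a hedged illustration of this kind of combination, the Python sketch below folds a handful of those basic KPIs into a single surrogate score, with a crude adjustment for device form factor. The field names, weights and thresholds are illustrative assumptions, not any standard model.

```python
# Minimal sketch of a composite QoE surrogate built from delivery KPIs.
# All field names, weights and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class SessionKPIs:
    startup_time_s: float      # time to first frame
    rebuffer_ratio: float      # stalled time / watch time, 0..1
    avg_bitrate_kbps: float
    latency_s: float           # glass-to-glass delay

def surrogate_qoe(kpis: SessionKPIs, device: str = "tv") -> float:
    """Map raw KPIs to a 0..100 score, adjusted for device form factor."""
    score = 100.0
    score -= min(kpis.startup_time_s, 10.0) * 3.0            # slow starts hurt
    score -= kpis.rebuffer_ratio * 60.0                      # stalls hurt most
    score -= max(0.0, 8000 - kpis.avg_bitrate_kbps) / 400.0  # starved bitrate
    score -= min(kpis.latency_s, 30.0) * 0.5
    # A phone screen masks a low bitrate; a large TV does not.
    if device == "phone":
        score += max(0.0, 8000 - kpis.avg_bitrate_kbps) / 800.0
    return max(0.0, min(100.0, score))

print(surrogate_qoe(SessionKPIs(2.1, 0.01, 4500, 6.0), device="tv"))
```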

IF & ML

Naturally, machine learning (ML) has been applied to help sift these correlations and enable more consistent QoE measures as a basis for intervention, both in real time and on an ongoing basis to improve the service. But even AI/ML requires assistance with data preparation and segmentation to work effectively.

ML can still operate in unsupervised mode to avoid preconceptions, but QoS data needs to be segmented in the way most conducive to the analysis. This has given rise to the segmentation of QoS measurements into distinct so-called Influencing Factors (IFs).

The starting point is the ITU (International Telecommunication Union) broad-brush definition of QoE as “the degree of delight or annoyance experienced by the user of an application or service”. To conform with this definition, various QoS and other relevant metrics are segmented by category prior to correlative analysis under ML.

Traditional QoS metrics or KPIs comprise just one IF category, with others including direct user feedback where available, measures of immersion, interaction and interconnection, as well as viewing environment and aspects of the content handling process. The last of these covers video capture, delivery medium, rendering, and final display.
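As a minimal illustration of what such segmentation might look like in practice, the following sketch groups per-session measurements by IF category and then flattens them into traceable features for analysis. The category names follow the article; the individual metric keys are assumptions.

```python
# Illustrative grouping of raw measurements into Influencing Factor (IF)
# categories ahead of correlation analysis. Metric keys are assumptions.
session_ifs = {
    "qos_kpis":      {"startup_time_s": 2.1, "rebuffer_ratio": 0.01,
                      "avg_bitrate_kbps": 4500, "latency_s": 6.0},
    "user_feedback": {"thumbs": 1, "survey_score": 4},   # where available
    "immersion":     {"headset": False, "audio_channels": 2},
    "environment":   {"device": "tv", "screen_inches": 55, "ambient": "dim"},
    "content_chain": {"capture": "hdr", "medium": "cdn",
                      "render": "av1", "display": "oled"},
}

def flatten_ifs(ifs: dict) -> dict:
    """Prefix each metric with its IF category so ML features stay traceable."""
    return {f"{cat}.{key}": value
            for cat, metrics in ifs.items()
            for key, value in metrics.items()}

print(flatten_ifs(session_ifs))
```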

While this yields coherent subsets of data relevant to QoE measurement, it also increases complexity. Correlations among these IF categories cannot be calculated by simple aggregation of the data. They require the help of subjective tests, after which ML algorithms can come into play by identifying patterns across the IF data sets that match levels of QoE perceived by users.
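The supervised step could then resemble the sketch below, which fits a model mapping flattened IF features to Mean Opinion Scores (MOS) collected in subjective tests. The data here is synthetic, and the choice of a random forest is only one plausible option.

```python
# Sketch of the supervised step: fit a model mapping IF feature vectors
# to Mean Opinion Scores (MOS) from subjective tests. Data is synthetic;
# shapes and model choice are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.random((500, 12))        # 500 sessions x 12 flattened IF features
mos = 1 + 4 * rng.random(500)    # panel scores on the usual 1..5 scale

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X, mos)

# Feature importances hint at which IF categories drive perceived QoE.
print(model.feature_importances_.round(3))
print(model.predict(X[:3]))      # estimated MOS for unseen sessions
```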

Monetization

There is at least growing evidence that this works, while it has long been clear that QoE is in turn a predictor of positive user responses, such as engagement with targeted ads and satisfaction with content quality. Ad engagement can range from merely watching some of an ad, through calls to action, to the best case of a click to purchase. These can be measured and acted upon to refine both targeting of ads to the right people for maximum reach, and personalization to increase the chance of positive engagement.
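By way of illustration, measuring that range of engagement amounts to a simple funnel calculation of the kind sketched here; the event names and stages are assumptions for the example.

```python
# Toy funnel metrics for ad engagement, from partial view through click
# to purchase. Event names and stages are illustrative assumptions.
from collections import Counter

events = ["impression", "view_50pct", "impression", "click",
          "impression", "view_50pct", "purchase"]
counts = Counter(events)

def rate(stage: str, base: str = "impression") -> float:
    """Share of base events that reached the given funnel stage."""
    return counts[stage] / counts[base] if counts[base] else 0.0

for stage in ("view_50pct", "click", "purchase"):
    print(f"{stage}: {rate(stage):.0%}")
```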

Both targeting and personalization can also improve customer QoE, if done effectively and sensitively, by restricting ads to those a viewer finds relevant and interesting. The realm of QoE management should extend to restricting the targeting and presentation of ads, since part of the objective is to optimize engagement by keeping users happy with their overall experience of a given video service.

Compliance

Effective ad targeting and personalization relies on user data in some form, and inevitably that has become governed by increasingly complex and diverse regulations, administered by various bodies that service providers have to be more aware of than in the past. In the UK, for example, we can identify at least four agencies actively monitoring ads for compliance: the Advertising Standards Authority (ASA), the Financial Conduct Authority (FCA), communications regulator Ofcom, and the Charity Commission.

It used to be just the ASA, which could do no more than administer a slap on the wrist causing some reputational damage, but now the FCA in particular bears real teeth and is gaining even greater powers through the UK’s Digital Markets, Competition and Consumers Bill. The FCA has already cracked down quite hard on the current scourge of greenwashing ads, as have regulators in the EU.

Admittedly we are talking here about ad content, but that is increasingly hard to separate from aspects of QoE monitoring, given the dependence on user data obtained from first-party cookies and other sources governed by compliance regulations. There are numerous differences in regulation between countries, and in the USA even among states. There are also one or two outliers among EU member states. A particular jurisdiction might seem laxer in one respect but stricter in another.

Europe, with GDPR, is considered strict, with its blanket requirement to seek consent from users for cookies via banners offering “accept all” and “reject all but necessary” choices. Germany adds a “double opt-in” consent confirmation requirement.

Yet in California even such a request for consent would violate the requirement of waiting at least 12 months following an opt-out before seeking authorization for selling or sharing personal information for cross-context behavioral advertising. There are various other anomalies, making compliance highly challenging even for larger service providers and practically impossible for smaller ones.
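A simplified sketch of how such jurisdiction-specific rules might be encoded as gating checks follows. The rules are paraphrases of the examples above, not legal advice, and the field names are assumptions.

```python
# Hedged sketch of per-jurisdiction gating checks for targeted ads.
# Rules are simplified paraphrases of the examples in the text.
from datetime import datetime, timedelta
from typing import Optional

def may_request_consent(jurisdiction: str,
                        opted_out_at: Optional[datetime],
                        now: datetime) -> bool:
    """California-style rule: wait 12 months after an opt-out before
    asking again to sell/share data for cross-context behavioral ads."""
    if jurisdiction == "california" and opted_out_at is not None:
        return now - opted_out_at >= timedelta(days=365)
    return True

def may_target(jurisdiction: str, consent: bool, confirmed: bool) -> bool:
    """GDPR-style banner consent, plus German double opt-in confirmation."""
    if jurisdiction == "germany":
        return consent and confirmed
    return consent

now = datetime(2024, 6, 1)
print(may_request_consent("california", datetime(2023, 9, 1), now))  # False
print(may_target("germany", consent=True, confirmed=False))          # False
```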

The latter therefore tend either to confine their advertising to larger platforms with their own compliance engines, or they fall back on traditional contextual advertising. Alternatively, they may resort to a lowest common denominator approach where they assume all countries in which they operate apply the most draconian data privacy regimes. This inevitably stymies their competitiveness, since they cannot exploit the latest ad targeting and personalization technologies effectively, or at all.

Geo-blocking can be applied to confine ad targeting to the specific countries a service provider has been able to cater for, without falling foul of rules in other markets where it operates. A service provider can then at least exploit ad targeting in key markets.
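In its simplest form, such geo-gating amounts to an allowlist check, as in the sketch below; the country codes and the contextual fallback are assumptions.

```python
# Minimal geo-gating sketch: targeted ads run only in markets whose
# compliance the provider has catered for; elsewhere fall back to
# contextual advertising. Country codes are illustrative.
TARGETING_ALLOWED = {"GB", "DE", "FR"}   # markets with vetted compliance

def choose_ad_strategy(country_code: str) -> str:
    return "targeted" if country_code in TARGETING_ALLOWED else "contextual"

for cc in ("GB", "US", "ES"):
    print(cc, "->", choose_ad_strategy(cc))
```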

AI Personalization

There is now a trend in the field towards incorporating AI across the whole content chain, from creation to consumption, and especially in advertising. Generative AI is being deployed on the ad creation front in response to feedback from previous campaigns, or from ongoing ones, to make edits for specific target viewers.

There are now suppliers of AI packages or services that match generated content to known attributes of previously determined demographic, psychographic or geographic segments. This allows the same body content to be tweaked with different captions, headings, graphics or soundtracks matched to particular target segments.

Psychographic segmentation distinguishes people by lifestyle, values, preferences, interests and attitudes, and is distinct from behavioral segmentation, which is based on actions taken more at face value. The latter can be applied successfully without using cookies or requiring customer consent at all, just by analyzing data and applying profiles on the basis of transaction activities. With the help of machine learning to correlate this with other contextual information, including the nature of the content and time of day, targeting and personalization can be executed quite effectively, provided they are done judiciously.
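A toy sketch of such cookie-less, behavior-plus-context segment assignment might look as follows; the segments, rules and thresholds are purely illustrative assumptions.

```python
# Sketch of cookie-less behavioral targeting: assign a segment from
# transaction activity plus context (content genre, time of day).
# Segments, rules and thresholds are purely illustrative.
def assign_segment(purchases_30d: int, genre: str, hour: int) -> str:
    if purchases_30d >= 3 and genre == "sport":
        return "active_sports_buyer"
    if 18 <= hour <= 23 and genre == "drama":
        return "evening_drama_viewer"
    return "general"

print(assign_segment(purchases_30d=4, genre="sport", hour=20))
print(assign_segment(purchases_30d=0, genre="drama", hour=21))
```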

All this does require interactive feedback and QoE monitoring, as well as compliance with regulations covering the ads themselves, as opposed to the data that enabled their targeting. AI will increasingly integrate these activities, while also governing itself and addressing some of the ethical concerns over bias that have surfaced in the context of ad targeting in particular. But that point has not yet been reached.
