Standards Don’t Answer All The T&M Questions

Without standards, the world would be a very difficult place to live in. There are many kinds of standards that affect almost every aspect of life – technology is just one of those areas. We can consider language as a kind of standard that allows people in one part of the world to communicate with those in another. International finance uses standardised methods of accounting to provide a consistent framework for doing business. Currency itself is a symbolic representation of value that we use as a standard for the exchange of goods and services.

So we need standards to get on with our daily lives. In our industry and many others, technological development is not regulated or centrally organised; it takes place in a free-for-all where commercial realities hold sway. But in order to build workable infrastructure for a national or international cellular phone system or a broadcasting network, these commercial interests have to be tempered by some kind of framework that allows competing energies to be channeled in roughly the same direction.

It’s here that the tension between competition and regulation creates compromises. A standards body usually includes among its members representation for the main competing entities in any field of development – the commercial organisations fighting it out to establish their way of doing things as the dominant way. The aim of the standards body is to establish some framework in which the customer for the technology can enjoy the benefits of an open market (the ability to buy from competing manufacturers) while also benefiting from enough market regulation to ensure that the manufacturers are all playing roughly in tune.

It’s an inherently conflicted endeavour, and the output from most standards committees reflects that conflict. The standard will provide just enough common ground to allow an industry sector to resist splintering and move forward in an agreed direction, but the compromises necessary for reaching an agreement tend to push the standard towards the lowest common denominator, while leaving wriggle room for the more powerful commercial entities to advance their proprietary technologies within the framework. Standards therefore usually represent not the best way to tackle a problem, but the best way that can be agreed on at this point, given the competing interests.

But standards do not provide a tool for dealing with every contingency in the real world. Language, for example, is a fluid, constantly evolving thing; and try as they might to provide a standard of ‘correct’ usage, the language academies in some European countries can only fight a losing battle against neologism, foreign influences, and regional variations. Currency unions such as the eurozone are a good idea in some ways, but the standard that they try to establish places strain on some of the economies adhering to it. An architect may produce construction drawings for every last detail of a project, but in the real world it’s the builders who create the edifice, and the real world is full of unforeseeable mishaps, nuances, improvisations and infinite shades of grey cement.

So while a standard is conceived in compromise and at some abstraction from the messiness of the real world, it’s also a snapshot of a moment in time. Given the average gestation period required to produce an industry standard, there’s every chance that by the time it emerges, it will already have been partly overtaken by events – by the relentless onward march of technological development.

To return to the language analogy, if your academy-sanctioned standard language does not contain words for new concepts and technologies invented elsewhere in the world, the normal solution is to borrow the foreign word for the invention, making it part of the language even if the standard prescribes that foreign words should be avoided. In effect, some divergence from the standard is necessary to maintain the usefulness of the language as a tool for describing the world.

So it’s important in any industry to recognise standards for what they are: a way of dealing with part – but only part – of the messy real world. They provide a baseline, but not a safety net, and in practice there will always be important elements of day-to-day operations that fall outside the boundaries of the standard. A complacent, box-ticking mentality about standards is a trap to avoid, and the idea that a standards-compliant product is ipso facto a perfect tool for the job is a dangerous one.

Simen Frostad is chairman of Bridge Technologies

In the digital media industry, ETR290 (ETSI TR 101 290) is the key standard for evaluating the quality of digital streams, underpinning the common approach to testing and monitoring. But like any standard, it has its limitations. There are grey areas, and a degree of complexity that makes the standard difficult for operational staff to grasp fully unless they have rare expertise. Given the strain on staff resources and time in the real world of digital media services, monitoring based on ETR290 is therefore often poorly calibrated, and as a result it delivers less accurate data.
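To give a flavour of what an ETR290-style measurement involves, here is a minimal, illustrative sketch of one of the standard’s Priority 1 checks – the continuity_count_error on MPEG-2 transport stream packets. This is not code from any real monitoring product; it is a simplified assumption-laden example (it ignores the adaptation-field discontinuity indicator, the duplicate-packet allowance, and the null PID, all of which a compliant implementation must handle).

```python
# Sketch of an ETR290 (ETSI TR 101 290) Priority 1 check:
# continuity_count_error. Simplified for illustration only.

TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

def check_continuity(packets):
    """Scan 188-byte TS packets and report error events as
    (packet_index, pid, error_name) tuples.

    The 4-bit continuity counter (low nibble of byte 3) must
    increment by one, modulo 16, for each payload-carrying
    packet of a given PID.
    """
    errors = []
    last_cc = {}  # PID -> last continuity counter seen
    for i, pkt in enumerate(packets):
        if len(pkt) != TS_PACKET_SIZE or pkt[0] != SYNC_BYTE:
            errors.append((i, None, "sync_byte_error"))
            continue
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]          # 13-bit PID
        has_payload = bool(pkt[3] & 0x10)              # payload flag
        cc = pkt[3] & 0x0F                             # continuity counter
        if pid in last_cc and has_payload:
            if cc != (last_cc[pid] + 1) % 16:
                errors.append((i, pid, "continuity_count_error"))
        if has_payload:
            last_cc[pid] = cc
    return errors
```

Even this toy version hints at why well-calibrated ETR290 monitoring takes expertise: every check has exceptions and thresholds that must be configured correctly before the reported errors mean anything.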

Add to this the fact that many of the component parts of a digital media service are outside the scope of ETR290: if there’s a fault in the conditional access system, a malfunction in the programme guide, or the wrong language is presented, the effect on service quality can be serious. Yet none of these errors would be picked up by testing based on ETR290. So something in addition to the prevailing standard is needed.
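The kind of supplementary, service-level testing described above can be pictured as a layer of checks sitting alongside the ETR290 results. The sketch below is purely hypothetical: the probe fields (`ecm_interval_ms`, `eit_age_s`, `audio_langs`) and the thresholds are invented for illustration and do not come from any standard or product.

```python
# Hypothetical service-level checks of the kind that fall outside
# ETR290's scope. Field names and thresholds are illustrative
# assumptions, not taken from any standard.

def service_level_checks(probe):
    """Return a list of fault descriptions that an ETR290-only
    monitor would miss, given a dict of probe measurements."""
    faults = []
    # Conditional access: ECMs must keep arriving, or receivers
    # cannot refresh their decryption keys.
    if probe.get("ecm_interval_ms", 0) > 1000:
        faults.append("CAS: ECM repetition interval too long")
    # Programme guide: EIT data must be present and reasonably fresh.
    if probe.get("eit_age_s", 0) > 120:
        faults.append("EPG: present/following information stale")
    # Language: advertised audio tracks must match the service plan.
    missing = set(probe.get("expected_langs", [])) - set(probe.get("audio_langs", []))
    if missing:
        faults.append(f"Audio languages missing: {sorted(missing)}")
    return faults
```

The point is not the specific checks but the shape of the solution: stream-level standards compliance and service-level health need to be monitored together before a quality picture is complete.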

Our response to this is Gold TS Protection – a basis for monitoring and analysis of digital media services which includes high-quality ETR290 testing and extensions to it, but which also tests for other vital elements in a service, such as correct functioning of the CAS and EPG. It provides the safety net that is missing in T&M systems based solely on ETR290, and makes it very much simpler to create an accurate and useful calibration of the parameters for ensuring higher service quality. It’s a practical response to the recognition that digital media monitoring needs more than just the standard if it’s to be truly effective.
