Checking the quality of digital media streams
Without standards, the world would be a very difficult place to live in. There are many kinds of standards that affect almost every aspect of life – technology is just one of those areas. We can consider language as a kind of standard that allows people in one part of the world to communicate with each other. International finance uses standardised methods of accounting to try to provide a consistent framework for doing business. Currency itself is a symbolic representation of value that we use as a standard for exchange of goods and services.
So we need standards to get on with our daily lives. In our industry and many others, technological development is not regulated or centrally organised; it takes place in a free-for-all where commercial realities hold sway. But in order to build workable infrastructure for a national or international cellular phone system or a broadcasting network, these commercial interests have to be tempered by some kind of framework that allows competing energies to be channelled in roughly the same direction.
It’s here that the tension between competition and regulation creates compromises. A standards body usually includes among its members representation for the main competing entities in any field of development – the commercial organisations fighting it out to establish their way of doing things as the dominant way. The aim of the standards body is to establish some framework in which the customer for the technology can enjoy the benefits of an open market (the ability to buy from competing manufacturers) while also benefiting from enough market regulation to ensure that the manufacturers are all playing roughly in tune.
It’s an inherently conflicted endeavour, and the output from most standards committees reflects that conflict. The standard will provide just enough common ground to allow an industry sector to resist splintering and move forward in an agreed direction, but the compromises necessary for reaching an agreement tend to push the standard towards the lowest common denominator, while leaving wriggle room for the more powerful commercial entities to advance their proprietary technologies within the framework. Standards therefore usually represent not the best way to tackle a problem, but the best way that can be agreed on at this point, given the competing interests.
But standards do not provide a tool for dealing with every contingency in the real world. Language, for example, is a fluid, constantly evolving thing; and try as they might to provide a standard of ‘correct’ usage, the language academies in some European countries can only fight a losing battle against neologism, foreign influences, and regional variations. Currency unions such as the eurozone are a good idea in some ways, but the standard that they try to establish places strain on some of the economies adhering to the standard. An architect may produce construction drawings for every last detail of a project, but in the real world it’s the builders who create the edifice, and the real world is full of unforeseeable mishaps, nuances, improvisations and infinite shades of grey cement.
So while a standard is conceived in compromise and in some abstraction from the messiness of the real world, it’s also a snapshot of a moment in time. Given the average gestation period required to produce an industry standard, there’s more than a chance that by the time it emerges, it will already have been partly overtaken by events – by the relentless onward march of technological development.
To return to the language analogy, if your academy-sanctioned standard language does not contain words for new concepts and technologies invented elsewhere in the world, the normal solution is to borrow the foreign word for the invention, making it part of the language even if the standard prescribes that foreign words should be avoided. In effect, some divergence from the standard is necessary to maintain the usefulness of the language as a tool for describing the world.
So it’s important in any industry to recognise standards for what they are: a way of dealing with part – but only part – of the messy real world. They provide a baseline – but not a safety net – and in practice there will always be important elements of day-to-day operations that are outside the boundaries of the standard. A complacent, box-ticking mentality about standards is a trap to avoid: the idea that a standards-compliant product is ipso facto a perfect tool for the job is a dangerous one.
Simen Frostad is chairman of Bridge Technologies
In the digital media industry, ETR290 (ETSI TR 101 290) is the key standard for evaluating the quality of digital streams, underpinning the common approach to testing and monitoring. But like any standard, it has its limitations. There are grey areas, and a great deal of complexity that makes it difficult for operational staff to grasp fully unless they have a rare degree of expertise. So given the strain on staff resources and time in the real world of digital media services, monitoring based on ETR290 is often poorly calibrated, and as a result it delivers less accurate data.
Add to this the fact that many of the component parts of a digital media service are outside the scope of ETR290: if there’s a fault in the conditional access system, a malfunction in the programme guide, or the wrong language is presented, the effect on service quality can be serious. Yet none of these errors would be picked up by testing based on ETR290. So something in addition to the prevailing standard is needed.
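To make concrete what ETR290-based testing does cover, the sketch below implements two of its first-priority indicators (TS_sync_loss and Continuity_count_error) on raw 188-byte MPEG transport-stream packets. This is a minimal illustration, not part of any real probe: among other simplifications, it ignores the allowance for a single duplicate packet and the discontinuity indicator in the adaptation field.

```python
# Minimal sketch of two ETR290 first-priority checks on an MPEG
# transport stream: sync-byte verification (TS_sync_loss) and
# per-PID continuity-counter tracking (Continuity_count_error).

SYNC_BYTE = 0x47
PACKET_SIZE = 188
NULL_PID = 0x1FFF  # null packets are exempt from continuity checking

def check_packets(data: bytes):
    """Yield (packet_index, error_name) for each detected error."""
    last_cc = {}  # PID -> last continuity_counter seen
    for i in range(0, len(data) - PACKET_SIZE + 1, PACKET_SIZE):
        pkt = data[i:i + PACKET_SIZE]
        if pkt[0] != SYNC_BYTE:
            yield i // PACKET_SIZE, "TS_sync_loss"
            continue
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]
        if pid == NULL_PID:
            continue
        cc = pkt[3] & 0x0F            # continuity_counter: low 4 bits
        has_payload = bool(pkt[3] & 0x10)
        if has_payload:
            if pid in last_cc and cc != (last_cc[pid] + 1) & 0x0F:
                yield i // PACKET_SIZE, "Continuity_count_error"
            last_cc[pid] = cc
```

Even this toy version hints at the calibration problem described above: deciding which PIDs to track, how to treat gaps, and what counts as an alarm all require judgement beyond what the standard itself specifies.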
Our response to this is Gold TS Protection – a basis for monitoring and analysis of digital media services which includes high-quality ETR290 testing and extensions to it, but which also tests for other vital elements in a service, such as correct functioning of the CAS and EPG. It provides the safety net that is missing in T&M systems based solely on ETR290, and makes it very much simpler to create an accurate and useful calibration of the parameters for ensuring higher service quality. It’s a practical response to the recognition that digital media monitoring needs more than just the standard if it’s to be truly effective.