SeaChange Urges Operators to Raise Game Over Content Management

Operators must adopt a robust and comprehensive CMS (Content Management System) if they are to compete against new and existing rivals in the multiscreen era of diverse target platforms, video sources and delivery models. This argument is set out by video platform and software company SeaChange in a white paper detailing how the latest CMSs dovetail with emerging trends in pay TV services, including agile development methods, cloud delivery and network virtualization.

Operators first have to play catch-up by reducing time to market to days, rather than the months or even years typical of legacy broadcast environments. They then need to go further, seizing the initiative to become leaders rather than followers in pay TV innovation. According to the SeaChange paper, that can only be done by moving to a software- and cloud-based virtualized infrastructure, where DevOps methods align software development much more closely with operational performance.

DevOps has become a mantra for pay TV software development and a popular term in marketing collateral as well as white papers. But it has already proven that it can significantly cut development time and allow relevant features to be added far more promptly in response to demand from operational, customer support or even marketing departments, closer to what already happens for online and mobile services.

DevOps is a contraction of development and operations, and means bringing these two functions much closer together for software development, which reduces the time lag between identification of requirements and deployment of new features, as well as speeding up the testing cycle. SeaChange argues that changes in UIs, service lineups and features, such as recommendations or the introduction of enhanced information and viewing options with sporting events, can then be implemented almost instantaneously. Furthermore, functions such as pause and rewind, place-shifting from one device to another, or cloud-based DVR, can readily be implemented as software enhancements on underlying low-cost COTS (Commercial Off The Shelf) processing and storage infrastructure.

The paper positions the CMS at the center of the emerging virtualized cloud and argues it must be chosen such that it allows operators to combine their legacy systems as effectively as possible with the new infrastructure. At least here operators can exploit work already done in the enterprise IT world, where critical legacy functions have already been ported as software modules to commodity hardware.

The paper does cite one key development specific to video: the SMPTE (Society of Motion Picture and Television Engineers) 2022-6 standard, which enables integration between the IP and SDI (Serial Digital Interface) domains. The standard is now being implemented in a new generation of video controllers designed to run on commodity hardware, enabling broadcasters to package and send SDI signals over IP networks. Legacy SDI-based workflows can then be migrated to IP.
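The essence of SMPTE 2022-6 is carrying uncompressed SDI as a stream of fixed-size RTP datagrams. The sketch below illustrates the idea in Python, splitting one SDI frame into 1,376-byte media payloads behind a standard RTP header; it is a simplification, not a conformant implementation, since it omits the standard's 8-byte HBRMT payload header, and the payload type (98) is an assumed dynamic value.

```python
import struct

MEDIA_PAYLOAD = 1376  # media octets per datagram in SMPTE 2022-6


def packetize_frame(sdi_frame: bytes, seq: int, timestamp: int, ssrc: int):
    """Split one SDI frame into RTP datagrams, SMPTE 2022-6 style.

    Simplified sketch: a real sender also inserts the 8-byte HBRMT
    payload header defined by the standard, which is omitted here.
    """
    chunks = [sdi_frame[i:i + MEDIA_PAYLOAD]
              for i in range(0, len(sdi_frame), MEDIA_PAYLOAD)]
    packets = []
    for n, chunk in enumerate(chunks):
        marker = 1 if n == len(chunks) - 1 else 0  # last datagram of the frame
        # RTP header: V=2 / marker + payload type 98 / sequence / timestamp / SSRC
        header = struct.pack("!BBHII",
                             0x80,
                             (marker << 7) | 98,
                             (seq + n) & 0xFFFF,
                             timestamp,
                             ssrc)
        packets.append(header + chunk.ljust(MEDIA_PAYLOAD, b"\x00"))
    return packets
```

The RTP marker bit on the final datagram lets the receiver detect frame boundaries without parsing the SDI payload itself.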

Similar convergence is taking place in the contribution and distribution domains, where broadcasters can now convert ASI (Asynchronous Serial Interface)-based transport streams for delivery over IP links by pushing them through new high-density transport stream gateways. As a result, cross-platform management of content is now possible for a wide range of applications across the whole ecosystem, including contribution, studio-to-studio media exchange, in-house signal distribution and routing, post-production and live event coverage.
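The transport-stream-over-IP side of this convergence is conventionally done by grouping seven 188-byte TS packets into each IP datagram, so that 1,316 bytes of payload fit a standard Ethernet MTU. A minimal sketch of what such a gateway does on the packing side:

```python
TS_PACKET = 188       # fixed MPEG transport stream packet size
TS_PER_DATAGRAM = 7   # 7 x 188 = 1316 bytes, fits a 1500-byte Ethernet MTU


def gateway_datagrams(ts_stream: bytes):
    """Group an MPEG transport stream into IP-sized datagrams, as a
    transport stream gateway does when bridging ASI onto IP links."""
    if len(ts_stream) % TS_PACKET != 0:
        raise ValueError("stream must contain whole 188-byte TS packets")
    if ts_stream and ts_stream[0] != 0x47:
        raise ValueError("missing TS sync byte 0x47")
    step = TS_PACKET * TS_PER_DATAGRAM
    return [ts_stream[i:i + step] for i in range(0, len(ts_stream), step)]
```

Real gateways add sequence numbering and optional forward error correction on top of this grouping so that receivers can detect and recover lost datagrams.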

The most telling point of the paper is its argument that a carefully chosen CMS, fully optimized for the cloud and virtualized hardware, should enable operators to jump to the front of innovation by exploiting their content catalogues for novel direct-to-consumer services, rather than just replicating legacy channel-based pay TV offerings. They should use their content assets to develop new brands designed to appeal to online viewing tastes, ranging from niche audiences to larger-scale services pitched with an on-demand focus. These, SeaChange contends, require a master workflow that supports end-to-end, highly automated operations starting with original production and extending through all aspects of post-production, content and metadata management.

The paper also highlights the virtues, or indeed necessity, of elasticity in the cloud so that services can scale up or down as demand dictates, to cater for transient surges in audiences for live events streamed over the Internet for example. What the paper does not say is that this depends as much on the resources available in the cloud as on the CMS, so that both must be chosen judiciously.
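The scaling logic behind that elasticity can be stated very simply: provision enough instances for the current audience plus headroom for surges, and never fall below a floor that preserves redundancy. The figures below (5,000 concurrent viewers per instance, 20% headroom, minimum of two instances) are illustrative assumptions, not SeaChange's numbers.

```python
import math


def desired_instances(concurrent_viewers: int,
                      per_instance_capacity: int = 5000,
                      minimum: int = 2,
                      headroom: float = 1.2) -> int:
    """Return how many streaming instances to run for the current
    audience, keeping headroom for sudden surges during live events.

    Capacity, headroom and minimum are illustrative assumptions."""
    needed = math.ceil(concurrent_viewers * headroom / per_instance_capacity)
    return max(minimum, needed)
```

A cloud autoscaler evaluating this periodically scales the fleet up as a live event draws viewers and back down afterwards, which is exactly the pay-for-what-you-use behavior the paper is advocating.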

The paper then defines the key ingredients of a modern CMS, which must include support for automated ingestion of content from any source along with aggregation and cataloging of all relevant metadata, both at the time of ingestion and as new data is added over time. This is needed for convenient access by search and recommendation engines, as well as other mechanisms that enable personalized features. Also on the metadata front, the CMS must be able to drive the stream packaging and manifest manipulation mechanisms required to enable delivery of content to every device with personalized features and dynamic ad insertion as needed.
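Manifest manipulation of the kind described is easiest to see with HLS: an ad break is spliced into a media playlist by inserting the ad segments between `EXT-X-DISCONTINUITY` tags. The sketch below shows only the splice; a production manipulator would also rewrite media sequence numbers, honor DVR windows and personalize the result per session, and the segment names are invented for illustration.

```python
def insert_ad_break(playlist: str, ad_uris, ad_duration: float = 6.0,
                    after_segment: int = 2) -> str:
    """Splice an ad break into an HLS media playlist by surrounding
    the ad segments with EXT-X-DISCONTINUITY tags."""
    out, seen = [], 0
    for line in playlist.strip().splitlines():
        out.append(line)
        if line and not line.startswith("#"):  # a segment URI line
            seen += 1
            if seen == after_segment:
                out.append("#EXT-X-DISCONTINUITY")
                for uri in ad_uris:
                    out.append(f"#EXTINF:{ad_duration:.1f},")
                    out.append(uri)
                out.append("#EXT-X-DISCONTINUITY")
    return "\n".join(out)
```

The discontinuity tags tell the player to reset its decoder across the splice, which is what allows ad content encoded with different parameters to be stitched into the main stream.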

The system must also be able to apply quality control and policy enforcement processes to ensure assets are fully compliant with rights restrictions, while automatically making corrections. At the same time, it must be able to enforce the operator's own business rules, for example embracing the various ways content can be configured and categorized to support service models. This includes managing all encoding and transcoding operations across live and file-based content, and engaging mechanisms that support time-shift modes from instant trick-play functions to catch-up and cloud DVR.
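In practice such policy enforcement reduces to running each asset record through a set of checks before it is published. A minimal sketch, where the field names (license window, rendition list) and the rule that every asset needs both HLS and DASH renditions are illustrative assumptions rather than a real CMS schema:

```python
from datetime import datetime, timezone


def check_asset(asset: dict, now=None) -> list:
    """Run simple rights and business-rule checks over an asset record,
    returning a list of problems (empty means compliant)."""
    now = now or datetime.now(timezone.utc)
    problems = []
    # Rights restriction: asset may only be offered inside its license window.
    if not (asset["license_start"] <= now <= asset["license_end"]):
        problems.append("outside licensed rights window")
    # Operator business rule: every published asset needs these renditions.
    for rendition in ("hls", "dash"):
        if rendition not in asset.get("renditions", []):
            problems.append(f"missing {rendition} rendition")
    return problems
```

A CMS would run checks like these automatically at ingest and again whenever rights metadata changes, triggering corrective transcodes or takedowns rather than just reporting.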

Finally, the workflow must of course support and increasingly automate the steps involved in content protection, from conditional access to digital rights management (DRM) and now watermarking.

There is, however, one key CMS function omitted from the paper: video analytics. This will need to be integrated with the CMS so that content creators, curators or distributors can monitor the performance and health of the whole content library, as well as gain insights into user behaviors. It is also needed for content marketing and for tracking Return On Investment (ROI) down to individual items, much as Google Analytics does for online assets.
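At its simplest, per-item ROI tracking means rolling raw playback events up into per-asset metrics that the CMS can surface alongside the catalogue. A sketch of that aggregation step, where the event fields (`asset_id`, `seconds_watched`) are illustrative assumptions:

```python
from collections import defaultdict


def asset_report(events):
    """Aggregate raw playback events into per-asset view counts and
    total watch time, the kind of per-item analytics the article
    argues a CMS needs to integrate."""
    report = defaultdict(lambda: {"views": 0, "watch_seconds": 0.0})
    for event in events:
        entry = report[event["asset_id"]]
        entry["views"] += 1
        entry["watch_seconds"] += event["seconds_watched"]
    return dict(report)
```

Joining output like this with licensing cost per asset is what would let an operator compute ROI down to individual catalogue items.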
