Business fashions change, especially as the media sector embraces new software systems consolidated into data centers. It has been a long journey from analog hardware to microservices in a virtualized environment, one stretching over several decades. The infrastructure underlying a modern playout center would have been unthinkable twenty years ago. New delivery businesses serving the OTT market have led the way in utilizing the latest software architectures. Broadcasters are fast catching up, and the shift from SDI to IP is a big enabler.
I have vague memories of mainframe computers, with the noisy telex terminals that were used for remote access. But my first real exposure to computers was the IBM PC, used for office applications such as word processing and spreadsheets. Around that time media creatives started to embrace the Mac, and the use of computer workstations for non-linear editing began.
Initially the media files were stored on local SCSI drives, direct-attached storage. Over time it made business sense to concentrate file storage in a central area, often referred to as the machine room (named after the tape machines, and where are they now?) or central technical area. Initially the shared storage was a storage-area network (SAN) that used technologies like Fibre Channel for the interconnection fabric.
IP networking has overtaken the SAN, now that Ethernet performance has improved to handle the data rates of video. Storage architectures have evolved to scale-out network-attached storage with an overarching file system that mounts as a single name space. Engineers can manage disk space allocation to individual production projects from an easy-to-use dashboard.
Shared storage to shared processing
However, a central pool of storage is only half the answer; expensive workstations scattered throughout a facility still represent a management overhead for software maintenance. And then there is the question of efficient utilization: an edit bay may lie idle at night while encoding jobs are short of resources.
Over time facilities have been gathering the workstation processing into the central technical area. The edit bays are now connected via KVM, or use thin clients to give a lightweight user interface. Although such centralized processing can be implemented with individual “pizza boxes”, routed to the remote clients as needed, more and more the processing is being virtualized in the manner of a modern data center.
So far, I have hinted at post-production as an example, but playout and multiplatform delivery have also changed. At one time a playout center had racks of computers, each running a specific software application, while the video remained on specialist video servers that guaranteed real-time playback. Advances in disk technology and computer components mean that real-time processing can now be handled on off-the-shelf servers. That opens the door to virtualization, and the abstraction of the software applications from the hardware.
Evolution of Software in Broadcast
The first appearance of software in video equipment was as a control function; the video processing still took place in hardware. The first non-linear editors demonstrated that video could be processed on computer workstations, albeit at proxy resolution and not in real-time. As CPU power increased, the possibility of real-time operation was finally realised. Applications can be multi-threaded using several CPU cores in parallel, and furthermore GPU acceleration adds extra power to video processing applications.
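As a rough sketch of the multi-threading idea described above (the frame data and the per-frame filter here are invented for illustration; a real video application would hand this work to SIMD code or a GPU kernel):

```python
from concurrent.futures import ThreadPoolExecutor

def invert_luma(frame):
    # Hypothetical per-frame operation: invert 8-bit luma values.
    return [255 - px for px in frame]

def process_clip(frames, workers=4):
    # Fan the frames out across a pool of worker threads and gather
    # the results back in their original order, as multi-threaded
    # video applications do across several CPU cores.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(invert_luma, frames))

clip = [[0, 128, 255], [10, 20, 30]]
print(process_clip(clip))  # → [[255, 127, 0], [245, 235, 225]]
```

Each frame is independent of the others, which is what makes video processing such a natural fit for parallel cores and GPU acceleration.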
Virtualization moves the software processing from specific workstations or servers to the data center. Processing now takes place on anonymous CPU cores. Processor blades can be replaced on a rolling basis as they reach end of life, and their purchase cycle is not linked to the purchase of a new software application. The business has the choice of owning the hardware—as an on-premise data center—or it can be outsourced to a cloud service. A hybrid of both methods gives the elasticity often needed for the coverage of occasional large events like sports tournaments or elections.
Once the hardware and the software applications are in a data center, the system becomes more of an enterprise platform than a workgroup tool. Other industries grew naturally with this centralized approach: transactional processing and databases ran on a central mainframe, and those centralized systems have continued to evolve into the virtualized and cloud processing of today.
Many in broadcast or production leadership were first introduced to software when they obtained their own PC. Shown here is the original IBM PC, model 5150, which was introduced August 12, 1981.
From Workgroup to Enterprise
Content post-production, distribution and delivery were once constrained by technology to a workgroup approach, with the enterprise approach restricted to the back-office functions like scheduling, sales and traffic.
Videotape was the “data storage” and SDI the network fabric. The workgroup approach led naturally to small systems-integration projects undertaken at a departmental level. SDI made it easy to add all manner of “widgets” for special tasks. The nature of the content, with the many layers of very different data types—video, audio, subtitles/captions, and metadata—all added complexity.
When the number-crunching moved from hard logic to algorithms on virtualized processors, the very nature of broadcast infrastructure not only changed, it became much simpler. The complexity is hidden in the software.
Once functions are transferred from hardware to software, the application designers have the option to leverage modern software architectures. The service-oriented architecture (SOA) is one, with microservices becoming a popular variant of the SOA. These architectures allow a complex operation to be divided into small services, like the transcoding or resolution transforms that form such a large part of multi-platform delivery preparation.
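To illustrate that decomposition (the service names and job fields below are invented for this sketch, not any vendor's actual API), a delivery-preparation pipeline becomes an ordered chain of small, single-purpose services:

```python
# Each "service" does one narrow task. In a real microservice
# deployment each of these would be a separately deployed process
# reached over the network; here they are plain functions.
def transcode(job):
    job["codec"] = job.pop("target_codec")
    return job

def rescale(job):
    job["resolution"] = job.pop("target_resolution")
    return job

SERVICES = {"transcode": transcode, "rescale": rescale}

def run_pipeline(job, steps):
    # A complex operation is just an ordered list of service calls.
    for step in steps:
        job = SERVICES[step](job)
    return job

job = {"asset": "promo.mxf", "target_codec": "h264",
       "target_resolution": "1280x720"}
print(run_pipeline(job, ["transcode", "rescale"]))
```

The appeal for operators is that each small service can be scaled, replaced, or upgraded independently of the rest of the chain.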
This shift also impacts the business. Capital projects morph into regular software license fees. We already see this in video editing and in playout. The director can be sitting on a beach sipping cocktails, reviewing the latest project being cut on the other side of the world. Oh, that he or she had the time! The reality is that they will be embroiled in the next production, which could well be on the other side of the world. Remote access to a job at a data center opens up more flexible ways to work.
As playout centers feed more OTT distribution than the tower on the hill, it makes sense to put the playout functions in a data center sitting on an Internet hub. The location of the TV network is decoupled from the playout center.
All this adds up to business flexibility. Where facilities are located, and how spending splits between CAPEX and OPEX: such decisions can be made for business reasons, not constrained by technology.
Broadcasting was once a business beholden to, and constrained by, technology. Software infrastructure for video and audio processing, from ingest to delivery, has changed all that. The carriage of video by SDI held the media business back from adopting the software infrastructure available to other verticals. The move to IP has removed many of the impediments to change. The constraints now are different: what bandwidth is available, and how much does it cost? As fiber networks grow and costs fall, that constraint diminishes.
I guess it is time for the broadcast engineer to hang up his tweaker.