The transition to an IP architecture has brought significant changes to command and control and to how systems are monitored and managed. Command and control is the all-encompassing automated set of processes that governs the acquisition, file movement, handling and delivery of media. Monitoring is more than a set of scopes and meters: dashboards and browsers provide the system monitoring tools to manage the handling and quality control of media and metadata across the entire facility.
Automation was the system master control used to connect the traffic system to the playout system. The automation controls all the program source devices, i.e. tape machines and servers, plus routers and master control switchers, to originate programming according to the traffic schedule. As the industry has moved to multi-channel delivery, the requirements for automation systems have become more complicated: the automation now receives multiple instruction sets (commands), i.e. traffic schedules and playlists, and then issues multiple commands to the various devices it controls to play out content. And while automation has evolved to support this, it is now only one element in the overall automation system of an IP broadcast center.
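The schedule-to-device flow can be sketched in a few lines. This is a minimal illustration, not any vendor's protocol: the event fields and the VideoServer class are hypothetical stand-ins for a real traffic schedule and a device's command-and-control interface.

```python
from dataclasses import dataclass

@dataclass
class ScheduleEvent:
    start: str       # scheduled on-air time, HH:MM:SS
    clip_id: str     # house ID of the media file
    duration: str    # running time, HH:MM:SS

class VideoServer:
    """Stand-in for a playout server's control interface."""
    def __init__(self):
        self.log = []          # commands actually issued, in order
    def cue(self, clip_id):
        self.log.append(("CUE", clip_id))
    def play(self, clip_id):
        self.log.append(("PLAY", clip_id))

def run_playlist(events, server):
    # Automation walks the traffic schedule, cueing then rolling each event.
    for event in events:
        server.cue(event.clip_id)
        server.play(event.clip_id)

playlist = [
    ScheduleEvent("18:00:00", "NEWS-OPEN-001", "00:00:30"),
    ScheduleEvent("18:00:30", "PKG-WX-0412", "00:02:15"),
]
server = VideoServer()
run_playlist(playlist, server)
print(server.log[0])  # ('CUE', 'NEWS-OPEN-001')
```

A real system would add timing, as-run logging and error recovery; the point here is simply that one instruction set fans out into many device commands.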
Editing systems control source machines, switchers and mixers. This is one of the primary examples of command and control. The editing system issues a set of commands to the source device each time an element is selected. It issues a different set of commands when the finished program has to be rendered for each delivery format.
Command and control has evolved from serial RS-232/422/485 with GPI/O relay closures to an IP layer in the stream- and file-based architecture that also handles all the management and movement of media.
[Figure: IP Transport Layers. The layered topology of the IP environment, showing command and control as one layer within a single IP transport stream that also carries media, metadata and communications.]
Stream- and file-based technologies and workflows have introduced requirements for broader command and control systems: ones that take control of the entire media lifecycle, from concept to distribution and beyond, with all the processes along the way.
In the file-based broadcast and production environment, every process needs control. Ingest devices need to know when to start recording, which format profiles to use, and where to place the media and metadata once they are created. The production and media management systems need to be notified that the file is ready for use. A media handling process controls the movement of files across the different business and production units and into the different storage areas for production, media management, archive and delivery.
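An ingest trigger along these lines might look like the following sketch. The profile names, job fields and the notify step are illustrative assumptions, not a specific vendor API.

```python
import pathlib

# Hypothetical format profiles an ingest device could be told to use.
INGEST_PROFILES = {
    "broadcast": {"codec": "XDCAM HD422", "wrapper": "MXF OP1a"},
    "proxy":     {"codec": "H.264", "wrapper": "MP4"},
}

def start_ingest(source_id, profile_name, destination):
    # The controller tells the device what to record, in what format,
    # and where to place the resulting media and metadata.
    profile = INGEST_PROFILES[profile_name]
    return {
        "source": source_id,
        "codec": profile["codec"],
        "wrapper": profile["wrapper"],
        "destination": str(destination),
        "status": "recording",
    }

def notify_mam(job):
    # Production and media management are told the file is ready for use.
    job["status"] = "available"
    return job

job = start_ingest("ROUTER-SRC-07", "broadcast",
                   pathlib.PurePosixPath("/media/ingest/2024"))
job = notify_mam(job)
print(job["status"])  # available
```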
Every aspect of the file-based environment is managed by the command and control layer, and each process and device is typically controlled by an automated process. Even when a process is manual, the controller is the dashboard or a separately integrated control system that supports the manual steps and shows the status and movement of files and streams within the entire media management environment.
Orchestration is the new term for an all-encompassing command and control management system. These systems are also called conductors. Conductors provide a unified dashboard to manage the command and control system. They host the rules and policies that manage all the devices and processes.
The conductor is the dashboard that serves as the command and control center. It shows all the active processes, device status and where the files are in the system. The conductor controls the ingest processes and devices, handles media movement, interfaces with media management and controls the master control automation system for delivery. Following the schedules, rules and policies, the conductor triggers processes and tracks their progress, managing priorities and ensuring the continuous flow of media and metadata.
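The rule-driven behavior of a conductor can be reduced to a toy model: rules map an event to the processes it should trigger, and a status table backs the dashboard view. The rule set and process names below are invented for illustration.

```python
# Hypothetical rules: event type -> processes the conductor triggers.
RULES = {
    "file.ingested": ["qc_check", "make_proxy", "register_asset"],
    "qc.passed":     ["move_to_playout"],
    "qc.failed":     ["alert_operator"],
}

class Conductor:
    def __init__(self, rules):
        self.rules = rules
        self.jobs = []          # every triggered process, tracked by status

    def handle(self, event_type, media_id):
        # Apply the policy: each matching rule queues a process.
        for process in self.rules.get(event_type, []):
            self.jobs.append({"process": process,
                              "media": media_id,
                              "status": "queued"})

    def dashboard(self):
        # What an operator sees: every process and its current state.
        return [(j["process"], j["status"]) for j in self.jobs]

conductor = Conductor(RULES)
conductor.handle("file.ingested", "PKG-WX-0412")
conductor.handle("qc.passed", "PKG-WX-0412")
print(conductor.dashboard())
```

A production orchestrator would add priorities, retries and progress tracking, but the rule-to-process mapping is the core idea.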
On a large and unique project I am involved in, there are as many as 17 multi-camera production spaces in operation at the same time daily. Each space produces multi-hour live programming. In each space is a multi-camera robotic system driven by a microphone system. Each time a microphone opens, it triggers a camera preset that is recalled and stabilized prior to being switched to air. The program feeds are all encoded to file, accessed for editing and streamed live to the web in real time. Additionally, metadata is recorded for all productions.
There are numerous processes involved prior to production, and multiple concurrent processes during and after the live events. It is virtually impossible for an operator to manage and control this volume of production manually. The only way is through command and control and multiple automation processes.
A composite schedule for all the production spaces is delivered as XML to the Media Asset Management system, where it is parsed into individual schedules for each encoding chain. The same schedule is passed to the microphone management system so it knows when each production will begin. A separate XML file tells the microphone management system who is assigned to each microphone. Both of these XML files are entered as metadata for each production before it starts. As the production starts, the robotic controller begins tracking the microphones and both systems write metadata. In the Network Operations Center, technicians monitor the command and control systems that enable the production, while also monitoring the signal quality of each space's program output as it passes through a router to the encoder.
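Splitting a composite schedule into per-space playlists is straightforward with a standard XML parser. The element and attribute names below are assumptions for illustration; a real schedule would follow the facility's own schema.

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

# A toy composite schedule covering two production spaces.
COMPOSITE = """\
<schedule>
  <event space="studio-01" start="09:00" title="Morning Session"/>
  <event space="studio-02" start="09:00" title="Panel A"/>
  <event space="studio-01" start="13:00" title="Afternoon Session"/>
</schedule>"""

def split_by_space(xml_text):
    # Parse the composite XML and group events by production space,
    # yielding one playlist per encoding chain.
    root = ET.fromstring(xml_text)
    per_space = defaultdict(list)
    for event in root.iter("event"):
        per_space[event.get("space")].append(
            {"start": event.get("start"), "title": event.get("title")})
    return dict(per_space)

schedules = split_by_space(COMPOSITE)
print(len(schedules["studio-01"]))  # 2
```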
The file is moved from the encoder to the storage environment, where production personnel can browse and access it for real-time editing. As editing packages are completed, they are registered with the asset manager and delivered to the various distribution platforms. There are more processes happening during the live production, and many more downstream once the file is encoded.
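The post-encode handoff can be sketched as a move-and-register step. The registry dict below is a hypothetical stand-in for a real asset manager, and the file names are invented.

```python
import pathlib
import shutil
import tempfile

def deliver(encoded_file, storage_dir, registry):
    # Move the finished file into production storage, then register it
    # with the (stand-in) asset manager so editors can find it.
    dest = pathlib.Path(storage_dir) / encoded_file.name
    shutil.move(str(encoded_file), dest)
    registry[encoded_file.stem] = {"path": str(dest), "state": "editable"}
    return dest

with tempfile.TemporaryDirectory() as tmp:
    tmp = pathlib.Path(tmp)
    encoder_out = tmp / "encoder"; encoder_out.mkdir()
    storage = tmp / "storage"; storage.mkdir()
    clip = encoder_out / "SPACE07-0900.mxf"
    clip.write_bytes(b"\x00" * 16)     # placeholder essence
    registry = {}
    deliver(clip, storage, registry)
    print(registry["SPACE07-0900"]["state"])  # editable
```

In practice this step would be driven by watch folders or the conductor's media-movement rules rather than a direct call.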
This is a clear demonstration of the need for command and control throughout the entire media lifecycle. It would be difficult if not impossible for an operator to manually manage and perform these operations in a timely manner that assures the program gets to air.
In the stream- and file-based ecosystem, processes are triggered, data moves between systems, devices are controlled and media moves throughout the environment, all while metadata drives the media management.
Command and control is a core process within the IP architecture, and dashboards with centralized management and monitoring are essential tools for the engineer.