We begin this miniseries with some history, the basic principles of Master Control, and the evolution of centralcasting.
One of the most interesting aspects of US local TV station history is that many of the new stations signing on in the late 1940s and 1950s were built from scratch by people who had never worked in a TV station. Most pioneering TV engineers and technicians were WW2 veterans trained in electronics, and many were ham radio operators. They were the people with the skills and ingenuity who turned the visions and dreams of pioneering directors and producers into TV and kept it on the air. If a new TV station was built by a radio station, the TV sales, traffic, and engineering systems usually followed the sister radio station's model. To this day, many TV stations use distinct electronics systems, inventory numbering systems, and content workflows invented by the station's pioneers and integrated into station culture.
In some ways, broadcasting has changed little. Local TV stations still broadcast news, entertainment, sports, and public safety information to a specific geographic market. Station pioneers were experts in learning, tweaking, and moving forward with new technologies, even if they had to design and build some of it themselves.
The missions of Master Control, TV studios and control rooms, and local TV newsrooms remain virtually unchanged from the early days of TV. What constantly changes are the people, workflows, electronic systems, and new technologies supporting those missions. When designing or considering new systems in TV stations, understanding the mission is the top priority.
Four Missions of Master Control
The most obvious Master Control (MC) mission is to continuously feed the TV transmitter with commercial and program content from various sources, on one or more channels, following the directions on each station's daily program log. Typically, MC is the venue where content media is delivered and added to the system for playback. Today, commercial content arrives in various digital forms, although much syndicated programming is more easily distributed simultaneously to multiple sites by satellite. Outside digital content can be easily integrated into the local system using servers provided by several well-known TV content distribution companies. These servers give stations the opportunity to QC content and levels before or as it is ingested into the local MC server.
Before automation, local TV MC operations required five or six people: an audio engineer, a video switcher engineer, a videotape engineer who could load, cue, and play four 30-second spots on two VTRs, a film projectionist who built individual commercial break reels with leader and glue, and perhaps a live booth announcer, all on PL headsets, rolling their content as called for on the log, on cue from the video switcher engineer or a director. In larger markets, many of those talented people belonged to unions and made good money. Headset talk among MC crews between station breaks in those days was about how corporate intended to reduce the technical staff to a day person and a night person. They were right. Today, virtually all TV station content is live or from a server, and unattended station operation is normal.
The second MC mission is media management. It once meant tracking where videotape spots and commercial films were located, organized, and stored on shelves, and how content was assembled for daily playback in dub reels and film editing. Today, media management is the same idea on a grand scale with much better controls. In MC, people must monitor the automation system for missing content and find, fix, or adjust the content flow as necessary.
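The missing-content check described above can be sketched in a few lines. This is a minimal illustration, not any vendor's automation API: the function name, the playlist of house numbers, and the flat media directory are all assumptions for the example.

```python
# Hypothetical sketch: flag scheduled playlist entries whose media files
# are not present on the playout server's storage. Real automation systems
# expose this through their own databases and alerting, not a flat folder.

from pathlib import Path

def find_missing_media(playlist, media_dir):
    """Return the house numbers in `playlist` with no matching file in `media_dir`."""
    # Collect file names without extensions, e.g. "SPOT001.mxf" -> "SPOT001".
    available = {p.stem for p in Path(media_dir).iterdir() if p.is_file()}
    return [item for item in playlist if item not in available]
```

An operator-facing tool built on this idea would run the check well ahead of air time, so missing spots can be re-ingested or substituted before the log reaches that event.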
The third MC mission is to switch between the transmitter sources, from server to studio live, to EAS tests, alerts, and graphics as necessary. Typically, the automation system will do that switching work for the operator except in unanticipated situations that require live intervention, such as live news or weather bulletins and crawls.
The fourth MC mission is monitoring, logging and compliance. Most of that work can be done automatically with electronic alert notifications and visual alert cues on multi-monitor displays.
The benefit of centralcasting is the economy of scale. Centralcasting is like edge computing because it is an architecture, not a specific technology. The concept is to eliminate the duplication of people at individual station facilities all consistently performing the same repetitive tasks. Like so many broadcast ecosystems, one system doesn’t fit all requirements. Most broadcast groups have developed custom centralcasting models that work best for them and their systems.
For example, if 20 group stations carry “Wheel of Fortune,” 20 people need to ingest each show in segments and 20 MC operators will play it back on the air. In the centralcasting model, one person in a hub ingests each show once, and one MC operator plays the show back on all 20 stations.
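The staffing arithmetic in the example above can be made explicit. The numbers come straight from the article's "Wheel of Fortune" example; they are an illustration of the scaling, not real staffing data.

```python
# Economy-of-scale illustration using the article's 20-station example.

stations = 20

# Standalone model: every station ingests the show and plays it back itself,
# so each station needs one ingest task and one playback task per show.
standalone_tasks = stations * 2

# Hub (centralcasting) model: one person ingests the show once, and one
# MC operator plays it back to all stations.
hub_tasks = 2

print(standalone_tasks, hub_tasks)  # prints: 40 2
```

The ratio only improves as group size grows, which is why centralcasting appeals most to large station groups.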
Groups that centralcast from hubs don't need a local MC at each station. The necessary MC switching functions for a local station on a hub can easily be provided by a small routing switcher, 12x2 or so, in the studio control room. Groups usually also centralize station traffic and accounting operations, enhancing the savings provided by the centralcasting economy of scale.
Centralcasting from a regional hub is comparable to edge computing because some local content can still come from local servers and local news studios. The difference is that most centralcasting data travels on commercial fiber networks; some groups use the public internet or satellites for backup.