OBX - one of three data-centric OB trucks from Arena
The past year in live event broadcasting has been as much about competition between OB suppliers as between pay TV broadcasters BT Sport and Sky Sports. Making the leap to UHD is one matter, but doing so with a forward-thinking path to IP is quite another. That’s the philosophy driving Arena, which is putting three new OB trucks into action on an IP framework.
The first facility, OBX, entered service in early September for BT Sport’s EPL coverage and will be retained by the broadcaster on a rolling basis. OBY is now working for Sky Sports, primarily covering 80 rugby matches this season, including England’s Autumn Internationals and the European Rugby Champions Cup and Challenge Cup (in HD). A third truck, built to the same specification, is due in early 2017.
“We are entirely data centric,” says Daf Rees, Arena's deputy director of operations. “There's no baseband involved in that workflow, although we do have kit with baseband connections as short-term gateways into the IP world.”
He explains that Arena approached Grass Valley in late summer 2015 to provide a glass-to-glass, IP-based live production solution.
COTS switcher core
Among the requirements the COTS switch had to meet was support for IGMPv2 and IGMPv3 (Internet Group Management Protocol): single IGMP requests must be processed in less than 10 msec, and a minimum of 150 multicast groups per 10GbE physical interface must be supported on the network switch.
Additionally, every physical interface must be capable of simultaneously transmitting and receiving packets at maximum speed (1Gbps, 10Gbps, 40Gbps) without any degradation in performance (i.e. packet drop, jitter, port-to-port latency), regardless of packet size.
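To illustrate what those IGMP joins look like from the receiving side, here is a minimal Python sketch of a receiver subscribing to one multicast group. The group address and port are hypothetical, and a real ST 2022-6 receiver would add RTP depacketisation and timing on top; this only shows the join that the switch must process within that 10 msec budget.

```python
import socket
import struct

def make_membership_request(group: str, iface: str = "0.0.0.0") -> bytes:
    """Pack the ip_mreq structure used with IP_ADD_MEMBERSHIP.
    Setting this socket option is what emits an IGMP membership
    report (a 'join') onto the wire."""
    return struct.pack("4s4s", socket.inet_aton(group), socket.inet_aton(iface))

def join_video_stream(group: str, port: int) -> socket.socket:
    """Open a UDP socket and join one multicast group carrying a stream.
    In an IP plant each camera output, return and monitoring feed is its
    own group, which is why the switch must handle 150+ groups per port."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP,
                    make_membership_request(group))
    return sock

# Example (hypothetical group for one camera stream):
# sock = join_video_stream("239.1.1.10", 50000)
```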
The infrastructure needed to be standards based, which meant embracing standards available at the time (SMPTE ST 2022-6, AES67) while being ready for standards coming down the line (VSF TR-04 and TR-03), to allow implementation of technology from other companies.
“Most importantly OBX is a UHD (2160p60) truck which also needed to be capable of doing 3G 1080p50 and HD 1080i50 and to be future ready for HDR and HFR,” explains Phil Myers, product specialist, Grass Valley. “Just the simple things like being able to reduce the weight of cabling in a truck which is going up and down Britain every week using a large amount of fuel is a bonus, plus [the reduction in cabling] means easier access to rack space onboard.”
The whole truck had to be as close to cost-neutral as possible against an equivalent quad-SDI solution.
It’s not small either. This is a triple expander with 32 camera channels plus 12 replay positions and a large video switcher in the middle of the vehicle to accommodate all the sources and outputs to the host broadcaster.
“The amount of deliverables to the client has more than doubled,” says Myers. “Not just HD with stereo and 5.1: we’re having to do UHD with 5.1, Dolby Atmos and stereo, then make an HD variant of it and a clean version for the world feed, and so on. As with most events, this is not the only OB that turns up, so it needs to cater for incoming feeds, graphics and uplinks. For this we provided a fairly unique solution around the tailboard, which is bidirectional - meaning we can change the direction of the IO paths using the control system.”
For graphics, GV provided Arena with a flyaway external frame which would patch to and from a separate graphics or presentation van.
“And since this is the first of three trucks we ensured we can connect all three by 100GbE rather than large umbilicals of SDI.”
He adds, “When you look at the design of the truck it’s very different to what you’d have in an SDI truck, where there would typically be a core IO and SDI router. Here, we have a small Cisco switch which fans out IP to areas around the truck. Each area has its own connectivity.”
The cameras are LDX 86N with a native UHD sensor and support for Tico compression from the base station. “We connect up the base station over 10Gigs to the network,” explains Myers. “The main output, monitoring output and all camera returns are being moved around in the IP domain. The control system is able to switch the sources back to the cameras as if an SDI router were sat in front of it.”
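The arithmetic behind that 10Gig base station connection is worth spelling out. Assuming 4:2:2 10-bit UHD (20 bits per active pixel) and Tico's roughly 4:1 visually lossless compression - both assumptions, not figures from Arena - a back-of-envelope sketch shows why compression keeps 2160p60 comfortably inside a single 10GbE link:

```python
def video_bitrate_gbps(width, height, fps, bits_per_pixel):
    """Active-video bitrate, ignoring blanking and IP packet overhead."""
    return width * height * fps * bits_per_pixel / 1e9

# UHD 2160p60 at 4:2:2 10-bit = 20 bits per active pixel
uncompressed = video_bitrate_gbps(3840, 2160, 60, 20)  # ~9.95 Gbps, saturating 10GbE
tico_4_to_1 = uncompressed / 4                         # ~2.5 Gbps, leaving room for
                                                       # returns and monitoring
```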
OBX engineering bay
Using GV Nodes
Even though the truck’s infrastructure is IP, four signals still need accommodating in the Kayenne switcher. “We’re working in Two Sample Interleave (2SI) which means we need 4 MEs running in the switcher,” says Myers. “IP connectivity onboard the Kayenne supports Tico.”
“All the Grass Valley devices that generate Tico streams also generate a monitoring stream. The Tico stream can go to the switcher, tailboard and vision mixer, and then 1080p monitoring can go anywhere around the truck, for example to the multiviewers.”
A Kaleido KMX-4911 multiviewer with the GV Node system picks up the IP streams and outputs to monitors of various sizes around the truck's VT and production areas, expandable to an 18x2 multiviewer.
GV’s Gateway cards are the Swiss Army knife for the SDI to IP world. “These allow us to take legacy products into the IP domain and to bring streams back out to the legacy domain to provide connectivity to trucks that aren’t IP ready today,” says Myers.
The cards can run in HD, 3G or UHD mode and, as part of the design process with Arena, GV added new functionality to them: for example, aggregating 2 x 10GbE on the same card and converting Square Division Quad Split (SDQS) to 2SI and back again. The idea behind the conversion is to keep all the signals the same inside the truck.
Most 4K/UHD productions to date have used the square-division method, where the full image is divided into four quadrants. This requires all four HD-SDI images to be correctly timed, otherwise the 4K/UHD picture will look like four separate images. The 2SI method instead sends two consecutive pixels at a time to each of the four sub-images, so every sub-image is a half-resolution version of the full picture.
“IP by nature is Two-Sample Interleave because of the compression, but what we wanted to do was accommodate legacy devices (including EVS XT3 servers) that don’t handle 2SI today. Likewise you would want 2SI coming off the tailboard, so we need to convert the other way, back out to SDQS.”
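The two mappings can be sketched in a few lines of Python. This is a simplified illustration of the pixel routing only (the real SMPTE ST 425-5 2SI mapping also covers interface-level detail), but it shows why SDQS sub-images are quadrants while 2SI sub-images are each a half-resolution copy of the whole frame:

```python
def sdqs_split(img):
    """Square Division Quad Split: each sub-image is one quadrant."""
    h, w = len(img), len(img[0])
    return [
        [row[:w // 2] for row in img[:h // 2]],   # top-left
        [row[w // 2:] for row in img[:h // 2]],   # top-right
        [row[:w // 2] for row in img[h // 2:]],   # bottom-left
        [row[w // 2:] for row in img[h // 2:]],   # bottom-right
    ]

def tsi_split(img):
    """Two-Sample Interleave (simplified): pairs of adjacent samples
    alternate between sub-images, and lines alternate too, so each
    sub-image is a half-resolution version of the whole frame."""
    h, w = len(img), len(img[0])
    subs = [[[None] * (w // 2) for _ in range(h // 2)] for _ in range(4)]
    for y in range(h):
        for x in range(w):
            sub = (y % 2) * 2 + (x // 2) % 2       # which sub-image gets it
            subs[sub][y // 2][2 * (x // 4) + x % 2] = img[y][x]
    return subs
```

Mistime one quadrant of an SDQS signal and one corner of the picture shifts; mistime a 2SI sub-image and the whole frame degrades evenly, which is one reason 2SI is preferred downstream.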
There are a number of GV Nodes (a broadcast-centric IP processing platform) in the truck, predominantly for multiviewing but also to perform vertically accurate switching.
Historically, COTS IP switches have been unable to perform vertically accurate switching like traditional SDI routers. This is especially important in live applications where signals go directly to air, and where routers have traditionally been used as a backup to the production switcher. Switching that happens accurately within the vertical interval is also needed when a router is used for connecting secondary live feeds.
“Vertically accurate switching is a necessity in the broadcast environment,” says Myers. “The video switcher by nature does it, and if it doesn’t you have a problem.”
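A rough calculation shows how narrow that switching window is. Assuming the standard 1125-line total raster for 1080-line formats (SMPTE ST 274), the vertical blanking interval at 50 frames per second works out to around 800 microseconds per frame - the slot in which a clean switch must land:

```python
def vertical_blanking_window_us(total_lines, active_lines, frame_rate):
    """Duration of the vertical blanking interval per frame, in microseconds.
    This is the window in which a 'vertically accurate' switch must occur
    for the cut to be invisible."""
    frame_us = 1_000_000 / frame_rate
    line_us = frame_us / total_lines
    return (total_lines - active_lines) * line_us

# 1080p50: 1125 total lines, 1080 of them active (SMPTE ST 274 raster)
window = vertical_blanking_window_us(1125, 1080, 50)  # ~800 µs per frame
```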
The GV Node distributed platform topology is designed to uplink to aggregation COTS IP switches, using a ‘spine-leaf’ architecture that's typical of modern IT infrastructures, Myers explains. This topology represents a much more scalable and flexible approach than traditional, centralised routing systems, which have encouraged the purchase of larger, more expensive chassis than necessary in order to allow future expansion. This type of distributed topology enables the realisation of a true broadcast-centric data centre model.
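A simple way to reason about such a spine-leaf design is the oversubscription ratio of each leaf: edge-facing bandwidth versus spine-facing uplink bandwidth. The port counts below are hypothetical, chosen only to illustrate a non-blocking leaf rather than to describe OBX itself:

```python
def leaf_oversubscription(downlinks, down_gbps, uplinks, up_gbps):
    """Ratio of edge-facing to spine-facing bandwidth on one leaf switch.
    A value of 1.0 or lower is non-blocking: every edge device can burst
    at line rate simultaneously without the uplinks dropping packets."""
    return (downlinks * down_gbps) / (uplinks * up_gbps)

# Hypothetical leaf: 16 x 10GbE edge ports uplinked by 4 x 40GbE to the spine
ratio = leaf_oversubscription(16, 10, 4, 40)  # 1.0 - non-blocking
```

Scaling then means adding leaves (and spine ports) rather than buying an oversized central chassis up front, which is the flexibility the article describes.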
All seeing eye
The control system is perhaps the single most important piece of kit in the truck. It’s the “all seeing eye”, says Myers, and in this case a Lawo Virtual Studio Manager (VSM) does the job. It controls all GV products and also interfaces with other kit, such as a Calrec audio desk. Through the MCP 450 panel, the VSM can resubscribe the streams for the camera returns and control intercom and volume. It also integrates with GV multiviewers for tally, UMDs and any change of layout required by production teams, and even provides control to crew via a tablet. GV wrote its own API to interface its equipment with the Lawo system.
Cisco fibre patch in OBX
“When you come to choose an IP switch, don’t assume you can buy the right one off the shelf,” he advises. “Enterprise switches need specific adjustments for the broadcast space.”
The Cisco Nexus 92762Q Spine housed in OBX uses Cisco’s non-blocking multicast algorithm which is designed for professional media.
“Essentially this system prevents you from oversubscribing all the edge devices like the LDX cameras and the Kayenne,” says Myers. “It doesn’t get much bigger than this. We have 48 40GbE ports connecting all of the GV Nodes and 230 10Gig ports connecting all the edge devices.”