In the previous article in our three-part series, we explored the advantages of SDI and how 12G-SDI fits into broadcast facilities. In this article, we investigate the applications where SDI excels.
Broadcasters looking to move to 4K have, until recently, relied on quad-link 3G-SDI, as the bit rate for 4Kp60 is just under 12Gb/s. The bulk and weight of these four-cable bundles, however, make them difficult to install in broadcast facilities and outside broadcast vehicles.
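That "just under 12Gb/s" figure can be reproduced with some back-of-the-envelope arithmetic. The sketch below (in Python, with our own helper names) uses the standard raster totals including blanking, 2,200 samples per line and 1,125 total lines for 1080p60, with 4K being exactly four times that; it is an illustration of the calculation, not an official formula from the standard.

```python
# Rough sanity check of SDI payload rates for 10-bit 4:2:2 video.
# Totals include horizontal and vertical blanking (1080p60: 2200 x 1125).
def sdi_rate_gbps(samples_per_line, total_lines, fps, bits=10, channels=2):
    # channels = 2 data streams: luma (Y) plus multiplexed chroma (Cb/Cr)
    return samples_per_line * total_lines * fps * channels * bits / 1e9

rate_1080p60 = sdi_rate_gbps(2200, 1125, 60)   # 3G-SDI: 2.97 Gb/s
rate_4kp60 = 4 * rate_1080p60                  # 12G-SDI: 11.88 Gb/s
print(rate_1080p60, rate_4kp60)
```

The 4Kp60 result, 11.88Gb/s, is exactly the 12G-SDI line rate, which is why a single 12G-SDI cable can replace a quad-link 3G-SDI bundle.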
Moving to 12G-SDI reduces the cabling to a single point-to-point connection, and with fiber solutions the connectivity becomes even easier. However, few broadcasters have the opportunity to work on brand-new greenfield installations, so many facilities are developed piecemeal to meet ongoing program-maker requirements.
As formats are advancing so quickly, reliable and safe integration is becoming more of a necessity. Expensive kit procured only a few years ago still has a long service life and broadcasters simply cannot afford to throw away perfectly good cameras and vision switchers because a new version of SDI has appeared.
There is, however, a compelling reason to switch to 12G-SDI for many broadcasters, especially those working with outside broadcast vehicles. The weight saving of going from four coaxial cables to one for each link is considerable. Furthermore, the number of infrastructure components such as routers and distribution amplifiers (DAs) falls, as only one connection is required per link. The savings in cable and equipment soon add up, and freeing up axle weight on a truck allows broadcasters either to install more facilities or to move to a smaller truck.
When designing 3G-, 6G-, and 12G-SDI, SMPTE future-proofed the specifications and hence the design of many facilities. This upgrade path and backwards compatibility have allowed vendors to provide converter boxes that interface quad-link 3G-SDI to a single 12G-SDI connection, making the upgrade path much easier and more cost effective.
Figure 1 – ST-2082 provides the option of 4:4:4 color sampling, that is, no chroma subsampling. Although the human visual system will tolerate chroma bandwidth at half the luma bandwidth, 4:4:4 processing is essential in post-production to keep quality as high as possible.
Low Latency Demands
Live events demand low latency: fast, action-packed sports rely on sharing the emotion of the game with the audience as well as showing what is going on. With SDI, delay is kept to an absolute minimum because SDI is a synchronous distribution system, meaning signals experience the least possible delay from the camera to the vision switcher. Large buffers are simply not required as there is no packetized data to re-order or re-time.
Furthermore, the near-instantaneous plug-and-play nature of SDI makes rigging live events much easier and safer. As SDI has had over thirty years to mature, cable designs and equipment reliability are probably as high as they have ever been. This further reduces complexity, making multi-million-dollar productions very reliable.
The learning curve for SDI is relatively low. Many engineers and technologists have grown up with SDI and understand it completely. The measurement and diagnosis tools are familiar, and the constraints of SDI are well known and documented. Essentially, the risk is very low.
Live sports events are well known for driving technology to its limits. Ever demanding audiences not only want the best pictures and sound possible, but they want more commentary and interviews. In today’s high energy sports events there is no down time.
Co-timed Video and Audio
Keeping video and audio in-sync is more challenging now than ever. There’s a miniature camera for nearly every imaginable application, all designed to help bring the ambiance and emotion of a live event home to the viewer. Unobtrusive, these devices can deliver stunning pictures. However, they generally don’t have broadcast connectivity and need to run through a frame-synchronizer to provide an SDI interface and relevant timing.
Although video delay of a few frames is expected, the audio delay must match it or cumulative errors could manifest. One of the advantages of 12G-SDI, and SDI in general, is that the latency is low, but more importantly, it is predictable. The transmission time through an SDI interface is fixed due to the synchronous nature of the system.
As well as being able to embed the audio into an SDI stream, the predictability of the transmission path makes it much easier for a sound engineer to select the correct audio delay to match the video synchronizer and transmission paths. This is even more important when the program output and isolated feeds are transmitted back to the studio or playout facility. Having the video and audio arrive at the compression system co-timed helps remove potential timing anomalies.
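Because the video delay is a known, fixed number of frames, the matching audio delay is a simple calculation. The illustrative snippet below (the function name and scenario are ours, not from any broadcast product) converts a frame-synchronizer delay into the millisecond offset a sound engineer would dial in.

```python
# Illustrative: convert a known video pipeline delay (in frames) into the
# matching audio delay for a given frame rate.
def audio_delay_ms(video_delay_frames, frame_rate):
    # one frame lasts 1000/frame_rate milliseconds
    return video_delay_frames * 1000.0 / frame_rate

# e.g. a minicam path through a frame synchronizer adding 2 frames at 50p
print(audio_delay_ms(2, 50))   # 40.0 ms
```

In a packetized system with variable buffering, this number would drift; the fixed, synchronous nature of SDI is what makes a single static delay setting sufficient.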
HDR is gaining greater prominence in live sports events to meet the needs of the ever-demanding viewing audiences. Combined with WCG (Wide Color Gamut) and much more detail in the shadows and highlights, HDR is the obvious choice for stadium events. Shaders have many more options when trying to balance the bright sunlight on the playing field against the harsh shadows cast by the stadium.
However, the vast majority of viewers are still using SDR televisions, and although the future for HDR looks very promising, broadcasters must be realistic and still provide an SDR feed for these viewers. This usually implies a dual transmission path at some point, where the SDR feed must be derived from the HDR feed (or the other way around if that is what the system designers have built).
Key to building reliable HDR systems is understanding the workflow requirements and keeping latency low. Again, simplicity and minimal delay are absolute necessities, especially for live events. Real-time processing is required with the smallest delay possible and the synchronous nature of SDI lends itself well to these applications.
As more demands are placed on HDR infrastructures, it’s inevitable that frame accurate signaling and switching is going to be required for localization and sponsor opt-outs. Embedded data in the auxiliary SDI stream will further support this requirement.
Although broadcasters are familiar with 10-bit SDI systems, 12G-SDI provides future proofing: ST-2082 remains backwards compatible with 10-bit samples while also making provision for 12-bit samples. This will further enhance HDR images, especially as we move beyond 4K to 8K. As screens improve and become physically bigger, banding and aliasing not seen on smaller screens will become more obvious, and moving to 12 bits will help alleviate this.
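The benefit of the extra two bits is easy to quantify. A quick sketch (our own illustration, not part of any standard):

```python
# Quantization comparison: code values available at 10-bit versus 12-bit
# over the same signal range.
def code_values(bits):
    return 2 ** bits

print(code_values(10), code_values(12))  # 1024 vs 4096 levels
# Four times as many levels means quantization steps a quarter of the
# size, which is what helps push banding below visibility on large,
# bright HDR screens.
```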
HDR 12-Bit Video Future Proofing
Through quad-link 12G-SDI, bit rates of up to 47.52Gbit/s are available, enough to carry 10-bit 8Kp60 or 12-bit 8Kp30 signals. Again, SDI, and specifically 12G-SDI, provides an upgrade path: should a broadcaster want to offer an 8K service, an 8K facility can be built using quad-link 12G-SDI.
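The quad-link figure can be checked the same way as the single-link rates, using 8K raster totals including blanking (8,800 samples per line by 4,500 total lines, i.e. double the 4K totals in each direction). This is a back-of-the-envelope sketch with our own helper names, not an excerpt from the standard.

```python
# Quad-link 12G-SDI capacity versus approximate 8K payload needs.
LINK_12G = 11.88            # Gb/s per 12G-SDI link
quad = 4 * LINK_12G         # 47.52 Gb/s total

def raster_rate_gbps(samples_per_line, total_lines, fps, bits, channels=2):
    # channels = 2 data streams: luma (Y) plus multiplexed chroma (Cb/Cr)
    return samples_per_line * total_lines * fps * channels * bits / 1e9

rate_8kp60_10bit = raster_rate_gbps(8800, 4500, 60, 10)  # 47.52 Gb/s
rate_8kp30_12bit = raster_rate_gbps(8800, 4500, 30, 12)  # 28.512 Gb/s
print(quad, rate_8kp60_10bit, rate_8kp30_12bit)
```

10-bit 8Kp60 exactly fills the four links, while 12-bit 8Kp30 fits comfortably within them, matching the two formats quoted above.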
Figure 2 – 12G-SDI infrastructure cabling is backwards compatible with earlier versions of SDI to facilitate better integration.
Higher color fidelity is also available through the provision of 4:4:4 sampling. This gives much higher color accuracy than the familiar 4:2:2 chroma-subsampled system, so colors can be better represented and rendered, especially in post-production.
Our human visual system may only need chroma bandwidth to be half that of the luma, but processing at full chroma bandwidth using 4:4:4 helps reduce potential artifacts and concatenation errors. Again, these are more obvious on large screens, so we should do everything we can to keep image quality and accuracy as high as possible.
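The data cost of carrying full-bandwidth chroma is straightforward to express. A small illustration (our own, using the per-pixel sample counts implied by the sampling notation):

```python
# Data-volume cost of full-bandwidth chroma: samples carried per pixel.
def samples_per_pixel(scheme):
    # 4:2:2 -> Y on every pixel, Cb/Cr on alternate pixels (2 per pixel);
    # 4:4:4 -> Y, Cb, and Cr on every pixel (3 per pixel)
    return {"4:2:2": 2, "4:4:4": 3}[scheme]

ratio = samples_per_pixel("4:4:4") / samples_per_pixel("4:2:2")
print(ratio)  # 1.5 - i.e. 50% more data for full color resolution
```

That 50% overhead is the trade-off a facility accepts for the cleaner chroma keying and grading that 4:4:4 allows in post-production.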
SDI has stood the test of time and has gone through many iterations to reach the current 12G-SDI SMPTE ST-2082 standard. It is well understood, is already plug-and-play, and delivers simplicity.