Fiber optics is an increasingly important technology in media facilities because of its wide bandwidth and immunity from interference.
Engineers facing the need to upgrade a facility have a choice: copper or fiber. There may be good reasons to choose either. However, as this tutorial reveals, the key benefit of installing fiber optic links is their wide bandwidth, which makes fiber nearly future-proof with regard to new formats.
This is the second article in a series of two addressing the need to future-proof broadcast plants. The first article introduced basic requirements and concepts, including the clear advantages of fiber optic cabling over coax or CAT-x connectivity to protect signal paths from having to be upgraded to handle any future uncompressed video formats. You can read that article here.
This tutorial provides an overview of fiber optics both within and between broadcast plants. Fiber optic networking is under continuous development and enhancements in a broad and deep domain. In this article, we will focus on the aspects of fiber optics pertinent to broadcast plants, specifically Passive Optical Networks (PON).
A PON is a fiber optic network where there are no terminations nor optically active devices between optical transmitters at one end of a path and receivers at the other end.
Multi-mode and single-mode fiber optics
Fiber optic cables provide transport of signals and come in two basic flavors: multi-mode and single-mode. Somewhat counter-intuitively, multi-mode optical fibers carry one or two closely-related optical wavelengths while single-mode optical fibers carry one to many optical wavelengths within a broad band.
Optical fibers have a clear advantage over copper: their lack of electrical conductivity prevents ground loops and blocks the transmission of voltage spikes or other electrical anomalies. The two most significant disadvantages of optical fibers compared to copper are the inability to support Power-over-Ethernet (PoE) scenarios and the need to perform electrical-to-optical conversions when connecting to most broadcasting gear in use today.
Single mode fiber optic cable with connectors.
Multi-mode fiber strands have cores of roughly 50 to 62.5 micrometers in diameter and are usually illuminated by light-emitting diodes. Compared to single-mode, multi-mode fibers exhibit higher attenuation and modal dispersion, so they are only usable for short runs. Multi-mode fibers are beyond the scope of this article.
Single-mode fiber optic cables have cores between 7 and 11 micrometers in diameter and generally require laser transmitters. Compared to multi-mode fibers, single-mode fibers provide wide bandwidth at any wavelength between 1261 and 1621 nanometers (nm). Because frequency is inversely proportional to wavelength, the resulting frequency band runs from about 185 to 238 Terahertz (THz). One thousand (1000) GHz is one Terahertz.
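The wavelength-to-frequency conversion can be sketched in a few lines of Python (a minimal check: the speed of light divided by the wavelength; figures are rounded):

```python
# Optical frequency from wavelength: f = c / lambda
C_M_PER_S = 299_792_458  # speed of light in a vacuum

def wavelength_nm_to_thz(wavelength_nm: float) -> float:
    """Return the optical frequency in THz for a wavelength given in nm."""
    return C_M_PER_S / (wavelength_nm * 1e-9) / 1e12

# Edges of the single-mode band cited above (1261-1621 nm):
print(round(wavelength_nm_to_thz(1621), 1))  # ~184.9 THz
print(round(wavelength_nm_to_thz(1261), 1))  # ~237.7 THz
```

Note that the longer wavelength maps to the lower frequency, which is why the band edges swap order between the two units.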
Without the use of repeaters or amplifiers, single-mode fibers can relay signals without degradation at least 10 miles and up to 50 miles, depending on the wavelength(s) in use.
Unlike coaxial cabling, where proprietary standards predominate, most characteristics and measurements of fiber optic cabling are standardized. At the topmost level are International Telecommunication Union (ITU) standards. Some aspects of fiber optic cables beyond optical characteristics are standardized by the Telecommunications Industry Association (TIA). The Society of Motion Picture and Television Engineers (SMPTE) has developed and published standards, compatible with ITU and TIA standards, for the use of fiber optics within and between broadcast plants. Manufacturers of fiber optic cables are free to decide how many fiber strands to run in a cable and the color or colors of the outer jacket, among other criteria.
The ITU has promulgated several optical fiber bands based on wavelength. Table 1 presents the ITU fiber bands relevant to broadcast plants. Until the last decade, the inadvertent introduction of hydroxyl (water) contamination into fiber during manufacture prevented the use of the ITU E-Band. Current single-mode fiber cables have minimized or eliminated this constraint.
As with copper connectivity, optical signals attenuate over distance, and the rate of attenuation is wavelength-dependent, ranging from about 0.2 dB to 1.0 dB per km.
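Since loss accumulates linearly with distance, a quick sense check is straightforward (a minimal sketch; the per-km figures are hypothetical values within the range above):

```python
# Fiber attenuation accumulates linearly with run length:
# total_loss_dB = attenuation_per_km * length_km
def fiber_loss_db(atten_db_per_km: float, length_km: float) -> float:
    """Total fiber loss in dB for a run of the given length."""
    return atten_db_per_km * length_km

# A 10 km run at 0.25 dB/km loses 2.5 dB; at 1.0 dB/km it loses 10 dB.
print(fiber_loss_db(0.25, 10))  # 2.5
print(fiber_loss_db(1.0, 10))   # 10.0
```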
For our purposes, optical signals are multiplexed using wavelength division multiplexing (WDM), a technology analogous to frequency division multiplexing in radio frequency transmission. Two distinct approaches to wavelength division multiplexing are applicable to the needs of passive optical networks.
Coarse Wavelength Division Multiplexing
Coarse Wavelength Division Multiplexing (CWDM) enables up to 18 separate signals to be optically multiplexed in a single-mode fiber. Each wavelength is unidirectional and can only be used once in a path. CWDM channels are spaced 20 nm apart, ranging from 1271 to 1611 nm on center. At the current state of the art, each wavelength carries up to 10 Gb/sec. By employing all 18 available wavelengths, total throughput of up to 180 Gb/sec is possible in a fiber path using CWDM. Full duplex communication is accomplished using two wavelengths, each with a transmitter at the opposite end of the path.
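The 18-channel grid described above can be generated directly from the 20 nm spacing (a sketch; channel ordering here is simply positional):

```python
# The 18-wavelength CWDM grid: 1271-1611 nm in 20 nm steps
CWDM_CHANNELS_NM = [1271 + 20 * n for n in range(18)]

print(CWDM_CHANNELS_NM[0], CWDM_CHANNELS_NM[-1])   # 1271 1611
print(len(CWDM_CHANNELS_NM))                       # 18

# At 10 Gb/sec per wavelength, a fully lit fiber carries:
print(len(CWDM_CHANNELS_NM) * 10, "Gb/sec")        # 180 Gb/sec
```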
In the context of broadcast plants, one wavelength can carry one or two standardized HD-SDI or 3G media streams or one 6G media stream. To carry one 12G or two 6G streams, two separate wavelengths are required, but SMPTE has yet to publish the required standards. Were maximum bandwidth per CWDM wavelength to increase to 12 Gb/sec or higher, CWDM could become aligned with SDI standards that are multiples of 1.485 Gb/sec.
Because CWDM optical transmitters are not temperature stabilized, they tend to be inexpensive and easier to deploy compared to Dense Wavelength Division Multiplexing (DWDM).
Dense Wavelength Division Multiplexing
Dense Wavelength Division Multiplexing (DWDM) enables 40 or more separate signals to be optically multiplexed per ITU fiber optic band. Referring back to Table 1, there are five separate frequency bands within the wavelength range useful for CWDM. DWDM carriers are specified by frequency, generally in GHz, with spacing of 25, 50 or 100 GHz.
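Because the DWDM grid is defined by frequency rather than wavelength, channels sit at a fixed anchor frequency plus integer multiples of the spacing (a sketch assuming the common 193.1 THz grid anchor from ITU-T G.694.1):

```python
# DWDM channel frequencies: anchor + n * spacing (n may be negative)
ANCHOR_THZ = 193.1  # grid anchor per ITU-T G.694.1

def dwdm_channel_thz(n: int, spacing_ghz: float = 100.0) -> float:
    """Frequency in THz of grid channel n at the given spacing."""
    return round(ANCHOR_THZ + n * spacing_ghz / 1000.0, 4)

# With 100 GHz spacing, adjacent channels sit 0.1 THz apart:
print(dwdm_channel_thz(0))    # 193.1
print(dwdm_channel_thz(1))    # 193.2
print(dwdm_channel_thz(-1))   # 193.0
```

Halving the spacing to 50 GHz doubles the channel count in the same band, which is how DWDM packs far more carriers than CWDM into a comparable wavelength range.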
At the current state of the art, DWDM can convey up to 100 Gb/sec per wavelength. With 40 wavelengths, DWDM can carry 4 Tb/sec, about ten times the bandwidth CWDM provides in a wavelength band. By reserving one or more CWDM wavelengths for future DWDM usage, the greater bandwidth afforded by DWDM can later be deployed without disrupting any CWDM signal paths.
The optical transmitters and receivers employed in DWDM are generally temperature stabilized to achieve the tighter frequency tolerance required. DWDM endpoints tend to be substantially more expensive than their CWDM counterparts, and DWDM installations, which remain incompatible with most current electrical interfaces and implementations, are significantly more complex than those using CWDM.
Passive Optical Multiplexers
In a passive optical network, multiplexing and demultiplexing are performed by a passive optical multiplexer/demultiplexer (POM). POMs are made of materials with low optical losses, so they impose insertion losses of a few decibels or less. All wavelengths inserted into a POM are present at all ports. Signal collisions are prevented by ensuring that no two transmitters operate on the same wavelength.
POMs must be installed at both ends of a fiber path if more wavelengths are present in the path than the receivers can handle. (QSFP+ receivers and transmitters employ four separate wavelengths.) POMs can be cascaded at the cost of increased insertion loss.
In situations where one or more wavelengths need to be split off from a fiber run, prisms and dichroic filters are employed, inducing additional attenuation of the optical signals. Because these devices are all passive, their mean time between failures tends toward infinity.
Some devices permit fiber optic cables to be inserted directly, but common approaches require the cable to sport a fiber optic connector.
The following connectors are in common use.
- The SC connector (TIA-568-A; IEC 61754-4) is a 2.5 mm ferrule plug.
- The ST connector (AT&T trademark) is a spring-loaded connector with a 2.5 mm ferrule plug. Properly seating an ST connector might require several attempts.
- The LC connector (“Lucent connector”; IEC 61754-20) is used in many SFP+s/QSFP+s due to the small size (1.25 mm) of the ferrule connector plug. For SFP+/QSFP+ transceivers, two LC connectors are easily joined with a small plastic clip to simplify insertion and removal.
While fiber optics offers a range of connectivity options, only three are of primary concern in broadcast and production facilities.
Due to the need for optical multiplexers in any fiber path employing multiple wavelengths, passive fiber patch panels are of limited utility. Patch points increase the system attenuation budget but do not permit separating links by wavelength.
The underlying concept behind patch panels holds that each circuit carries just one signal, which is not a safe assumption with fiber optic networks. Integrating POMs or electrical to optical converters within patch panels has yet to take hold. Perhaps these oversights will become key features of optical software defined networks of the future.
Optical networks, like copper networks, induce attenuation based on the composition and number of devices in the network, the quality of the connections and any splices, and the length and characteristics of the optical fibers. Broadcast plants, where cable runs are seldom longer than a few hundred meters, are unlikely to induce sufficient attenuation to prevent fiber receivers from receiving error-free signals.
Calculations for attenuation losses in fiber optic paths closely follow the model used in RF or electrical paths. Figure 2 is a diagram of a simple fiber optic network with two transmitters and two receivers, two optical multiplexers, a fiber optic link and associated connection points. Table 2 is an example attenuation budget for such a network. By ensuring that the transmitted signal level, less induced attenuation (including signal margin), is higher than the level each receiver needs, reliable service is assured. Keep in mind that transmitters might become less powerful over time and receivers might become less sensitive.
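The budgeting procedure described above amounts to a simple sum; the sketch below uses hypothetical loss and power figures, not values from Table 2:

```python
# Optical link budget: transmit power minus all losses (including a
# design margin) must stay above the receiver's sensitivity.
TX_POWER_DBM = -1.0         # hypothetical transmitter output power
RX_SENSITIVITY_DBM = -18.0  # hypothetical receiver sensitivity

LOSSES_DB = {
    "mux insertion (POM)": 2.5,
    "demux insertion (POM)": 2.5,
    "connectors (4 x 0.3 dB)": 1.2,
    "fiber (0.35 dB/km x 2 km)": 0.7,
    "design margin": 3.0,   # headroom for aging lasers and receivers
}

received_dbm = TX_POWER_DBM - sum(LOSSES_DB.values())
print(f"Level at receiver (after margin): {received_dbm:.1f} dBm")
print("Link closes" if received_dbm >= RX_SENSITIVITY_DBM else "Link fails")
```

With these figures the receiver sees about -10.9 dBm, comfortably above the assumed -18 dBm sensitivity; adding patch points or cascaded POMs simply adds entries to the loss table.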
The bottom line
By incrementally adding fiber optic cabling to broadcast plants when new cable runs are needed, broadcasters can start to break free of the bondage of today’s interfaces. Fiber networking does not have the limitations coax or twisted-pair cabling impose on video formats or path length. In light of the need to support higher speed interconnects, higher resolution images and data-center designs, fiber optics is today’s solution for tomorrow’s challenges.
John Willkie is a former broadcast engineer, systems integrator and now consultant based in San Diego, CA.