Sony HDC-5500 field camera.
Fox Sports’ broadcast of the 2022 World Series was a real eye-opener, and not just because the Houston Astros handily defeated the Philadelphia Phillies in six games. After all, for those in the know, the Astros were the heavily favored team, so the outcome shouldn’t have been too surprising. What did surprise and even amaze many broadcast professionals was the quality of the images from the center-field cameras.
When SD ruled the roost, it hardly mattered how lenses performed: the broadcaster’s low-resolution images masked all but the most egregious lens defects.
During the games, upwards of 40 cameras in total were trained on the field, including a venerable array of DirtCams, on-field MoVIs, a Flycam, and Super-Mos. It was, however, the bright, brilliant images from the manned Sony HDC-5500 center-field cameras that impressed veteran observers the most. At 400 feet (120m) from home plate, the 100X+ magnified images retained optimal sharpness and contrast to the corners of the frame with no loss of resolution or focus. This is no small achievement given the wickedly long zoom lenses commonly used by broadcasters these days at major sporting events, including the World Series.
The laws of physics still apply, however, and DOPs have long had to contend with the shortcomings of long zooms. Breathing of focus, ramping, loss of contrast, and darkening toward the corners of the frame have been the nemeses of DOPs for decades; the demand for such optics is so great, particularly in sports, that lens designers are willing to sacrifice some image quality in exchange for the greater reach.
Historically, zoom lenses have comprised two main element groups, one for focus and one for magnification. The two sections worked together via a system of gears and cams to provide the desired increase in magnification while maintaining, theoretically at least, a constant field size.
With the advent of HDR and 8K resolution, the shortcomings of long zoom lenses can be glaringly obvious. While subtle at first glance, the defects are often amplified, and thus rendered more objectionable, downstream, after encoding and subsequent decoding in the viewer’s TV. Chromatic Aberration Compensation and other LUT-based software schemes can help reduce the visibility of these artifacts during original image capture and processing.
For DOPs who came of age in the 1970s and 1980s, the breathing issues we routinely faced were not confined to chronic lung problems or obscene phone calls. I recall not-too-fondly the breathy performance of my Angénieux 12–150mm f/2.8 zoom on news magazine shows like 60 Minutes. While the 16mm format and standard-definition TV concealed many of the lens’s egregious shortcomings, today’s higher-definition formats offer no such refuge. Shooting in 4K and 6K enables the capture of pictures with exquisite detail, and, sadly, more obvious and objectionable lens defects.
Over the last 40 years, DOPs have seen significant advances in the mechanical design of zoom lenses, incorporating, in addition to the two main groups, additional elements that move only incrementally to account for breathing of focus, barrel distortion, and other deficiencies. Most DOPs recognize, however, that chromatic aberrations (CA) are the most serious challenge facing lens designers, and indeed to most DOPs, CA is the main reason that cheap lenses tend to look cheap.
While improved lens coatings have helped enormously with respect to performance, the introduction of on-board Chromatic Aberration Compensation (CAC) has also played a key role as lens manufacturers seek a cost-effective way to reduce the visibility of CA. Derived from the average CA observed in multiple lenses of the same type, a generalized formula like CAC can only go so far, and may not be effective or even relevant to a particular lens in actual working conditions.
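To make the idea concrete, here is a minimal sketch, in Python with NumPy, of how a factory-derived CAC-style correction might work: lateral CA shifts the red and blue channels radially relative to green, so a simple compensator resamples those channels with a per-channel radial scale factor. The function name and the scale constants here are hypothetical stand-ins for a lens's averaged factory measurements; a real CAC implementation uses per-lens calibration data and proper sub-pixel interpolation.

```python
import numpy as np

def correct_lateral_ca(img, k_red=1.0005, k_blue=0.9995):
    """Illustrative lateral chromatic-aberration correction.

    img    : HxWx3 float image (R, G, B).
    k_red  : hypothetical radial scale factor for the red channel.
    k_blue : hypothetical radial scale factor for the blue channel.

    Each affected channel is resampled at coordinates rescaled about
    the optical center, pulling its magnification back in line with
    green. Nearest-neighbour sampling is used for brevity.
    """
    h, w, _ = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
    out = img.copy()
    for ch, k in ((0, k_red), (2, k_blue)):
        sy = np.clip(np.rint(cy + (ys - cy) * k), 0, h - 1).astype(int)
        sx = np.clip(np.rint(cx + (xs - cx) * k), 0, w - 1).astype(int)
        out[..., ch] = img[sy, sx, ch]
    return out
```

Because the correction is derived from an average over many lens samples, the same pair of constants is applied to every unit of that lens model, which is exactly why a generalized scheme can fall short on any individual lens.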
Still, the die is cast, and by now the notion of on-board compensating software to enhance lens performance is a foregone conclusion. Manufacturers like Canon and Fujinon appear keen to develop innovative new ways to boost the quality of the captured image without adding unnecessary elements or contributing further to the complexity of the lens and its concomitant cost.
The World Series broadcast delivered on the promise of an in-camera LUT-based solution in a major way. Utilizing Sony’s ARIA (Automatic Restoration of Illumination Attenuation) system, the distant center-field HDC-5500s dynamically applied the desired software correction in response to the actual lens settings, magnification, and operating environment. Applied most notably at the long end of the zoom, beyond 90X, the expected ramping, loss of stop, and light loss toward the corners of the frame were effectively addressed and ameliorated in-camera during image processing. The ARIA compensating system works with specified Canon and Fujinon lenses, and indeed a mix of both manufacturers’ optics was used for the World Series broadcast.
The ARIA function, also found in Sony’s HDC-3500, HDC-3100, HDC-3170, and HDC-P50 camera models, compensates during image processing for common defects and aberrations inherent to long zoom lenses beyond 90X.
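Sony has not published ARIA’s internals, but the general idea of zoom-aware falloff compensation can be sketched: because the camera knows the lens’s current zoom position, it can apply a radial gain that brightens the corners more aggressively as magnification (and hence vignetting and ramping loss) increases. The Python/NumPy function below is an illustration of that concept only, not Sony’s algorithm; the quadratic gain model, the 90X normalization, and the gain cap are all assumptions.

```python
import numpy as np

def compensate_corner_falloff(img, zoom_ratio, max_gain=1.6):
    """Sketch of zoom-aware corner-falloff compensation.

    img        : HxWx3 float image with values in [0, 1].
    zoom_ratio : current lens magnification (e.g. 90 for 90X).
    max_gain   : assumed cap on corner gain at full strength.

    A radial gain, 1.0 at the center and rising quadratically toward
    the corners, is scaled by how far into the zoom range we are.
    """
    h, w = img.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.mgrid[0:h, 0:w]
    # Normalized radius: 0 at center, sqrt(2) at the extreme corners.
    r = np.hypot((ys - cy) / cy, (xs - cx) / cx)
    # Compensation strength ramps up with zoom, saturating at 90X.
    strength = min(1.0, zoom_ratio / 90.0)
    gain = 1.0 + strength * (max_gain - 1.0) * (r / np.sqrt(2.0)) ** 2
    return np.clip(img * gain[..., None], 0.0, 1.0)
```

The key point the sketch captures is the coupling between lens state and correction: unlike a fixed LUT, the gain applied to any pixel changes dynamically as the operator racks the zoom.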
For over a century, DOPs have understood that what viewers ultimately see on screen is the quality of the optics. With the advent of on-board image-enhancing strategies like ARIA, our workhorse zooms can capture dramatically improved images at high magnification. Looking ahead to the even higher-resolution, less-forgiving formats of the future, in applications like sports, that will make all the difference.