The Compromises In Long Zoom Lenses Are Still There But Viewers See Fewer Of Them
Sony HDC-5500 field camera.
Fox Sports’ broadcast of the 2022 World Series was a real eye-opener, and it wasn’t just because the Houston Astros handily defeated the Philadelphia Phillies in six games. After all, for those in the know, the Astros were the heavily favored team, so the outcome shouldn’t have been too surprising. What was surprising, even amazing, to many broadcast professionals was the quality of the images from the center-field cameras.
When SD ruled the roost, it hardly mattered how lenses performed - the broadcaster’s low-resolution images masked all but the most egregious lens defects.
During the games, upwards of 40 cameras in total were trained on the field, including a venerable array of DirtCams, on-field MoVIs, a Flycam, and Super-Mos. It was, however, the bright, brilliant images from the manned Sony HDC-5500 center-field cameras that impressed veteran observers the most. At 400 feet (120m) from home plate, the 100X+ magnified images retained optimal sharpness and contrast to the corners of the frame with no loss of resolution or focus. This is no small achievement given the wickedly long zoom lenses commonly used by broadcasters these days at major sporting events, including the World Series.
The laws of physics, however, must still apply, and as such, DOPs have long had to contend with the shortcomings of long zooms. Breathing of focus, ramping, loss of contrast, and darkening toward the corners of the frame have been the nemeses of DOPs for decades; the demand for such optics, particularly in sports, is so great that lens designers are willing to sacrifice some image quality in exchange for the greater reach.
Historically, zoom lenses have comprised two main element groups, one for focus and one for magnification. The two groups work together via a system of gears and cams to provide the desired increase in magnification while maintaining, theoretically at least, a constant field size.
With the advent of HDR and 8K resolution, the shortcomings of long zoom lenses can be glaringly obvious. While subtle at first glance, these shortcomings are often amplified, and thus rendered more objectionable, downstream after encoding and subsequent decoding in the home TV. Chromatic Aberration Compensation and other LUT-based software schemes can help reduce the visibility of notable artifacts during original image capture and processing.
For DOPs who came of age in the 1970s and 1980s, the breathing issues we routinely faced were not confined to chronic lung problems or obscene phone calls. I can recall not-too-fondly the breathy performance of my Angénieux 12-150 f2.8 zoom on news magazine shows like 60 Minutes. While the 16mm format and standard definition TV concealed many of the lens’ egregious shortcomings, today’s higher-definition formats offer no such refuge. Shooting in 4K and 6K today enables the capture of pictures with exquisite detail, and sadly, more obvious and objectionable lens defects.
Over the last 40 years, DOPs have seen significant advances in the mechanical design of zoom lenses, incorporating, in addition to the two main groups, additional elements that move only incrementally to account for breathing of focus, barrel distortion, and other deficiencies. Most DOPs recognize, however, that chromatic aberrations (CA) are the most serious challenge facing lens designers, and indeed to most DOPs, CA is the main reason that cheap lenses tend to look cheap.
While improved lens coatings have helped enormously with respect to performance, the introduction of on-board Chromatic Aberration Compensation (CAC) has also played a key role as lens manufacturers seek a cost-effective way to reduce the visibility of CA. Because it is derived from the average CA observed across multiple lenses of the same type, a generalized correction like CAC can only go so far, and may not be effective or even relevant to a particular lens in actual working conditions.
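To make the idea concrete: lateral CA shows up as the red and blue channels being magnified slightly differently from green, so a software correction radially rescales those channels back into registration. The sketch below illustrates the principle only; the fixed scale factors stand in for the averaged per-lens-model calibration data a real CAC table would supply, which in practice varies with focal length, focus, and iris.

```python
import numpy as np

def correct_lateral_ca(img, scale_r=1.0005, scale_b=0.9995):
    """Reduce lateral chromatic aberration by radially rescaling the
    red and blue channels relative to green. The scale factors here
    are illustrative placeholders, not real calibration values."""
    h, w, _ = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    yy, xx = np.mgrid[0:h, 0:w].astype(np.float64)

    def resample(channel, scale):
        # Map each output pixel back toward (or away from) the optical
        # center, then sample with nearest-neighbour for brevity.
        sy = np.clip((yy - cy) / scale + cy, 0, h - 1)
        sx = np.clip((xx - cx) / scale + cx, 0, w - 1)
        return channel[sy.round().astype(int), sx.round().astype(int)]

    out = img.copy()
    out[..., 0] = resample(img[..., 0], scale_r)  # red channel
    out[..., 2] = resample(img[..., 2], scale_b)  # blue channel
    return out
```

Since the correction is applied per channel around the optical center, it only addresses lateral (transverse) CA; longitudinal CA, which blurs rather than displaces color, needs different treatment.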
Still, the die is cast, and by now, on-board compensating software to enhance lens performance is here to stay. Manufacturers like Canon and Fujinon appear keen to develop innovative new ways to boost the quality of the captured image without adding unnecessary elements or further increasing the complexity of the lens and its concomitant cost.
The World Series broadcast delivered on the promise of an in-camera LUT-based solution in a major way. Utilizing Sony’s ARIA [Automatic Restoration of Illumination Attenuation] system, the distant center-field HDC-5500s dynamically applied the desired software correction in response to the actual lens settings, magnification, and operating environment. Applied most notably at the long end of the zoom beyond 90X, the expected ramping, loss of stop, and light loss toward the corners of the frame were effectively addressed and ameliorated in-camera during image processing. The ARIA compensating system applies to specified Canon and Fujinon lenses, and indeed a mix of both manufacturers’ optics was used for the World Series broadcast.
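The corner light loss part of such a correction can be pictured as a per-pixel gain map whose strength is looked up from the current zoom position. The sketch below is a minimal illustration of that idea, not Sony’s ARIA algorithm: the falloff table mapping zoom ratio to fractional corner loss is invented for demonstration, where a real system would use measured calibration data for the specific lens.

```python
import numpy as np

def falloff_gain_map(h, w, zoom_ratio,
                     falloff_table=((1.0, 0.0), (60.0, 0.15),
                                    (90.0, 0.30), (107.0, 0.45))):
    """Build a per-pixel gain map that lifts the corners of the frame
    to compensate peripheral light loss at long focal lengths.
    falloff_table (zoom ratio -> fractional corner loss) is a made-up
    illustration, not real lens calibration data."""
    zooms, losses = zip(*falloff_table)
    loss = np.interp(zoom_ratio, zooms, losses)  # corner loss fraction
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    yy, xx = np.mgrid[0:h, 0:w]
    # Normalized radius: 0 at frame center, 1 at the extreme corners.
    r = np.hypot((yy - cy) / cy, (xx - cx) / cx) / np.sqrt(2)
    # Gain rises quadratically from 1.0 at center to 1/(1-loss) at corner.
    return 1.0 + (1.0 / (1.0 - loss) - 1.0) * r ** 2
```

Because the gain is keyed to live lens telemetry, the correction tracks the zoom in real time, which is what distinguishes this approach from a static vignetting fix applied in post.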
The ARIA function, found also in the 3500/3100/3170/P50 camera models, compensates in the image processing for common defects and aberrations inherent to long zoom lenses beyond 90X.
For over a century, DOPs have understood that what viewers see most on screen is the quality of the optics. With the advent of on-board image-enhancing strategies like ARIA, our workhorse zooms working at high magnification are able to capture dramatically improved images. Looking ahead to the even higher-resolution, less-forgiving formats of the future, in applications like sports it will make all the difference.