Camera Lenses: Through The Looking Glass, Part I
That lump of glass on the front of the camera is often taken for granted, but John Watkinson argues that choosing and using lenses wisely can make a big difference in captured image quality.
This article was first published in 2015. It and the rest of this mini series have been immensely popular, so we are re-publishing it for those who missed it the first time around.
Modern lenses are remarkable devices, so reliable that they need and get little attention. But they are not and never will be perfect, and knowledge of how they work is useful if the best results are to be obtained. Lenses affect both the technical and the artistic/creative aspects of photography, videography and cinematography. It is worth noting that the move to digital technology is revealing some similarities between these quite different arts.
Pictures can be technically perfect and artistically lacking, or vice versa. In the creative sphere anything goes and opinions will be subjective. In the technical sphere you are up against the laws of physics, and if they say it can't be done there is no point arguing. When deciding what to do, it is important to be very clear whether a given decision is creative or technical.
Still camera lens interior. Note the multiple elements. (Image courtesy Leica.)
Lens construction
A lens is a massively-parallel channel having enough information capacity to make most computers appear stupid. Like any information channel, the bandwidth and signal-to-noise ratio are finite, although in a lens the equivalent terms would be resolution and dynamic range. It will be necessary to consider what those terms mean in some depth.
So what does a lens do? Any object that is illuminated can be thought of as an array of points, each of which radiates light over a wide range of directions. The job of the lens is to capture light over some of those directions and direct it to an equivalent point on the image sensor. The lens works by refraction: it bends light both where it enters and where it leaves the glass. The shape of an ideal lens is such that no matter where light from a given point on the object enters, it is always directed to the corresponding point on the sensor, provided the lens is correctly focused.
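To make "correctly focused" concrete, the classical thin-lens relation states that for focal length f, object distance u and image distance v, 1/f = 1/u + 1/v. The minimal Python sketch below (the function name and the example figures are purely illustrative, not from the article) shows where the sensor must sit for a sharp image:

```python
# Thin-lens relation: 1/f = 1/u + 1/v, all distances in mm.
# Given the focal length f and the object distance u, return the
# image distance v at which a point on the object comes to focus.
def image_distance(f_mm, u_mm):
    return 1.0 / (1.0 / f_mm - 1.0 / u_mm)

# A 50mm lens focused on a subject 2m away: the sharp image forms
# about 51.3mm behind the optical centre, so that is where the
# sensor must be placed.
print(image_distance(50.0, 2000.0))  # ~51.28
```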
All of the individual photosensitive points on the image sensor must lie within the image circle on the camera's focal plane; any that lay outside the image circle would be dark. It is therefore important that the correct lens is used, so that the image circle is correctly matched to the size of the image sensor. (Courtesy photokonnexion.com)
The sensitivity of the camera is proportional to the solid angle over which light can reach a point on the sensor. That solid angle depends on the area over which light can enter the lens and on the focal length: the longer the focal length, the further from the sensor the light enters, so the smaller the solid angle. Long focal length lenses will therefore either be limited in their sensitivity, or their diameter will be large. The lens area, known as the entrance pupil, is in turn proportional to the square of the lens diameter.
The sensitivity is traditionally expressed by the f number, which is the ratio of the focal length to the diameter. This is a dimensionless quantity that gets smaller as the sensitivity goes up. For a given focal length, halving the f number doubles the lens diameter and quadruples the sensitivity. Real lenses contain an adjustable iris so that the working sensitivity can be set below the maximum by reducing the diameter of the light path.
It is more useful to calibrate the iris in sensitivity factors of two, which causes the sequence of f numbers to change in steps of the square root of two, or a near approximation to it. Thus a typical f stop sequence on a lens aperture ring might be f/1.4, f/2, f/2.8, f/4, f/5.6, f/8 and so on.
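The arithmetic behind that sequence is easy to verify. Here is a short Python sketch (illustrative only) that generates the stops and the light each admits relative to f/1.4; note that the values engraved on real aperture rings are rounded, hence f/8 rather than f/7.9:

```python
import math

# Each full stop multiplies the f number by sqrt(2), which halves
# the entrance pupil area and therefore halves the light admitted.
for n in range(6):
    f_number = 1.4 * math.sqrt(2) ** n
    relative_light = 0.5 ** n  # relative to f/1.4
    print(f"f/{f_number:.1f}  relative light: {relative_light:.4f}")
```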
This image illustrates how changing the aperture controls how much light gets to the camera sensor. (Image courtesy gpsphotography.com.)
On its own, the focal length of a lens tells us little. What matters is the relationship between the focal length and the size of the sensor, since that determines the field of view and a few other things, as we shall see. The size of the sensor is always a compromise. Technical considerations such as noise mostly favour a large sensor, whereas the cost of a sensor rises disproportionately with size: not just because more material is needed, but because the chance of a defect rises with area, so more devices have to be thrown away.
As a result, ENG cameras and consumer equipment such as smart phones may use a sensor chip as small as 0.2 inches across for compactness and economy, whereas a top-of-the-range digital cinematography camera or a medium format digital stills camera may have a sensor around 2.4 inches across. Broadcast HDTV cameras commonly use the 2/3 inch format (a dimension that dates from the days when image sensors were cylindrical glass tubes), in which the frame is actually about 0.38 x 0.21 inches.
Still cameras are often rated by the number of pixels. "Mine has more than yours." That figure alone does not adequately describe the resulting image quality, and the same goes for video cameras. Sensor pixel count is only one factor to consider; sensor size is equally important. (Image courtesy jtauber.com.)
A so-called standard focal length exists for any sensor size: one that corresponds roughly to the human view. Lenses with a shorter focal length are called wide-angle and those with a longer focal length are called telephoto. On the once-popular 35mm still camera, which has a frame 1.4 inches across, the standard lens focal length was generally agreed to be around 50mm. For a 2/3 inch TV camera the standard focal length is about 15mm, whereas for a medium format still camera it is about 80mm.
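One way to see why such different focal lengths all count as "standard" is to compute the diagonal angle of view, which comes out broadly similar in each case. A small Python sketch follows; the sensor diagonals used are approximate figures assumed for illustration:

```python
import math

# Diagonal angle of view in degrees: 2 * atan(diagonal / (2 * focal length)).
def angle_of_view_deg(sensor_diag_mm, focal_mm):
    return math.degrees(2.0 * math.atan(sensor_diag_mm / (2.0 * focal_mm)))

print(angle_of_view_deg(43.3, 50.0))  # 35mm still frame: ~47 degrees
print(angle_of_view_deg(11.0, 15.0))  # 2/3 inch TV frame: ~40 degrees
print(angle_of_view_deg(70.0, 80.0))  # medium format: ~47 degrees
```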
For a constant field of view, the focal length is proportional to the sensor size. For constant sensitivity, the entrance pupil area needs to be proportional to the focal length. Thus quadrupling the sensor size will quadruple the focal length, but only double the lens diameter. This means that the weight and size penalty of going to larger sensors is not as bad as it might have been.
Focal length determines the angle of view (how much of the scene will be captured) and the magnification (how large individual elements will be). The longer the focal length, the narrower the angle of view and the higher the magnification. The shorter the focal length, the wider the angle of view and the lower the magnification. (Courtesy Nikonusa.)
The focal length of a lens is not its physical length, nor is it necessarily the distance from the back of the lens to the sensor. Real cameras generally interpose something between the lens and the sensor: beam splitters in TV cameras, mirrors in reflex cameras and shutters in cine cameras. To make space, the so-called retro-focus design may be used. This behaves as a lens of the desired focal length, but contains extra elements that bring the image to a focus physically further back.
As well as having a focal length appropriate to the sensor size, the lens must also produce an image circle big enough to illuminate the whole sensor uniformly. This ability to illuminate the frame is called coverage; if it is inadequate, the corners of the frame become darker than the centre, a phenomenon called shading or vignetting. Whilst this was considered a problem in the days of film and analogue video, in the digital domain it is one of the most benign of lens shortcomings, because it can easily be corrected by applying a gain that is the appropriate function of radius. That could be done in the camera/CCU, but it can equally be done in post-production.
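As a sketch of how such a correction might work, the snippet below applies a gain that rises with the square of the distance from the image centre. The quadratic falloff model and the strength parameter are assumptions for illustration; a real camera/CCU would use a curve matched to the actual lens.

```python
import numpy as np

def correct_vignetting(image, strength=0.3):
    """Flatten radial shading by boosting gain towards the corners.

    Models the correction as gain = 1 + strength * r^2, where r is
    the distance from the image centre, normalised so that r = 1 at
    the corners. Both the model and the default strength are
    illustrative, not taken from any particular camera.
    """
    h, w = image.shape[:2]
    y, x = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r2 = ((x - cx) ** 2 + (y - cy) ** 2) / (cx ** 2 + cy ** 2)
    gain = 1.0 + strength * r2
    if image.ndim == 3:
        gain = gain[..., None]  # broadcast the gain over colour channels
    return image * gain
```

Because the correction is a pure gain map, it can be applied at any point in the chain, which is why it works equally well in post-production.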
All cameras contain a sensor of finite frame size, and the output image comes only from light falling on that sensor. That does not mean light from outside the field of view is unimportant: the lens does not know how big the frame is. Imagine a scene in which a bright light lies outside the frame. If it is just outside, it will fall on some internal part of the camera adjacent to the sensor. If it is considerably off axis, it may strike the inside of the lens barrel. In neither case will the light necessarily be absorbed completely.
Out-of-frame light may bounce around inside the lens and camera body, and some of it will eventually fall on the frame, reducing contrast or even producing a visible artefact. There are three defences against this problem. Lenses have anti-reflection coatings intended to stop light bouncing off their surfaces, so that it can only follow the intended path. Lenses can also be fitted with hoods or shades to block off-axis light. Finally, camera users can compose shots that avoid the problem.
An example of internal reflections in a lens. A powerful street lamp is just out of shot on the left, and internal reflections of the iris are visible as small lighted circles in the image. This is a tough test for a very high quality lens and illustrates that lenses are not perfect. This particular problem could have been prevented by using a lens shade to stop the light from the street lamp falling on the lens.
A good test of a lens is to take it outdoors and shoot whilst performing a 360 degree pan. If the contrast is noticeably better shooting down-sun, the lens has limited flare performance and may benefit from a hood. The lenses in many smart phones fail this simple test spectacularly. If extended or high dynamic range becomes popular, these considerations will become more important; at present, the limited dynamic range of conventional video conceals all manner of shortcomings.
To be continued…