Most people are aware that words like “gamma” and “gamut” are connected with the way a camera processes brightness and color, respectively. Some modern cameras might easily have half a dozen settings for each, and it’s not always obvious which is best for the circumstances at hand. To be clear up front, it’s not the purpose of this article to prescribe which settings to use in which situations. There are too many variables, and that sort of advice would date very quickly anyway.
What we can do is to go over the concepts, so that the next time we’re faced with a multi-level menu structure requiring us to pick one of a dozen combinations, we can make the best possible decisions. Before we dive in too deep, though, let’s address one widely-asked question. Is the situation as it exists in early 2020 perhaps a little more complicated than it needs to be in order to make the best use of the technologies we have?
Frankly, yes, it probably is – but we still have to make it work.
The terminology used to discuss brightness and color is sometimes applied rather loosely, which can deepen confusion. The terms “brightness” and “color” are good because they refer to human perception. If two lights look as bright as one another, they are as bright as one another, and likewise if they look the same color they are the same color. These are not words picked because they’re somehow simpler; they have well-defined meanings that work well because they refer to what things look like, which is usually what we care about.
Gamma and Gamut Settings
Often, a camera will want us to pick a gamma and a gamut, though some may combine the two settings into a single menu item. Most settings fall into one of two main groups: settings designed to be viewed without color correction, and settings that must be processed later, in color grading, to make sense. We’ll concentrate mainly on the latter, but anyone shooting news and current affairs will probably be using a setup called something along the lines of Rec. 709. This records pictures with a lot more built-in contrast, designed to be broadcast directly, and would be a disaster for anything intended to be graded later. Literal implementations of standards like Rec. 709 tend to look very harsh, even on modern cameras, and many manufacturers include settings representing a slightly adjusted version intended to look pretty rather than accurate.
For the rest of this piece, we’ll concentrate mainly on setups intended for later grading, as that’s where most of the complexity lies. Typical gradable gamma settings include Arri’s Log C, SLog by Sony, C-log on Canon cameras and V-log on a Panasonic Varicam. Gamma controls how the camera handles brightness, with different gamma settings creating a different relationship between the amount of light that goes into the lens, and the signal level recorded.
Gamut controls how the camera handles color. Typical gamuts include SGamut from Sony or V-Gamut on Panasonic cameras. Most people understand that a color picture is recorded using a combination of red, green and blue; at its simplest, a gamut simply defines which red, which green, and which blue are used. Deeper, more saturated primaries allow the camera to record deeper and more saturated colors; no red can be deeper than the red used as a primary color, for instance.
Choosing A Gamut
Designing gamma and gamut encoding systems is simple in some ways, and complex in others. Anyone can create a color gamut by picking any available shades of red, green and blue, though camera gamut settings often include other aspects of color processing too, so it can be complex. With a bit of mathematics, it’s possible to convert images recorded with one gamut into another, although clearly a gamut with less-saturated primaries can’t represent all the saturated colors another might.
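That conversion mathematics is ordinary linear algebra: each gamut’s primaries define a 3×3 matrix mapping its RGB values into the device-independent CIE XYZ space, and chaining one gamut’s matrix with another’s inverse converts between them. As a minimal sketch, the code below uses the standard published Rec. 709-to-XYZ matrix (the `apply_matrix` helper is just for illustration) and checks that Rec. 709 white lands on the D65 white point.

```python
# Each gamut's primaries define a 3x3 matrix taking its RGB values to
# CIE XYZ; converting gamut A to gamut B is (B_matrix^-1) @ (A_matrix).
# The matrix below is the standard Rec. 709 (D65) RGB-to-XYZ matrix.

REC709_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]

def apply_matrix(m, rgb):
    """Multiply a 3x3 matrix by an RGB triple."""
    return tuple(sum(m[r][c] * rgb[c] for c in range(3)) for r in range(3))

# Rec. 709 white (1, 1, 1) should land on the D65 white point in XYZ.
x, y, z = apply_matrix(REC709_TO_XYZ, (1.0, 1.0, 1.0))
print(round(x, 4), round(y, 4), round(z, 4))  # 0.9505 1.0 1.089
```

A wide-gamut color pushed through a narrower gamut’s inverse matrix can produce values outside 0–1, which is exactly the “can’t represent all the saturated colors” problem in numerical form.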
Instinctively, we might pick the deepest possible shades of red, green and blue – truly monochromatic colors, made of only one wavelength of light, and in fact that’s what Rec. 2020 does. There are downsides to choosing very saturated primaries, though. First is that no three primary colors can actually encompass all the color a human can see; second is that very deep and saturated primaries might need to be represented as bigger digital numbers so that there are enough gradations between them to avoid visible banding. A bigger gamut can require more bits.
Gamma is perhaps a slightly different matter. Modern gamma settings often encode the signal in the file as (approximately) the base-2 logarithm of the detected brightness, which sounds complicated but really just means that each stop of exposure is represented by the same range of numbers. Different gamma settings may mean that range is larger or smaller, and it is reasonable to think that an 8-bit camera, recording 256 levels of brightness, is less capable of representing a large range than a 10-bit camera, recording 1024 levels. Selecting a log encoding mode capable of huge range on an 8-bit camera might lead to visible banding in certain parts of the image, although in practice even that tends to be quite hard to see on a monitor.
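A toy log curve makes the “equal range per stop” idea concrete. The parameters below (14 stops of total range, mid-gray anchored at 0.18 and mapped to the middle of the signal) are illustrative choices, not any manufacturer’s actual curve.

```python
import math

# Toy log encoding: map scene-linear values so each stop (a doubling
# of light) occupies the same slice of the 0-1 output range. The 14
# stops and 0.18 mid-gray anchor are illustrative, not a real curve.
STOPS = 14.0

def toy_log_encode(linear, mid_gray=0.18):
    """Return a 0-1 signal where each stop above or below mid-gray
    moves the output by the same amount (1 / STOPS)."""
    stops_from_gray = math.log2(linear / mid_gray)
    return 0.5 + stops_from_gray / STOPS

# Each doubling of exposure adds exactly one stop's worth of signal.
a = toy_log_encode(0.18)   # mid-gray
b = toy_log_encode(0.36)   # one stop up
c = toy_log_encode(0.72)   # two stops up
print(round(b - a, 6), round(c - b, 6))  # equal steps

# Crude banding intuition: code values available per stop of range.
print(256 // 14, 1024 // 14)  # 8-bit vs 10-bit levels per stop
```

The last line is the banding argument in miniature: spreading 14 stops over 8-bit recording leaves only around 18 code values per stop, against roughly 73 at 10 bits.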
Most professional cameras in 2020 are capable of 10-bit recording, though. Given that, and given that all the camera manufacturers are presumably working toward much the same goals, there’s a question over why they often have several different gamut and gamma encoding options. Each of them is using roughly the same underlying technology, in the end. Part of the answer to that is commercial: it’s easier to promote a manufacturer-specific technology. Some of it arises from the history of digital imaging. That’s particularly the case with Arri’s Log C, which makes it an interesting example.
The C stands for Cineon, originally the name of a whole system for scanning and digitally manipulating film that was developed in the early 1990s. It was an early example of the log encoding we mentioned above, where each stop of exposure occupies an equal part of the range of numbers. Arri’s implementation is designed to be quite close to Cineon, perhaps so that people used to grading film scans would feel at home with Arri pictures.
The differences between various manufacturers’ implementations of log encoding are sometimes slight; Sony’s SLog3 is sufficiently similar to Log C that tools intended for one can sometimes – coarsely, and with caveats – be used with the other. It might be reasonable to interpret Sony’s previous efforts as driven more by the idealized electronic behavior of their cameras than a desire to be familiar to colorists. While most gamma and gamut settings definitely aren’t sufficiently similar to be interchangeable, the reason they differ is generally just down to engineering opinion.
Varying Exposure Latitude
Engineers have different goals, and so different systems are designed to achieve different things. Log C is interesting in that it fixes the 18% gray picture level at a 10-bit count of 400, and that holds true regardless of the sensitivity setting on the camera. Other cameras don’t do that. Because the behavior of the sensor is more or less fixed, this approach means that the camera appears to have varying over- and under-exposure latitude depending on ISO selection. This is true of all cameras in essence, but things may change depending on the gamma encoding in use.
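The consequence of anchoring mid-gray like that can be sketched numerically. The code below is illustrative only, using made-up numbers rather than Arri’s published figures: if the sensor clips at a fixed amount of light, but the exposure that places mid-gray on the sensor halves each time the EI doubles, then highlight latitude above mid-gray grows by one stop per doubling of EI, with shadow latitude shrinking correspondingly.

```python
import math

# Illustrative only: with a fixed sensor clip level, doubling the EI
# halves the light used to expose mid-gray, so the distance from
# mid-gray up to clip grows by one stop. All numbers are hypothetical.
SENSOR_CLIP = 1.0  # arbitrary linear clip level

def highlight_latitude_stops(ei, base_ei=800, base_mid_gray=0.18):
    """Stops between mid-gray and sensor clip at a given EI."""
    mid_gray_on_sensor = base_mid_gray * base_ei / ei
    return math.log2(SENSOR_CLIP / mid_gray_on_sensor)

for ei in (400, 800, 1600):
    print(ei, round(highlight_latitude_stops(ei), 2))
```

Running this shows the headroom above mid-gray stepping up by exactly one stop for each doubling of EI, which is the varying over- and under-exposure latitude described above.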
Similarly, the principal difference between Sony’s commonly-encountered SLog2 and SLog3 is that SLog3 is – to gloss over the details – a lower-contrast encoding. It is capable of describing a wider range of brightness, and midtones appear brighter when viewed uncorrected. What’s interesting about that is that very few cameras are actually capable of shooting 16-stop images, so some of the range in SLog3 will be wasted on many cameras. Similar engineering tweaks apply to Sony’s gamut options, with the .cine suffix indicating color handling intended to be more similar to a film scan and thus more familiar to colorists.
Often, the differences in performance between the various options are fairly slight. Recording footage in SLog2 as opposed to SLog3 should not, absent really very specific circumstances, cause unsolvable problems. Accidentally recording in Rec. 709, or some other mode not intended to be graded, would cause serious problems, potentially making the footage unusable, or at least very hard to grade. Beyond that, though, so long as the same settings are selected in the camera and in the grading software (which is sometimes automatic) things should be OK.
Test, Test and Test Again
Given an unfamiliar camera, the only really universal solution is to shoot tests, which many high-end productions will do anyway. These tests should involve the camera, recording media, monitoring and post-production chain that’s intended for the final production. They can serve to verify the choice of settings, ensure compatibility between production and post-production gear, and even provide an opportunity to create production-specific monitor setups. In particular, if multiple cameras are involved, matching issues can be investigated at this stage.
If this sounds like the old approach of testing film stocks – well, it is. It’s hard to come up with rules of thumb to avoid problems, but in general, shooting almost any log mode is likely to be less destructive, leaving more room for maneuver, than shooting modes described as “Rec. 709” or a variation thereon. The curveball comes from cameras which are capable of recording an image with certain settings, while monitoring an image with entirely different settings. At some point, if this is a real concern, employ a DIT (Digital Imaging Technician) or consult the rental house or manufacturer.
In general, if there’s at least time to throw a few test shots on a Resolve timeline, mistakes will often look so wrong that it’s easy to tell there’s a problem. There is no quick fix for that moment of confronting a menu, but understanding the issues involved certainly makes it easier to head off the problems that pop up when things are wrong – and to choose the least-destructive option in a crisis.
Next time we’ll look at the Academy Color Encoding System, which was, in some ways, a reaction to all this complexity, and which can make dealing with it a little easier.