Moving to an IT/IP infrastructure is about more than just bits; it is also about image and file size. Facilities will soon need to move from current HD-size files to 4K-size files. If you are a technical manager charged with this transition, you probably have questions. If so, read on; help is just ahead.
Another issue surrounding the move to 4K involves camera specifications. Not all vendors use the same criteria, definitions and specifications. Need some clarity? See David Austerberry’s viewpoint on the industry’s move to 4K, “4K and the Numbers Game.”
Creating and transmitting 4K content requires more storage, higher bandwidth and more computing power than HD. Image: anandtech.com
The production community has always led when it comes to advancing image quality, always pushing the limits and asking for better pictures, immersive audio, higher highlights and deeper blacks. At the same time, I’ve never met a broadcaster who was anxious to tear up the TV station to support the latest innovation. Broadcasters tend to be more risk averse for a variety of reasons.
Neither approach is good or bad; the choice has much to do with how long the equipment must last, continuity of service and budgets. However, on the broadcast front, a revolution is about to take place, all centered on ATSC 3.0. One of the advancements that the new broadcast standard will support is 4K imagery.
The migration from SD to HD was less disruptive than the move from HD to 4K will be, for a variety of reasons. Perhaps the first consideration in building a new 4K workflow involves storage. Of particular concern is the impact these much larger files will have on storage capacity and bandwidth. Analyst firm Coughlin Associates predicts a 5.8X increase in digital storage capacity used in the entertainment industry over the next few years, and nearly a 4X increase in storage capacity shipped per year: a rise from 26,756PB to 102,661PB. This massive storage consumption includes storage associated with media workflows as well as long-term archive.
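The arithmetic behind those figures can be sketched quickly. The PB projections come from the article; the per-hour file sizes below use assumed, purely illustrative bit rates (a 100 Mb/s HD mezzanine file and a 4K file at four times that rate), not any vendor's specifications.

```python
# Growth factor implied by the Coughlin Associates forecast quoted above.
shipped_start_pb = 26_756   # PB shipped per year, start of forecast
shipped_end_pb = 102_661    # PB shipped per year, end of forecast

growth = shipped_end_pb / shipped_start_pb
print(f"Capacity shipped per year grows {growth:.2f}x")  # ~3.84x, "nearly 4X"

# Illustrative per-hour file sizes (assumed codec bit rates, not specs):
hd_mbps = 100    # e.g. a 100 Mb/s HD mezzanine file
uhd_mbps = 400   # a comparable 4K file at 4x the bit rate

def gb_per_hour(mbps: float) -> float:
    """Convert a bit rate in Mb/s to storage in GB per hour."""
    return mbps * 3600 / 8 / 1000

print(f"HD:  {gb_per_hour(hd_mbps):.0f} GB/hour")   # 45 GB/hour
print(f"4K:  {gb_per_hour(uhd_mbps):.0f} GB/hour")  # 180 GB/hour
```

Multiply the per-hour figures by daily ingest hours and retention policy, and the scale of the capacity problem becomes concrete.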
Read the complete tutorial here.
Marketers like large numbers; hang the physics of realising an image and the psychovisual complexities of rendering a scene in the visual cortex.
We saw all this with HD. Most European countries adopted 1080-line interlaced HD, disregarding tests that showed 720p50 pictures looked better. We know that to avoid the artefacts of interlace, vertical resolution had to be sacrificed, so a progressive scan picture with fewer lines looked sharper; but the number 720 was not that much more than the old SD standard of 625 lines. The detail that the SD picture had only 576 active lines was irrelevant; it’s all about numbers. On top of that, the public were sold 1080p receivers when broadcasts were interlaced. Granted, you could view 24p movies, but even today 1080p broadcasting remains overlooked, passed over in the numbers game.
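The 720p-versus-1080i point rests on the effective resolution loss of interlaced scanning. A rough sketch, assuming the commonly cited "interlace factor" of about 0.7 (a rule of thumb, not a broadcast standard, and published estimates vary):

```python
# Effective vertical resolution of interlaced vs progressive scanning.
# The 0.7 interlace factor is an assumed engineering rule of thumb.
INTERLACE_FACTOR = 0.7

def effective_vertical_res(lines: int, progressive: bool) -> float:
    """Estimate perceived vertical resolution in lines."""
    return lines if progressive else lines * INTERLACE_FACTOR

print(f"1080i: ~{effective_vertical_res(1080, progressive=False):.0f} lines")
print(f"720p:   {effective_vertical_res(720, progressive=True):.0f} lines")
```

On this estimate 1080i delivers roughly 756 effective lines against 720p's full 720, a far smaller gap than the headline numbers suggest, and the progressive picture carries none of the interlace artefacts.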
The next con is 4K. It sounds like four times 1080; it’s just another detail that the latter is picture height while the former is width.
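The naming sleight of hand is easy to show with pixel counts. The resolutions below are the standard HD, UHD and DCI 4K rasters; the comparison itself is just arithmetic.

```python
# "4K" counts horizontal pixels; "1080" counts vertical lines.
# Comparing the two names directly is apples to oranges.
hd = (1920, 1080)       # HD: 1920 wide, 1080 high
uhd = (3840, 2160)      # UHD, what broadcast and CE call "4K"
dci_4k = (4096, 2160)   # DCI 4K, the cinema container

def pixels(res):
    width, height = res
    return width * height

print(pixels(uhd) / pixels(hd))   # 4.0 -- UHD has 4x the pixels of HD
print(f"{uhd[0] / hd[1]:.2f}")    # 3.56 -- but 3840 wide is not "4x 1080"
```

So UHD really does carry four times the pixels of HD, but only because both width and height double; the "4" in the marketing name is comparing a width to a height.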
Viewing test after test has shown that better pixels beat more pixels, but the CE folks like things simple, and 4K presents the opportunity to sell at a better margin than the miserable profits to be made from HD screens. At the end of the day it’s about profit; we all need to earn a living, and so from studio to broadcaster to CE vendor, profit takes precedence over the niceties of picture quality and psychovisual realities.
Of course it’s all numbers again; what really counts is what the pictures look like. What if the camera is considered as a black box, an opto-electric transfer engine? Then the number of photosites, the algorithms in the deBayering and the colour matrix, the size of the photosites and the dynamic range all contribute to the picture quality. What matters is how factors like textural reproduction and the sharpness of edges in the image are perceived. These depend on the lens and the camera together, so headline numbers like the photosite count are important, but they don’t tell the whole story.
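To see why the deBayering step matters as much as the photosite count, here is a toy nearest-neighbour deBayer for an RGGB mosaic. This is purely illustrative: real cameras use far more sophisticated interpolation, which is exactly why two sensors with identical photosite counts can deliver very different pictures.

```python
import numpy as np

def debayer_nearest(mosaic: np.ndarray) -> np.ndarray:
    """Toy deBayer: mosaic is an (H, W) array of raw samples in RGGB
    layout, H and W even. Returns an (H, W, 3) RGB image by replicating
    each 2x2 cell's R, averaged G, and B samples across the cell."""
    h, w = mosaic.shape
    rgb = np.zeros((h, w, 3), dtype=float)
    for y in range(0, h, 2):
        for x in range(0, w, 2):
            r = mosaic[y, x]                                  # top-left: red
            g = (mosaic[y, x + 1] + mosaic[y + 1, x]) / 2     # two greens
            b = mosaic[y + 1, x + 1]                          # bottom-right: blue
            rgb[y:y + 2, x:x + 2] = (r, g, b)
    return rgb

# A uniform grey scene should survive the round trip unchanged:
flat = np.full((4, 4), 0.5)
print(debayer_nearest(flat)[0, 0])  # [0.5 0.5 0.5]
```

Each full-colour output pixel here is interpolated from neighbouring single-colour photosites, so the "effective" colour resolution is below the raw photosite count, and edge sharpness depends on how cleverly that interpolation is done.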
Read the complete article here.
Visit The Broadcast Bridge daily for more answers to your technology questions.