Musings of a Consultant - Workflow in Modern Media

This article is the first in a series that will continue for a while. I hope to explore some aspects of workflow for modern media and bring some clarity to terms, trends, pitfalls, and successes. I am also hoping that as we explore this together you will send back questions and comments that can steer future postings in the series toward what is most useful to you.

We all have opinions about what constitutes workflow for moving media. I must say that the topic is not new, just our focus on it as a discipline. Moving media workflow can certainly be traced back to the early days of film. The craft of film editing developed over decades into a set of approaches for completing film content in ways that allowed collaboration between practitioners and the service agencies that provided the unique services a film editor needed. Those might have included sound, or film special effects work. By staying inside a repeatable workflow the results became repeatable, and of course less expensive. Clearly that is important. Part of the workflow was the equipment necessary to complete projects, such as the well known Steenbeck flatbed editor used for decades in documentary, television, and motion picture work.

In electronic media, purely electronic production workflow can be traced to the early days of professional videotape, facilitated by Ampex’s invention of the quadruplex videotape format. While not file based in quite the sense we think of today, the analog electronic records on the videotape were more ‘file like’ than ‘film like’ in that they were only useful when reproduced and displayed. The same is quite true of files today. We work with electronic versions, digital versions, of the content we are producing, but only at the moment of playback does it become ‘television’.

The workflow that evolved in color television before digital files was based upon tools that allowed linear content to be edited and effects performed in ways that were well understood. It involved control over the playback functions of videotape recorders, and manual operations of course. Take the workflow for the critically acclaimed program from the late ’60s and early ’70s, ‘Rowan and Martin’s Laugh-In’. Art Schneider, a highly accomplished editor who passed away in 2009, developed a workflow that allowed a program with sometimes hundreds of edits to be assembled weekly. He used an offline process based in film and then conformed the original videotape (2” quadruplex recordings) by physically cutting the tape with razor blades. Once completed, the program was ‘sweetened’ by adding a laugh track and other sound elements as appropriate. His approach was unique, and the production technique was awarded an Emmy in 1968, validating the importance of the ‘jump cut’ edit as well as the workflow, which facilitated sometimes as many as 500 edits in one hour. Art had a production concept which could not be done by any existing workflow, so he invented one that worked for the production values he wanted. Perfect: form following function.

The Smith 2 inch quad videotape splicer. Image courtesy of Early Television Foundation and Museum and Steve

As digital technology entered the industry we had to adapt workflow to new tools. Interestingly, the workflow that was developed for analog video recording was simply modified for digital recording. The first digital recorders were, in effect, almost recording files on tape. Only the lack of headers, footers, and other file structures prevented the record on the digital tape from being interpreted as a file (it was a digital stream, a subject for a future article…). We interfaced to the digital recorders initially with analog interfaces, later replaced with digital interfaces as other digital production devices like switchers and graphics forced the development and standardization of fully digital studio systems. Even then, much of post production was done with techniques that simply used digital versions of analog tools, with digital interfaces, to complete essentially the same linear workflow. The source and output were digital, but the workflow was linear and additive, with no ability to ‘undo’ once a process had been completed.

The first major disruption of this long-standing pattern of workflow came with the implementation of islands of non-linear, computer based editing tools. Those first primitive file based tools fundamentally changed how we look at production and, in the context of this series, how workflow is crafted and completed. It was different in the following ways.

First, it was not additive and destructive. Until the project was ‘flattened and rendered’, editing changes could be made, including the addition or removal of graphics and special effects. To a degree this was possible in a well crafted linear workflow, so long as all of the production elements were left unmodified and used as input sources in a combining stage, usually in a sophisticated editing room with a large production switcher, audio console, and of course multiple people. In the previous workflow that often meant the content had to be dubbed (copied), inexorably lowering the quality with each successive generation away from the original material. The major difference with files is that the next generation is the same quality as the previous, and so the flattened master preserves the quality of the original recording.

Second, the ability to ‘undo’ in a non-destructive way meant that many production decisions could be explored and then abandoned. Though a ‘rehearsal or preview mode’ existed before, only a single edit could be previewed. With non-linear file based workflow it was possible to edit entire sections of a production and then choose to accept it and move on, or try another approach, non-destructively. This changed the game in profound ways. Some say it sped up production, others said it slowed it down because so many options could be explored. In my own experience as an editor it was still up to the editor to recommend what was appropriate and then demonstrate the wisdom gained from experience. And sometimes that actually worked!
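The idea underlying both points can be sketched in a few lines of code. This is not any real NLE’s API, just a hypothetical illustration: the source media is never modified; the ‘edit’ is merely a list of decisions that can be undone or redone at will, and only a final flatten produces output.

```python
# A minimal sketch of non-destructive editing: edits are data, not
# modifications of the source, so undo/redo costs nothing and the
# sources survive untouched until a final flatten/render.

class Timeline:
    def __init__(self):
        self.edits = []      # ordered edit decisions (clip, in-point, out-point)
        self.undone = []     # stack of undone decisions, enabling redo

    def add_cut(self, clip, t_in, t_out):
        self.edits.append((clip, t_in, t_out))
        self.undone.clear()  # a new decision invalidates the redo stack

    def undo(self):
        if self.edits:
            self.undone.append(self.edits.pop())

    def redo(self):
        if self.undone:
            self.edits.append(self.undone.pop())

    def flatten(self):
        # Only here is output produced; the source clips stay unmodified.
        return [f"{clip}[{t_in}-{t_out}]" for clip, t_in, t_out in self.edits]

tl = Timeline()
tl.add_cut("interview", 10, 25)
tl.add_cut("b_roll", 0, 5)
tl.undo()            # try the cut without the B-roll...
print(tl.flatten())  # ['interview[10-25]']
tl.redo()            # ...then bring it back; nothing was lost
print(tl.flatten())  # ['interview[10-25]', 'b_roll[0-5]']
```

Contrast this with the linear tape workflow described above, where each decision physically altered the output medium and ‘undo’ meant dubbing down another generation.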

There was one aspect of this change that cannot be missed. Until field and studio production moved to file based acquisition, which came considerably after the editing tools created by CMX, Avid, ImMIX, Lucasfilm, Montage and others, the weak link was the movement of content from acquisition to post production. Content had to be ingested, which meant real time copying of the content into the computer file based environment before editing could begin. And of course it also had to be rendered and ‘printed’ to linear media like tape for delivery and consumption.

EditCAM recorded on removable hard drives, with files formatted to interchange directly with Avid Media Composer. Photo courtesy of Ikegami.

Once Ikegami and Avid created the EditCAM in 1995, things changed forever. The camera recorded files which could be copied into the edit system considerably faster than real time. Things would never go back to linear I/O again once this cat was out of the bag.

To be continued soon… ‘What is the role of metadata in production?’
