Using KPI for Media Workflows

Determining the Key Performance Indicators (KPI) needed to measure media workflows is no different than for other workflows. Attempting to set standards and take measurements without a thorough understanding of the process will produce misleading results, regardless of whether your workflow is for a doctor’s office or a post suite.

At IBC I encountered multiple exhibitors whose demonstrators mentioned KPI, indicating that the product always had the correct KPI and could often be tailored to the user or project. While this all sounds good, using KPI properly requires some understanding of the underlying theory. This article will help you get started.

It is important when developing key performance indicators to understand what processes are going on and the desired output goals.

Why have KPI?

The first question we need to ask is: “What is the purpose of the measurement?” The answer will include not only the reason but also the audience. In the case of the above workflow, our audience could include all stakeholders or just an operator such as the Asset Manager. With such a broad range of possible audiences, how we measure a KPI for productivity will differ in each case.

Let’s assume we want to measure productivity and our audience is the Asset Manager. First, we need to define productivity within the constraints of the workflow. Rather than pick some theoretical or historically accepted measurement, it makes sense to look at our workflow and see whether any of the automatically generated data can give us usable information.

For instance, ask: what does productivity mean to the Asset Manager? Is it personal performance, system performance, operator performance, etc.? Because the Asset Manager is client-facing, regardless of how performant the process may be, the organization’s productivity might be down if orders are not being placed or if the orders are not profitable. What if the Asset Manager is doing excellent work but their major account just went out of business?

A KPI always measures something. Let us call this something the “facilitator”, regardless of whether it is a person, a machine, or an entire system. Now we need to know what the facilitator can affect and how. Measuring the productivity of the Asset Manager by whether the client base increased makes no sense, but an increase in orders from an existing client is affected by the manager’s performance.

If your facilitator is a person, make sure they get the current status of the KPI so that they may explain when circumstances were beyond their control.

The Asset Manager

The Asset Manager also has a controlling function. Part of this is on-time, in-budget, and to-specification delivery. The other part is profitability. How can the Asset Manager affect profitability? Let’s look at the workflow. All organizational tasks are handled by the Asset Manager, so what information does the workflow provide when these tasks are not being handled efficiently? Every workflow has a so-called “happy path” where everything goes as expected and there are no exceptions. Many in this industry may claim such a condition never occurs in broadcast or post work.

In this example workflow, all events go as planned and there are no exceptions.

As you can imagine, exceptions to an expected chain of events cost time and money. If the workflow was designed to allow for it, we can automatically check the actual exceptions against the planned exceptions outlined in the original order. This gives us a rough KPI, “Unplanned Exceptions”; however, the facilitator (the Asset Manager) cannot affect all unplanned exceptions.
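As an illustration, the check described above can be sketched in a few lines of Python. The event names and data shapes here are invented for the example; a real system would pull planned exceptions from the original order and actual exceptions from the workflow’s event log.

```python
# Sketch only: derive an "Unplanned Exceptions" KPI by comparing the
# exceptions the workflow actually logged against those planned in the
# original order. Event names below are hypothetical.

def unplanned_exceptions(planned: set[str], actual: list[str]) -> dict:
    """Count actual exception events that were not anticipated in the order."""
    unplanned = [e for e in actual if e not in planned]
    return {
        "total_exceptions": len(actual),
        "unplanned": len(unplanned),
        "unplanned_events": unplanned,
    }

# Example: the order anticipated a client review delay and a format change,
# but two transcode failures occurred that were never planned for.
planned = {"client-review-delay", "format-change"}
actual = ["format-change", "transcode-failure", "transcode-failure"]

kpi = unplanned_exceptions(planned, actual)
print(kpi["unplanned"])  # 2
```

The useful part of this number is the trend over time, not any single period’s count, since one bad week can be beyond the facilitator’s control.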

Now we have two cases where the workflow is not giving us all the KPI data we think we need. Whether the product was delivered to specification can be objective or subjective. If we are slightly off spec but the client is satisfied, who’s to say differently? The point is that the absence of this data will not affect the usability of the KPI for its intended purpose.

Also, if all specs are technically correct and the client is still dissatisfied, why is this? Remember, our audience is the Asset Manager measuring their own productivity, and the purpose of a KPI in this case is an alert. So, if client orders are matching market expectations, product is on-time, in-budget, and technically correct, and unplanned exceptions are kept to a minimum, is this enough for the Asset Manager to measure their own productivity? Because this information is automatically generated as part of the production process, it is extremely accurate. Asking users to enter data that is not needed by the workflow is counterproductive and does not increase the usability of this KPI.
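Since the KPI here functions as an alert rather than a score, one simple realization is a threshold check over the automatically generated figures. The metric names and threshold values below are illustrative assumptions, not industry standards.

```python
# Sketch only: a KPI used as an alert. Thresholds are hypothetical;
# in practice they would be calibrated per project or per client.

THRESHOLDS = {
    "on_time_ratio": 0.95,      # at least 95% of deliveries on time
    "budget_ratio": 1.00,       # actual cost / budgeted cost, at most 1.0
    "unplanned_exceptions": 3,  # at most 3 per reporting period
}

def kpi_alerts(metrics: dict) -> list[str]:
    """Return human-readable alerts for any metric outside its threshold."""
    alerts = []
    if metrics["on_time_ratio"] < THRESHOLDS["on_time_ratio"]:
        alerts.append("on-time delivery below target")
    if metrics["budget_ratio"] > THRESHOLDS["budget_ratio"]:
        alerts.append("over budget")
    if metrics["unplanned_exceptions"] > THRESHOLDS["unplanned_exceptions"]:
        alerts.append("too many unplanned exceptions")
    return alerts

# Deliveries are on time and exceptions are rare, but costs ran 10% over.
print(kpi_alerts({"on_time_ratio": 0.97,
                  "budget_ratio": 1.10,
                  "unplanned_exceptions": 1}))
# ['over budget']
```

An empty list means no alert, which is exactly what the Asset Manager wants to see: no data entry, just an automatically raised flag when something drifts.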

Measure the correct parameters

Let’s change the audience: the boss wants performance data on the new Asset Manager. What is missing from the above? We need a baseline for comparison to some expected level. The best scenario would be a history of previous Asset Managers measured with the same KPIs.

Why not use “throughput”, you may ask? Historically, throughput was the easiest thing to measure (so many gadgets per hour), but today we can measure other things just as easily. Still, there is a missing element: how much of the Asset Manager’s available time was booked to which client, and what percentage of the total “work-time” was booked to client work. Also interesting for the boss is client-to-Asset-Manager matching.
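A rough sketch of that booked-time measurement might look like the following; the booking-record fields and figures are invented for illustration, and a real system would take them from the workflow’s time-booking data.

```python
# Sketch only: per-client booked hours and overall client-work utilization,
# computed from hypothetical time-booking records.

bookings = [
    {"client": "Acme",   "hours": 22.0},
    {"client": "Globex", "hours": 10.5},
    {"client": None,     "hours": 7.5},  # internal / non-client work
]
available_hours = 40.0  # total work-time in the reporting period

# Sum booked hours per client, ignoring internal work.
client_hours: dict[str, float] = {}
for b in bookings:
    if b["client"] is not None:
        client_hours[b["client"]] = client_hours.get(b["client"], 0.0) + b["hours"]

booked_to_clients = sum(client_hours.values())
utilization = booked_to_clients / available_hours  # fraction of work-time on client work

print(client_hours)   # {'Acme': 22.0, 'Globex': 10.5}
print(utilization)    # 0.8125
```

The per-client breakdown is what supports the client-to-Asset-Manager matching mentioned above: compared across managers and periods, it shows who works well with which accounts.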

There is an iterative process between workflow design and KPI reporting. A well-designed workflow will capture all the information required to develop informative KPI without needing additional manual inputs. The opening image for this article is from a YouTube tutorial on developing organization KPI.
