Taking Storage to the Next Level

Storing assets is pointless if the correct procedures are not in place to manage where these assets are, keep them secure and ensure they are discoverable. In short, those who implement the bare minimum in plain storage risk missing out on the business benefits of object storage entirely.

Asking “what’s the difference between ‘storage’ and ‘digital storage governance’?” is much akin to questioning the difference between a house and a house with plumbing, heating and electricity. The reality is that if a piece of content cannot be found, the asset does not exist.

We call the correct and most efficient way of storing assets Digital Content Governance, a phrase coined to mean control and oversight of all of your digital assets. So, given the number of aspects to consider when producing and storing assets, what are the key issues to keep in mind in order to create a proper storage system?

Efficiency of Workflow

Simple storage is effective for housing fragments of information, but when it comes to managing a large repository of complex assets, it becomes increasingly inefficient and potentially insecure. This is especially the case if the only tools at your disposal are disjointed filesystems, USB keys, portable drives, or multiple apps that quickly become out of date. Add to this the issue of rising file sizes and new video formats, and pretty quickly the situation becomes a mess.

This leads to a host of time-sapping challenges, including working out where exactly in this maze of storage a particular clip sits, or what to do when storage becomes full on one device and an asset needs to be moved. Moving data shouldn’t be a complicated task that consumes half a day’s worth of valuable time in one go. Similarly, if an asset is to provide value to its owner, it must be retrievable for use 24/7. If it takes time to find, or is not discoverable at all, it may as well not exist.

But of course, efficiency of search cannot be discussed without outlining the importance of metadata management. When managed correctly, the metadata embedded within an asset unlocks the value of that content. A good workflow should, however, automate the extraction of metadata; otherwise the time saved during searching is simply spent again on manual metadata management.
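To illustrate the idea, automated extraction at ingest might look something like the sketch below. This is a minimal example using only Python’s standard library, with hypothetical field names; a real workflow would also pull embedded metadata (EXIF, XMP, MXF headers) with dedicated tooling.

```python
import hashlib
import mimetypes
import os
from datetime import datetime, timezone

def extract_metadata(path):
    """Gather basic technical metadata for an asset at ingest time.

    A minimal sketch: file size, MIME type guess, a fixity checksum
    and an ingest timestamp, captured automatically rather than typed
    in by hand.
    """
    stat = os.stat(path)
    with open(path, "rb") as f:
        checksum = hashlib.sha256(f.read()).hexdigest()
    mime_type, _ = mimetypes.guess_type(path)
    return {
        "filename": os.path.basename(path),
        "size_bytes": stat.st_size,
        "mime_type": mime_type or "application/octet-stream",
        "sha256": checksum,  # fixity value for later integrity checks
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    }
```

Captured automatically at ingest, fields like these make every asset searchable from day one, with no manual cataloguing to catch up on later.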

USB sticks, while convenient, are easily lost or damaged.

Security of Assets

Perhaps the most damaging consequence of storing assets in unsuitable environments is their sheer vulnerability to malicious attack. Cyber security is a hot topic right now, both in the general sphere and within our industry. But even in the case of non-malicious errors, unsuitable storage spaces are highly susceptible to asset loss or corruption. Disks get overwritten, USB keys are like socks – you know you had them, but one by one they disappear – and a single virus or accidental mistake could wipe an entire SAN directory.

Several steps can be taken here to solve the issue of security. First, ensure access rules and retention periods are enforced to limit accidental deletions. This can be followed up with regular action audits to identify when, and by whom, a damaging action was taken. Second, every action that can be automated should be, to reduce human error in the first place. A complete workflow mindful of Digital Content Governance should practice digital preservation with multiple-copy protection, including metadata replication.
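As a rough illustration of the first step, retention enforcement with an audit trail could be sketched as follows. The retention length, function names and log format are hypothetical, not any particular product’s behaviour.

```python
from datetime import date, timedelta

# Hypothetical policy: assets cannot be deleted until the retention
# period has elapsed, and every attempt (allowed or not) is written
# to an audit log so a damaging action can be traced to a user.
RETENTION = timedelta(days=365)

audit_log = []

def request_delete(asset_id, ingest_date, user, today=None):
    """Refuse deletion inside the retention window; log every attempt."""
    today = today or date.today()
    allowed = today - ingest_date >= RETENTION
    audit_log.append({
        "action": "delete",
        "asset": asset_id,
        "user": user,
        "date": today.isoformat(),
        "allowed": allowed,
    })
    if not allowed:
        raise PermissionError(f"{asset_id}: retention period not elapsed")
    return True
```

The point of logging before the permission check is that the audit trail records blocked attempts as well as successful ones – exactly the evidence a later action audit needs.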

Scaling Up

There are two aspects to remember when considering how futureproof a workflow and its corresponding systems really are. The media assets themselves might still be usable in 50 years’ time, but the likelihood of machines still being able to read their format will be low. With this in mind, only storage platforms that make it easy to migrate content to new storage platforms or formats – or do so automatically – are truly futureproof.
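A simple sketch of what automatic migration planning might involve is shown below; the supported and deprecated format lists are purely illustrative, and a real platform would inspect container headers rather than file extensions.

```python
# Hypothetical format-obsolescence check: flag assets whose container
# is no longer on the organisation's supported list, so they can be
# queued for automatic migration to a current format.
SUPPORTED = {".mxf", ".mov", ".mp4"}
DEPRECATED_TARGETS = {".avi": ".mp4", ".flv": ".mp4"}

def migration_plan(filenames):
    """Return (old_name, proposed_new_name) pairs for obsolete assets."""
    plan = []
    for name in filenames:
        ext = "." + name.rsplit(".", 1)[-1].lower()
        if ext not in SUPPORTED and ext in DEPRECATED_TARGETS:
            plan.append((name, name[: -len(ext)] + DEPRECATED_TARGETS[ext]))
    return plan
```

Run periodically, a check like this turns format obsolescence from a crisis in 50 years into routine background housekeeping.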

The second aspect is how capable a workflow is of scaling up. Over time, plain-old storage solutions become more difficult to manage. If it is tape based, I probably only need to mention LTO4, 5 and 6. And if it is disk based, the act of buying the latest and greatest storage creates its own problem, namely silos of storage. 10TB of storage is required, so you buy the best system for that. One year on, 100TB of storage is required – so you go through an RFP and buy another system, supposedly scalable. Another year passes and you need to store a petabyte of data. The supposedly scalable 100TB system now apparently ‘isn’t relevant’ for such ‘large’ scaling – in comes a new 1PB system. This certainly doesn’t conform to the strategy of Digital Content Governance, which should create an environment where assets are controlled with an emphasis on a defined lifetime.

Moving from HD to a 4K workflow requires 4x the bandwidth and 4x the storage.

The 4K Dimension

A combination of lost time, an inability to locate assets quickly, security issues and silos of storage adds up to lost revenue – not exactly what the storage solution and workflow were implemented to deliver.

The issues listed above are important enough, but that’s before we even start to consider the added dimension of 4K workflows. Let’s remember, that’s four times the storage requirement of HD, and therefore four times the bandwidth requirement. Finding space for the data, moving the data around, transcoding the assets – it all becomes harder to achieve and more expensive all round. So we all need to consider whether we have the best strategies in place to cope with the future demands placed on storage workflows, and with the added complexities of the numerous issues discussed in this article. Or, better still, start to invest in managing storage better rather than buying additional time-sapping, vulnerable silos of problems.
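The back-of-envelope arithmetic is easy to verify: UHD (3840x2160) has exactly four times the pixels of HD (1920x1080), so at the same frame rate and bit depth the uncompressed bandwidth – and hence the storage per hour – scales by the same factor. A quick sketch, where the 20 bits-per-pixel figure is an approximation for 10-bit 4:2:2 sampling:

```python
# Rough scaling from HD to UHD "4K": four times the pixels means four
# times the uncompressed bandwidth and storage at equal frame rate
# and bit depth.
HD = (1920, 1080)
UHD = (3840, 2160)

def pixels(res):
    return res[0] * res[1]

def uncompressed_gbps(res, fps=50, bits_per_pixel=20):
    """Approximate uncompressed bandwidth in Gbit/s.

    20 bits per pixel roughly models 10-bit 4:2:2 sampling
    (10 bits luma + 10 bits averaged chroma per pixel).
    """
    return pixels(res) * fps * bits_per_pixel / 1e9

scale = pixels(UHD) / pixels(HD)  # 4.0: the factor quoted in the text
```

On these assumptions, UHD at 50 fps works out to roughly 8.3 Gbit/s of active video against about 2.1 Gbit/s for HD – a reminder of why every transfer, transcode and backup gets four times heavier.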

Jonathan Morgan, CEO, Object Matrix.

