It is important to bring discipline to the ever-increasing demand for storage.
Storing assets is pointless if the correct procedures are not in place to manage where these assets are, keep them secure and ensure they are discoverable. In short, those who implement the bare minimum in plain storage risk missing out on the business benefits of object storage entirely.
Asking “what’s the difference between ‘storage’ and ‘digital storage governance’?” is much like questioning the difference between a house and a house with plumbing, heating and electricity. The reality is that if one cannot find a piece of content, then the asset does not exist.
We call the correct and most efficient way of storing assets Digital Content Governance, a phrase coined to mean control and oversight of all of your digital assets. So, given the number of aspects to consider when producing and storing assets, what are the key issues to keep in mind in order to create a proper storage system?
Efficiency of Workflow
Simple storage is effective for housing fragments of information, but when it comes to managing a large repository of complex assets, it becomes increasingly inefficient and potentially insecure. This is especially the case if the only tools at your disposal are disjointed filesystems, USB keys, portable drives, or multiple apps that quickly become out of date. Add to this the issues of rising file sizes and new video formats, and the situation quickly becomes a mess.
This leads to a host of time-sapping challenges, including working out where exactly in this maze of storage a particular clip resides, or what to do when storage on one device becomes full and an asset needs to be moved. Moving data shouldn’t be a complicated task that consumes half a day’s worth of valuable time in one go. Similarly, if an asset is to provide value to its owner, then it must be retrievable for use 24/7. If it takes time to find, or is not discoverable at all, then it may as well not exist.
Of course, efficiency of search cannot be discussed without outlining the importance of metadata management. When done correctly, the metadata embedded within an asset unlocks the value of that content. A good workflow should therefore automate the extraction of metadata; otherwise, all the time saved during searching merely makes up for the time spent manually managing metadata.
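As a rough illustration of what automated metadata capture might look like, the sketch below builds a searchable index from filesystem facts alone. The function and field names are invented for this example; a production system would also call a tool such as MediaInfo or ffprobe to pull deep technical metadata (codec, frame rate, duration) at ingest time.

```python
import hashlib
import json
from pathlib import Path
from tempfile import TemporaryDirectory

def extract_metadata(path: Path) -> dict:
    """Gather basic technical metadata automatically at ingest time."""
    stat = path.stat()
    return {
        "filename": path.name,
        "size_bytes": stat.st_size,
        # A checksum supports later integrity audits and deduplication.
        "checksum": hashlib.sha256(path.read_bytes()).hexdigest(),
        "format": path.suffix.lstrip("."),
    }

def build_index(folder: Path) -> list[dict]:
    """One searchable index entry per asset - no manual tagging required."""
    return [extract_metadata(p) for p in sorted(folder.iterdir()) if p.is_file()]

# Demo with a throwaway asset repository.
with TemporaryDirectory() as tmp:
    repo = Path(tmp)
    (repo / "interview_cam1.mxf").write_bytes(b"fake video payload")
    index = build_index(repo)
    # Search becomes a query over the index rather than a hunt through drives.
    hits = [a for a in index if a["format"] == "mxf"]
    print(json.dumps(hits[0]["filename"]))
```

The point of the sketch is that the index is produced as a side effect of ingest, so search costs nothing extra later.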
USB sticks, while convenient, are easily lost or damaged.
Security of Assets
Perhaps the most damaging consequence of storing assets in unsuitable environments is their sheer vulnerability to malicious attack. Cyber security is a hot topic right now, both in the general sphere and within our industry. But even in the case of non-malicious errors, unsuitable storage spaces are highly susceptible to asset loss or corruption. Disks get overwritten, USB keys are like socks – you know you had them but one by one they disappear – and a single virus or accidental mistake could wipe an entire SAN directory.
Several steps can be taken here to solve the issue of security. First, ensure access rules and retention periods are enforced to limit accidental deletions. This can be followed up by regular action audits to identify when, and by whom, a damaging action was taken. Second, every action that can be automated should be, to reduce human error in the first place. A complete workflow mindful of Digital Content Governance should practice digital preservation with multiple-copy protection, including metadata replication.
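The first two steps above can be sketched in a few lines. This is a minimal illustration only, assuming a hypothetical 90-day retention policy and invented field names: deletions inside the retention window are refused, and every attempt, allowed or not, lands in an audit log.

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=90)   # assumed policy: no deletions inside 90 days
audit_log: list[dict] = []       # every action recorded for later audits

def request_delete(asset: dict, user: str, now: datetime) -> bool:
    """Refuse deletions inside the retention window; log every attempt."""
    allowed = (now - asset["ingested"]) >= RETENTION
    audit_log.append({
        "action": "delete",
        "asset": asset["name"],
        "user": user,
        "when": now.isoformat(),
        "allowed": allowed,
    })
    return allowed

now = datetime(2024, 6, 1, tzinfo=timezone.utc)
fresh = {"name": "promo.mov", "ingested": now - timedelta(days=10)}
old = {"name": "archive.mxf", "ingested": now - timedelta(days=400)}

print(request_delete(fresh, "alice", now))  # blocked by retention
print(request_delete(old, "bob", now))      # allowed
```

Because the log records who attempted what and when, a regular audit is simply a query over `audit_log` rather than forensic guesswork.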
There are two aspects to remember when considering how futureproof a workflow and its corresponding systems really are. The media assets themselves might still be usable in 50 years’ time, but the likelihood of machines still being able to read their format will be low. With this in mind, only storage platforms that make it easy to migrate content – or can migrate it automatically – to new storage platforms or formats are truly futureproof.
The second aspect is how capable a workflow is of scaling up. Over time, plain-old storage solutions become more difficult to manage. If it is tape-based, I probably only need to mention LTO4, 5 and 6. And if it is disk-based, the act of buying the latest and greatest storage also creates its own problems, namely silos of storage. 10TB of storage is required, so you buy the best system for that. One year on, 100TB of storage is required – so you go through an RFP and buy another system, supposedly scalable. Another year passes and you need to store a petabyte of data. The supposedly scalable 100TB system now apparently ‘isn’t relevant’ for such ‘large’ scaling – in comes a new 1PB system. This certainly doesn’t conform to the strategy of Digital Content Governance, which should create an environment where assets are controlled with an emphasis on a defined lifetime.
The 4K Dimension
A combination of lost time, the inability to quickly locate assets, security issues and silos of storage all adds up to lost revenue – not exactly what the storage solution and workflow were implemented to deliver.
The issues listed above are important enough, but that’s even before we consider the added dimension of 4K workflows. Let’s remember, that’s four times the storage requirement of HD and therefore four times the bandwidth requirement. Finding space for the data, moving it around, transcoding the assets – it all becomes harder to achieve and more expensive all round. So we all need to consider whether we have the best strategies in place to cope with the future demands placed on storage workflows, and the added complexities of the issues discussed in this article. Or, better still, start to invest in how to better manage storage rather than buying additional time-sapping, vulnerable silos of problems.
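The “four times” figure follows directly from the pixel counts, and a back-of-envelope calculation makes the storage impact concrete. The numbers below assume UHD (3840×2160) versus Full HD (1920×1080) and, purely for illustration, uncompressed 10-bit 4:2:2 video at 25fps; real-world figures vary with codec and compression ratio.

```python
hd = 1920 * 1080         # Full HD frame, pixels
uhd = 3840 * 2160        # 4K UHD frame, pixels
print(uhd // hd)         # -> 4: four times the pixels per frame

# At equal bit depth, chroma subsampling and compression ratio,
# storage and bandwidth scale with pixel count.
# Example: one hour of uncompressed 10-bit 4:2:2 video at 25fps.
bits_per_pixel = 20      # 4:2:2 at 10 bits ~ 20 bits/pixel
fps, seconds = 25, 3600
hd_tb = hd * bits_per_pixel * fps * seconds / 8 / 1e12
uhd_tb = uhd * bits_per_pixel * fps * seconds / 8 / 1e12
print(f"HD: {hd_tb:.2f} TB/hour, UHD: {uhd_tb:.2f} TB/hour")
```

Whatever the codec, the 4:1 ratio between the two lines persists, which is why every link in the chain – storage, network and transcode farm – feels the jump at once.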
Jonathan Morgan, CEO, Object Matrix.