Today’s archive management systems oversee assets stored on all of the various storage options within a media organization.
As the number of assets at broadcast production facilities continues to grow, the need for more storage to save and repurpose them has increased exponentially. For example, PGA Tour Entertainment, the production arm of the PGA Tour (based in St. Augustine, Fla.), maintains an archive of more than 150,000 hours of digitized content (some 5 petabytes of archived footage). That’s a lot of storage capacity, and it’s still growing as you read this.
This massive increase in storage requirements has led facility managers to develop a wide array of file migration strategies, whereby current or “active” content is stored on expensive solid-state (NVMe) or object storage arrays while older files are moved to a less costly tape-based library. In fact, most broadcast storage infrastructures feature four distinct tiers: online, near-line, archive and offline. The benefit of this tiered storage model is that storage cost decreases significantly as content migrates from online to near-line, from near-line to archive and, finally, from archive to offline storage.
In the past, a human (or a team of people) was responsible for developing and administering a migration plan, often having to physically eject bad discs or tapes, a chore that ate up hours of time and resources. Today, most large systems have moved from on-premises-only deployments to cloud-based infrastructure or hybrid cloud-local systems. This gives users the tools they need to find and use lightweight proxy or native-resolution media from any number of local or cloud storage and archive systems.
An ideal storage archive in today’s world is one that can be automatically managed while allowing users to take advantage of the most applicable storage for their different types of assets, at different stages in their lifecycle.
Companies that serve the media and entertainment space now offer software that automatically performs the migration process on a user-defined timetable, using a single web-based dashboard that oversees all of the various storage locations. Once a migration completes, individual files remain easily accessible to any member of the production team, no matter where the file physically resides. Once the administrator defines policies, the software manages migration between storage tiers automatically with little or no human interaction. These migration policies can specify the movement of a file between storage tiers when it has not been accessed for a certain period of time, has a specific file extension or meets other general parameters.
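At its core, this kind of policy engine is a scan of file metadata against administrator-defined rules. The sketch below is purely illustrative (the policy fields, tier names and function are hypothetical, not any vendor’s actual API); it shows how an idle-time and file-extension rule might select candidates for migration:

```python
import os
import time
from dataclasses import dataclass

@dataclass
class MigrationPolicy:
    """A user-defined rule for moving files between storage tiers."""
    max_idle_days: int    # migrate if not accessed for this long
    extensions: tuple     # only consider these file types; () = any
    source_tier: str      # e.g. "online"
    target_tier: str      # e.g. "archive"

def files_to_migrate(root, policy, now=None):
    """Scan a storage tier and return paths matching the policy."""
    now = now or time.time()
    cutoff = now - policy.max_idle_days * 86400
    matches = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if policy.extensions and not name.endswith(policy.extensions):
                continue
            path = os.path.join(dirpath, name)
            if os.stat(path).st_atime < cutoff:  # last access time
                matches.append(path)
    return matches
```

A scheduler would run this scan on the administrator’s timetable and hand the resulting list to a mover process, which is where a commercial product adds checksumming, retries and reporting.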
A new generation of storage management software helps users reduce storage costs and develop a consistent file migration strategy.
In addition, if there is a malware or other cyber attack, a carefully planned migration strategy employing this software can minimize the damage by rolling back to a previously archived version of any file.
“As data volumes are growing, it’s more important than ever for media organizations to get their archiving strategies correct,” said Jason Coari, director of product marketing at Quantum. The company offers its StorNext file system, which includes one or more Xcellis high-availability storage controllers or server nodes. “The file system is a layer of software that provides accessibility for users to access their data, and the file system keeps track of the metadata associated with each piece of content. Therefore, when a user stores a piece of data, it knows where that data lives.”
The StorNext file system ties all of the storage systems within a facility (data in the cloud, disk, including NAS and object storage, and LTO tape) together and makes them available under a single user interface. This way, no matter where the data sits, the user can see the copy right in front of them and use it. StorNext allows the customer to make whatever hardware and storage media decisions are best suited to their specific use case.
“Remotely located creative teams are collaborating more these days,” Coari said. “In large organizations, any one piece of content is probably being touched by multiple people and multiple teams. They’ll need slow storage for offline work and fast storage for rapid turnaround projects or workflows, but the assets have to be available at all times, no matter where they are stored.”
StorNext also automates policies based on time: if a user does not touch a piece of data for a pre-determined amount of time, that data is automatically migrated to a lower storage tier and the primary copy is truncated.
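Truncation in this context generally means the file’s data blocks are released from primary storage once a copy exists on a lower tier, while its directory entry remains visible to users. The following is a simplified, hypothetical sketch of that idea (real tiering file systems do this transparently below the file system layer; the function name and behavior here are illustrative only):

```python
import os
import shutil

def truncate_to_stub(path, archive_dir):
    """Copy a file to lower-tier storage, then release its
    primary-tier blocks, leaving a zero-length stub in place.
    Illustrative only; not how any specific product works internally."""
    dest = os.path.join(archive_dir, os.path.basename(path))
    shutil.copy2(path, dest)      # archive copy must exist first
    with open(path, "r+b") as f:
        f.truncate(0)             # free the primary-tier blocks
    return dest
```

The key ordering constraint, which the sketch preserves, is that the lower-tier copy is verified to exist before any primary-tier capacity is reclaimed.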
The StorNext file system can include one or more Xcellis high-availability storage controllers or server nodes that provide the gateway to the premium tier and long-term storage archives.
“That frees up time for your storage admin because if they had to move the data themselves like in the past, they would be inundated and probably lose their minds trying to keep up,” he said. “And creative people don't want to waste time determining where they should put the data. It just needs to be in the right place at the right time.”
Minimizing Primary Storage Reduces Costs
Last month Spectra Logic introduced lower-cost storage management software called StorCycle (starting at $7,200 for a one-year, three-user license) that is designed to automatically migrate assets. StorCycle allows organizations to create a new “Perpetual Tier” of storage, reducing the overall cost of storing data by up to 70 percent, according to the company, while giving users full access to their data. StorCycle can be implemented as standalone software using public cloud storage or existing network-attached storage, or combined with Spectra storage hardware to create a complete storage solution.
The software automatically scans primary storage data for inactive files and migrates them to a secure Perpetual Tier, which can include any combination of cloud storage, object storage disk, network-attached storage (NAS) and tape. This ensures that the data is protected, while making it easily available to end users.
Jeff Brunstein, director of product management at Spectra Logic, said that a smaller primary storage tier shrinks backup and recovery windows, reduces costs and increases overall performance. In addition to time-based rules, the software also allows users to set file-size thresholds to conserve storage space, perhaps saving a file at a proxy resolution.
“The software is looking at the last access dates of the file, or whatever the user chooses to pre-determine,” Brunstein said. Spectra Logic also sells its Spectra storage arrays in a variety of capacities. “Then it’s going to move all of those files, based on the policies that the user sets. So, a user can say ‘I want to move all files that haven’t been accessed in two years and are greater than a certain file size to a long-term, more cost-effective storage medium.’ Or ‘I want that job to run every month or every quarter,’ and StorCycle will do so without anyone having to touch it.”
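Brunstein’s example rule, age combined with a size threshold, reduces to a simple predicate over file metadata. This sketch is illustrative only (StorCycle itself is configured through its own interface; the thresholds and function here are hypothetical):

```python
import os
import time

TWO_YEARS = 2 * 365 * 86400          # seconds
MIN_SIZE = 500 * 1024 ** 2           # hypothetical 500 MB threshold

def eligible(path, now=None):
    """True if a file matches the example policy: not accessed in
    two years AND larger than the size threshold."""
    st = os.stat(path)
    now = now or time.time()
    return (now - st.st_atime) > TWO_YEARS and st.st_size > MIN_SIZE
```

A recurring job (monthly or quarterly, as in the quote) would evaluate this predicate across primary storage and queue the matches for migration.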
Hybrid cloud infrastructures are proving to be a cost-effective way to make assets available while minimizing storage costs.
Once installed on a COTS server, StorCycle stores the data in an open format (LTFS), so that if users want to share those files with someone else, they can.
“Eventually, we plan to go the other way,” he said. “This way, if you have files that were generated off a camera in an LTFS format, we can easily ingest that file and bring them into the StorCycle environment. Then you can copy files and store them both to your on-premises storage array as well as off-site to the cloud, as a good way to have redundancy and ensure no material ever gets lost.”
The other option, he said, is for a user who wants to keep the archive local on lower-cost storage, such as local disc or tape, while also making a cloud copy. The benefit is that the cloud copy is available for sharing or protection, but when they want a file back they won’t incur cloud retrieval costs because they still have the local copy.
“So it’s a good combination for security and saving on storage costs,” Brunstein said.
Masstech offers its own storage and asset lifecycle management platform, called Kumulate. It’s a modular solution, comprising workflow automation, transcode, and storage. A web-based UI enables search and restore, and the ability to generate, view and manipulate proxies. In day-to-day usage, Kumulate optimizes content storage, deploying workflows that are defined by the user (using asset age, usage frequency, restore profiles and other attributes) to automatically move content across storage tiers so that assets are always stored in the most cost-efficient locations.
Kumulate can also make multiple copies in different locations for disaster recovery and business continuity, and has an integrated transcode engine that automatically delivers different versions to specific locations. Kumulate also optimizes content migrations, from tape version to tape version, or from tape to cloud, or across any storage tiers, and even from legacy or outdated hardware and management systems to modern technology platforms. All of this is managed through the Kumulate UI, or from a connected MAM.
Masstech’s Kumulate software features a web-based UI that enables search and restore and the ability to generate, view and manipulate proxy files in order to save storage capacity.
“The benefits for customers are many, but the key to it all is peace of mind,” said Nik Forman, director of Marketing & Partnerships at Masstech. “Our solutions mean that users don’t have to worry about where their content is stored; they know that assets will always be in the right place, at the right time and in the right format. Storage costs will be as low as they can be, and they don’t have to get involved in any of the heavy lifting, it all happens automatically.”
To be most cost-efficient, the goal of all storage administrators is to have primary storage consist only of active data, which could also make that primary storage run faster. With a smaller set of active data, an organization can more realistically move it to a faster tier of storage. That could be an NVMe solid-state array or fast 3.5-inch spinning disks.
These storage management solutions are not cheap—some costing hundreds of thousands of dollars—but they can be justified because they automate a process that is only getting more complex and labor intensive as the archive grows.
“These days we’re actively encouraging operators to model their storage costs as a factor of their revenue,” Forman said. “So, for example, how much does a show, or a season, bring in? And how much are you spending per GB of that show to process, edit, and of course store it? Using these metrics allows you to truly gauge return on investment. When you look at it in these terms, intelligent storage management software that saves huge amounts of operator time and effort justifies itself.”
Indeed, deploying these storage management solutions goes a long way to saving on storage costs while maintaining the freedom to access data files on demand.
“If I am CFO of a company, I care about the productivity of my staff as well as the cost of storage, so how can I make that end-to-end environment as economical as possible?” said Quantum’s Coari. “That means making sure that your archive has a good balance of capacity versus storage costs. If I’m a member of the production team, I want to know that I can get my work finished as quickly as possible. It’s that simple.”