On-Line, Near-Line Or Archive: Just How Should Data Be Stored?

Any discussion of media storage relies on four generic terms: on-line, near-line, archive and off-line. What storage technology is best suited to each task?

Let’s set some general definitions for our discussion of data storage. On-line means the information is immediately usable for the required purpose. The same data could be on-line for transcoding while at the same time being off-line for editing. In other words, a storage term like “on-line” cannot be used unless an application is also specified.

Off-line has covered a range of loosely defined usage scenarios for the last 10 to 20 years. For me it has always meant “Damn it, I have to wait for something to happen before I can do what I really want to do.” Obviously, such interruptions in a workflow, to the extent that KPIs are not met, require the media to be less “off-line”.

Types of storage

Why do we need a third definition of storage usage scenarios? If something is “on-shelf”, is it not simply further off-line? I would like to propose that we define the difference in terms of human interaction.

Off-line can become on-line as an automated process; on-shelf storage requires a human to fetch and load the media. The cheapest, and in some sense the most secure, storage is on the shelf, preferably kept in two geographically separated, disaster-protected locations. Deciding what to put where becomes easier if we know that all three kinds of storage are available. Just to be clear, these distinctions still have value even when provisioning from the cloud.

Localized storage typically consists of multiple HDs installed in rack mounts of various sizes and configurations. Shown here is a Facilis TX16 storage system.

Today’s storage systems are virtually all disk based. Solid-state drives are available, and RAM is used as well, but both provide insufficient storage capacity and are more expensive than rotating disk, especially for media projects. So what are the key differences between types of storage and how do we judge their performance?

The criteria for data storage are Permanence, Availability, Scalability and Security. Making an acronym, we get P.A.S.S.

  • Permanence means that data is never lost.
  • Availability means that the user/application requirements for access/performance are met.
  • Scalability defines the ease of meeting changing requirements.
  • Security defines the granularity and durability of access privileges.

Providing storage with the best technologically available P.A.S.S., regardless of application, is going to be prohibitively expensive. Therefore we need to match the storage to the application. That means trading storage performance and availability against cost, which in turn means we have to maintain multiple types of storage.
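
To make that matching concrete, here is a minimal sketch, in Python, of how per-application P.A.S.S. requirements might be scored against the available tiers and the cheapest adequate tier selected. All tier names, scores and costs below are illustrative assumptions, not vendor figures.

```python
# Minimal sketch: pick the cheapest storage tier that satisfies an
# application's P.A.S.S. requirements. Scores run 1 (lowest) to 5 (highest);
# every name, score and cost here is an illustrative placeholder.
from dataclasses import dataclass


@dataclass
class Tier:
    name: str
    permanence: int
    availability: int
    scalability: int
    security: int
    cost_per_tb_month: float


TIERS = [
    Tier("on-line SAN",   4, 5, 3, 4, 30.00),
    Tier("near-line NAS", 4, 3, 4, 4, 10.00),
    Tier("LTO archive",   5, 2, 5, 4,  1.00),
    Tier("on-shelf",      5, 1, 5, 5,  0.20),
]


def cheapest_tier(permanence: int, availability: int,
                  scalability: int, security: int) -> Tier:
    """Return the lowest-cost tier meeting every P.A.S.S. requirement."""
    candidates = [
        t for t in TIERS
        if t.permanence >= permanence and t.availability >= availability
        and t.scalability >= scalability and t.security >= security
    ]
    if not candidates:
        raise ValueError("no tier satisfies these requirements")
    return min(candidates, key=lambda t: t.cost_per_tb_month)


# A grading project needs top availability; a finished programme does not.
print(cheapest_tier(4, 5, 3, 4).name)  # -> on-line SAN
print(cheapest_tier(5, 1, 3, 4).name)  # -> on-shelf
```

The exact scores matter far less than the principle: the application’s requirements, not the technology, drive the choice.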

Cloud storage gives us the advantage of bespoke storage without the additional overhead. Generic cloud storage enables economies of scale where multiple clients are served from a single enterprise storage system. But you still need to ask: will the cloud provider at some point off-load your deep archive and ship it to Iron Mountain? And can you even afford to keep all of your projects on expensive, always-ready, on-line storage?

Let’s assume that we have determined what P.A.S.S. we need for each application used within our acquisition, post-production and distribution pipeline. How do we determine when to move the data to less expensive storage? Changing technology (lower prices) means that this decision is always open for reinterpretation. It all comes back to KPIs: if we can achieve the KPI while moving the data to less expensive storage, then do it!
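
As a sketch of that rule, the following Python fragment picks the least expensive tier whose retrieval time still meets a hypothetical KPI, expressed here as a maximum acceptable time-to-restore. The retrieval times and relative costs are assumptions for the example only.

```python
# "Move it if the KPI still holds": choose the least expensive tier whose
# retrieval time meets the KPI. Times and relative costs are illustrative.
TIERS = [
    # (name, typical retrieval time in hours, cost relative to on-line)
    ("on-line",   0.0,  1.000),
    ("near-line", 0.25, 0.330),
    ("archive",   4.0,  0.010),
    ("on-shelf",  24.0, 0.003),
]


def cheapest_tier_meeting_kpi(max_retrieval_hours: float) -> str:
    """Least expensive tier whose retrieval time satisfies the KPI."""
    meets_kpi = [t for t in TIERS if t[1] <= max_retrieval_hours]
    return min(meets_kpi, key=lambda t: t[2])[0]


print(cheapest_tier_meeting_kpi(0.5))   # -> near-line
print(cheapest_tier_meeting_kpi(48.0))  # -> on-shelf
```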

Match workflows to KPIs

An automated workflow should make data migration between types of storage a transparent background process. This works because the task requirements are anticipated and built into the system.

Grading systems arguably have the highest availability requirements in the production process, but does the storage used for this application have to be mirrored? Do you have to keep all current projects on it simultaneously? Or can you live with working on just one project on-line at a time? Tradeoffs can be made to provide a sufficient level of availability, redundancy and capacity. After all, the redundancy required for the safe operation of a nuclear power plant may be excessive for the production of the nightly news!

Diagram illustrates a typical broadcast workflow using Isilon technology. A modern storage platform allows users to adapt available storage to project needs—all without the complexity of becoming a storage infrastructure expert.

Correctly designed workflow management systems acquire the information needed to anticipate data access requirements and move the data to where it will be needed in a timely manner. This can even include an automatic order to retrieve backups from Iron Mountain in advance, so that the material is in place when needed for post. Fortunately, today’s workflow management solutions are so sophisticated that they can anticipate virtually all of your production storage needs and automatically retrieve and move the data to where it’s needed without human intervention.
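
One way to picture that anticipation is to work backwards from the production schedule: given when an asset is needed and how long its current tier takes to restore, the system can compute when the restore order has to be issued. The lead times and safety margin in this Python sketch are illustrative assumptions.

```python
# Schedule-driven pre-fetch sketch: compute the latest moment to trigger a
# restore so the material is on-line before it is needed. Lead times and the
# safety margin are illustrative assumptions.
from datetime import datetime, timedelta

RESTORE_LEAD_TIME = {
    "near-line": timedelta(minutes=30),
    "archive":   timedelta(hours=4),
    "on-shelf":  timedelta(hours=24),  # e.g. a courier recall from deep storage
}


def restore_order_time(needed_at: datetime, tier: str,
                       safety_margin: timedelta = timedelta(hours=2)) -> datetime:
    """Latest time to issue the restore so the asset is in place on time."""
    return needed_at - RESTORE_LEAD_TIME[tier] - safety_margin


edit_start = datetime(2024, 6, 3, 9, 0)
print(restore_order_time(edit_start, "on-shelf"))
# -> 2024-06-02 07:00:00, i.e. the recall goes out more than a day ahead
```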

Storage costs continue to drop. But, that doesn’t mean you should chase them. Shown here is a Samsung 1TB SSD, which today costs about $400. A 1TB HD may cost less than $50.

Exact pricing for each storage option is a moving target; however, the relationship between the options should remain essentially the same.

Off-line storage costs about one-third as much as on-line storage, and that is without geographic replication. Using LTO-6 and 3TB cassettes at 50 cents per tape per month makes archive physical storage cost about 1/100 of off-line costs. The latter comparison is, of course, unfair as it does not include the cost of the tape itself or the additional cost of physical retrieval.
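
Putting those ratios into rough per-terabyte terms, and treating the results as order-of-magnitude figures only, a quick calculation looks like this:

```python
# Worked example using only the ratios quoted above. The shelf figure comes
# from the 50-cent, 3TB LTO-6 cassette; the off-line and on-line values are
# back-calculated from the stated 1/100 and 1/3 ratios, so they are rough
# illustrations rather than vendor pricing.
shelf_per_tb = 0.50 / 3               # ~$0.17 per TB per month on the shelf
offline_per_tb = shelf_per_tb * 100   # shelf is ~1/100 of off-line
online_per_tb = offline_per_tb * 3    # off-line is ~1/3 of on-line

print(f"shelf:    ${shelf_per_tb:.2f}/TB/month")    # ~$0.17
print(f"off-line: ${offline_per_tb:.2f}/TB/month")  # ~$16.67
print(f"on-line:  ${online_per_tb:.2f}/TB/month")   # ~$50.00
```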

However, the extreme discrepancy between automated retrieval taking hours and manual retrieval taking days leaves room for a new service offering 24-hour retrieval of on-shelf storage. When thinking about the viability of shelf storage, remember that Disney destroyed the 4K data used for the latest release of Snow White and keeps only the physical separations!
