Cloud Storage - Object Matrix Has A View

In the latest of our opinion articles about the role of storage in cloud workflows, Jonathan Morgan, CEO of Object Matrix, suggests that if your business needs to use data on a daily basis then the data should be kept wherever connectivity is best – rather than necessarily migrating to the cloud.

BroadcastBridge: How might migration to cloud storage help broadcasters and pay-TV operators achieve their strategic technology objectives and business goals?

Object Matrix: There are clear benefits to the cloud in the area of shared resources, not only in terms of hardware (e.g., public cloud CPU farms and network connectivity) but also in terms of skills: facilities no longer need to employ skilled hardware technicians and can instead spend resources on people who add value to the business. Further obvious benefits can be seen with public cloud: if you are going to deliver OTT, then gravitating your data to where the playout servers are is a massive benefit; and if the workflow is a post-production one with contributors spread worldwide, then keeping the data in the cloud may be the best solution for connectivity.

However, public cloud storage differs in one important way from other shared resources: the storage you use is likely to be utilised 100% of the time, so there is little saving to be had from sharing it. If a broadcaster is considering moving a petabyte archive to the cloud, it may well be that the benefits of having someone else look after the data are outweighed by the sheer cost of the operation. Currently, storing a petabyte in Amazon S3 for five years, ignoring egress fees (which could be very high if the data is used regularly), costs approximately $1.4m; less expensive options are available only where the data will rarely need to be accessed.
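As a rough check on that figure, here is a back-of-the-envelope sketch, assuming a flat S3 Standard list price of about $0.023 per GB per month (the 2018-era published rate) and ignoring request, tiering and egress charges:

```python
# Back-of-the-envelope estimate of storing 1 PB in Amazon S3 for five years.
# Assumes a flat S3 Standard rate of ~$0.023/GB/month and ignores egress,
# request charges and any volume discounts.
PRICE_PER_GB_MONTH = 0.023       # USD, assumed list price
CAPACITY_GB = 1024 ** 2          # 1 PB expressed in GB (binary units)
MONTHS = 5 * 12

total_cost = PRICE_PER_GB_MONTH * CAPACITY_GB * MONTHS
print(f"Estimated five-year cost: ${total_cost:,.0f}")  # roughly $1.45m
```

That simple sum lands close to the $1.4m quoted above; colder storage tiers bring the number down sharply, but only for data that is rarely retrieved.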

There is another option, and that is private cloud: a service provider delivers a data storage solution to you on hardware that you own. This can provide all the advantages of a public cloud offering (ease of use and management, connectivity, etc.) but with a more appropriate cost and ownership model.

Most workflows can be run in the cloud; however, some, such as colour grading or high-resolution editing, do not generally work well in the public cloud.

What reservations do broadcasters and operators have around using public cloud storage…and are these reservations justified?

There are real and present dangers in public cloud. A popularly discussed risk is security. Take, for instance, the recent case of ABC in Australia leaving its archive publicly accessible in the cloud. The problem isn’t that a mistake was made (that can always happen) but that, because the mistake was made in such a public place, anyone could access the storage archive. If data is in the cloud, it’s there to be shot at.

Secondly, there is always the danger that the company storing the information will change its business practices or pricing, or that the corporation that owns the data will want to get it all back in a hurry. Is that easy to do? How long would it take to recover a 25-petabyte archive? Is it conceivable that a large service provider could go away, have a catastrophic failure or put its prices up?
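To put that recovery question into numbers, here is a minimal sketch, assuming the archive comes back over a single dedicated link that is fully saturated, and ignoring protocol overhead, egress throttling and retrieval fees:

```python
# Rough estimate of how long pulling a 25 PB archive back would take over a
# single dedicated link, assuming the link is fully saturated and ignoring
# protocol overhead, egress throttling and retrieval fees.
ARCHIVE_BITS = 25 * (1024 ** 5) * 8          # 25 PB in bits (binary units)

for gbps in (1, 10, 100):                    # assumed sustained link speeds
    seconds = ARCHIVE_BITS / (gbps * 1e9)
    print(f"{gbps:>3} Gbit/s: ~{seconds / 86400:,.0f} days")
```

Even at a sustained 10 Gbit/s that works out to roughly 260 days of continuous transfer, which is why the exit strategy deserves as much thought as the migration itself.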

But perhaps the main reservation is a much more mundane, if tangible, one: if your business needs to use the data on a daily basis, then the data should be kept where it is needed, with the best connectivity. Simply put, if the data is in the cloud, is the bandwidth to that data going to be too expensive for it to make sense at this point in time?
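As an illustration of that bandwidth question, here is a hedged sketch, assuming a working set pulled out of the cloud every day and an egress price of roughly $0.05 per GB (a plausible high-volume 2018-era rate; actual pricing varies by provider and tier):

```python
# Illustrative monthly egress bill for a workflow that pulls part of its
# archive out of the cloud every day. Both the $0.05/GB egress rate and the
# 5 TB daily working set are assumptions for illustration only.
EGRESS_PER_GB = 0.05         # USD, assumed
DAILY_PULL_TB = 5            # assumed daily working set
DAYS_PER_MONTH = 30

monthly_cost = DAILY_PULL_TB * 1024 * EGRESS_PER_GB * DAYS_PER_MONTH
print(f"Estimated monthly egress: ${monthly_cost:,.0f}")  # ~ $7,680
```

If the same working set lives on storage next to the people and machines that use it, that recurring bill simply does not exist.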

To what extent can cloud storage be virtualized in isolation from other core processing functions like ingest, playout and encoding?

There is no magic to the cloud: at the end of the day, it is just someone else’s computer. Therefore, the same challenges exist in the cloud as on premises: setting up workflows, isolating functions so they do not interfere with each other, and ensuring the right workflows can collaborate with one another.

What changes in practice does cloud storage migration mean for in-house engineering/operations teams?

Certainly, DevOps teams will want solutions that take away the pain of managing storage and will prefer services that provide the solution for them. I truly believe that, once better and more robust solutions are in place, we will look back in horror at the days of poor interoperability between software packages and solutions, and wonder at all the countless hours that were wasted doing upgrades and fixes. Or at least, we will let others take the pain whilst we concentrate on adding value to our core businesses.

What does Object Matrix offer and what makes it different from other cloud storage offerings? What MAM systems does it work best with?

Object Matrix provided the first AI-ready private cloud solution based on object storage in the media world and has continued to build upon that solution since its inception. It was tough in the early days explaining why people needed object storage, but now that the cloud has arrived, people understand!

Where Object Matrix excels is in bridging the gap from media applications (e.g., Avid Interplay/Media Central) through to on-premises or off-premises storage, private or public cloud. Object Matrix focuses on Digital Content Governance (DCG): managing, storing, securing and indexing your assets for you, so you can focus on your day-to-day work.

Our solution is MAM-agnostic, but we have qualified against Cantemo Portal, Vidispine (at API level), VSN, Tedial, CatDV, Masstech, Axel, etc.

Who is using this (can you give us a specific reference)?

Object Matrix has well over 100 major installations with a who’s who of the media world, including the BBC, BT, ITN, Orange, Sky, Fox, NBC Universal, TV Globo (Brazil) and many more.

How are Object Matrix developing cloud storage in 2018?

Object Matrix has big plans in 2018 for cloud storage with both its private on-premises and private off-premises offerings. Three of the biggest moves forward are: more AI in the cloud, making it possible to search images and video clips across your archives almost instantly; more control over where you keep your assets, for instance keeping low-res searchable copies on-premises and high-res copies on more appropriate platforms; and extension of our MatrixStore as a Service platform to more territories and customers.
