Object Matrix has big plans in 2018 for cloud storage.
In the latest of our opinion articles on the role of storage in cloud workflows, Jonathan Morgan, CEO of Object Matrix, suggests that if your business needs to use data on a daily basis, then the data should be kept wherever connectivity is best – rather than necessarily migrated to the cloud.
BroadcastBridge: How might migration to cloud storage help broadcasters and pay-TV operators achieve their strategic technology objectives and business goals?
Object Matrix: There are clear benefits to the cloud in the area of shared resources, not only in terms of hardware (e.g., public cloud CPU farms and network connectivity) but also in terms of skills: facilities no longer need to employ skilled hardware technicians and can instead spend resources on people who add value to the business. Further obvious benefits can be seen with public cloud. If you are going to deliver OTT, then gravitating your data to where the playout servers are is a massive benefit. If the workflow is a post-production one and the contributors to it are worldwide, then keeping the data in the cloud may be the best solution for connectivity.
However, public cloud storage differs in one big way from other shared resources: the storage you use is likely to be utilised 100% of the time. So if a broadcaster is considering moving a petabyte archive to the cloud, the benefits of having someone else look after the data may well be outweighed by the sheer cost of the operation. Currently, storing a petabyte in Amazon S3 for five years, ignoring egress fees (which could be very high if the data is used regularly), costs approximately $1.4m; less expensive options are available only where the data won't need to be accessed very often.
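A quick back-of-envelope calculation shows where a figure of that order comes from. The sketch below assumes a simplified flat S3 Standard rate of roughly $0.023 per GB per month (an assumption for illustration; real pricing is tiered and excludes egress and request fees):

```python
# Back-of-envelope estimate of public cloud storage cost for a 1 PB archive.
# Assumes a flat rate of ~$0.023/GB/month (simplified; actual S3 pricing is
# tiered, and egress and request fees are ignored entirely).

PRICE_PER_GB_MONTH = 0.023   # USD, assumed flat rate
GB_PER_PB = 1024 ** 2        # 1 PB = 1,048,576 GB
MONTHS = 5 * 12              # five years

total_cost = GB_PER_PB * PRICE_PER_GB_MONTH * MONTHS
print(f"5-year storage cost for 1 PB: ${total_cost:,.0f}")
# roughly $1.45m, in line with the figure above
```

Archive tiers such as Glacier bring the per-gigabyte rate down considerably, but, as noted above, only at the cost of slower and more expensive retrieval.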
There is another option, and that's private cloud. Private cloud is where a service provider delivers a data storage solution to you on hardware that you own. This can provide all the advantages of a public cloud offering (ease of use and management, connectivity, etc.) but with a more appropriate cost and ownership model.
Most workflows can be stored in the cloud; however, some, such as colour grading or high-resolution editing, do not generally work well in public cloud.
What reservations do broadcasters and operators have around using public cloud storage…and are these reservations justified?
There are real and present dangers in public cloud. One popularly discussed risk is security. Take, for instance, the recent case of the ABC in Australia leaving its archive publicly accessible in the cloud. The problem isn't that a mistake was made - mistakes can always happen - but that, because it was made in a very public way, anyone could access the storage archives. If data is in the cloud, it's there to be shot at.
Secondly, there is always the danger that the company storing the information will change its business practices or pricing, or that the organisation whose data it is will want to get it all back in a hurry. Is that easy to do? How long would it take to recover a 25 petabyte archive? Is it conceivable that a large service provider could go away, have a catastrophic failure or put its prices up?
But perhaps the main reservation is a much more mundane, if tangible, one: if your business needs to use the data on a daily basis, then the data should be kept where it is needed for the best connectivity. Simply put, if the data is in the cloud, is the bandwidth to that data going to be too expensive to make sense at this point in time?
To what extent can cloud storage be virtualized in isolation from other core processing functions like ingest, playout and encoding?
There is no magic to the cloud - at the end of the day, it is just someone else's computer. The same challenges therefore exist in the cloud as on premises: setting up the workflow, isolating functions so they don't interfere with each other, and ensuring the right workflows can collaborate with each other.
What changes in practice does cloud storage migration mean for in-house engineering/operations teams?
Certainly, DevOps teams will want solutions that take away the pain of managing storage and will prefer services that provide the solution for them. I truly believe that once better and more robust solutions are in place, we will look back in horror at the days of poor interoperability between software packages and solutions, and wonder at all the countless hours wasted doing upgrades and fixes. Or at least, we will let others take the pain whilst we concentrate on adding value to our core businesses.
What does Object Matrix offer and what makes it different from other cloud storage offers? What MAM systems does it work best with?
Object Matrix provided the first AI-ready private cloud solution based on object storage in the media world and has continued to build upon that solution since its inception. It was tough in the early days explaining why people needed object storage, but now that cloud has arrived, people understand!
Where Object Matrix excels is in bridging the gap from media applications (e.g., Avid Interplay/Media Central) through to on-premises or off-premises storage, private or public cloud. Object Matrix focuses on Digital Content Governance (DCG): managing, storing, securing and indexing your assets for you, so you can focus on your day-to-day.
Who is using this? Can you give us a specific reference?
Object Matrix has well over 100 major installations with a who's who of the media world, including, to name a few: BBC, BT, ITN, Orange, Sky, Fox, NBC Universal and TV Globo (Brazil).
How are Object Matrix developing cloud storage in 2018?
Object Matrix has big plans in 2018 for cloud storage with both its private on-premises and private off-premises offerings. Three of the biggest moves forward include: more AI in the cloud, enabling near-instant search on images and video clips across your archives; more control over where you keep your assets, for instance keeping low-res searchable copies on-premises and high-res copies on more appropriate platforms; and extension of our MatrixStore as a Service platform to more territories and customers.