Virtualization and cloud computing are taking us back to the architecture of the mainframe era. A thin client, maybe just a browser, provides the user interface, and the processing takes place at a central location. The co-location of processing and media storage gives the performance needed in a post-production environment. The lightweight, network-connected client gives producers new flexibility as to where users can access their content. Production is a global business, and producers need the ability to work wherever suits the needs of the production. An associated download from L.A. post house DigitalFilm Tree describes their deployment of Avid Media Composer for global production services.
The mainframe restricted access to computing, but the development of the personal computer kicked off dispersed computing and, with it, a general revolution, with computing penetrating all aspects of business. An office computer could host word processing and spreadsheets, with a central server used for backup and archive. This model spread throughout businesses, consigning the mainframe to specialist supercomputing tasks like weather forecasting.
Desktop computing had the advantage that a low-bandwidth network could be used; 10 Mbit/s Ethernet was common. For the media sector, tasks like video editing used very large files, and typical office networks just weren't suited to shared storage. Production is a collaborative process: files have to be moved around. Technologies like Fibre Channel and storage area networks (SANs) went part way to enabling shared storage.
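A rough transfer-time calculation makes the gap concrete. The file size, link speeds, and efficiency factor below are illustrative assumptions, not measured figures:

```python
# Rough transfer-time comparison for a media file over shared networks.
# All figures are illustrative assumptions, not vendor specifications.

def transfer_seconds(file_gb: float, link_mbps: float, efficiency: float = 0.6) -> float:
    """Time to move a file of file_gb gigabytes over a link_mbps link,
    assuming only `efficiency` of the raw bandwidth is usable in practice."""
    bits = file_gb * 8e9                      # gigabytes -> bits
    usable_bps = link_mbps * 1e6 * efficiency
    return bits / usable_bps

clip_gb = 50  # assumed size: roughly an hour of mezzanine-quality video
for mbps, label in [(10, "10 Mbit Ethernet"), (1_000, "1 GbE"), (10_000, "10 GbE")]:
    print(f"{label}: {transfer_seconds(clip_gb, mbps) / 60:.1f} minutes")
```

On these assumptions the old office network needs the better part of a day to move one clip, while 10 GbE does it in about a minute, which is why modern Ethernet changed the shared-storage picture.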
The advances in Ethernet bandwidths have changed all that. Although networks and switches don’t have infinite bandwidth, they do have very large capacities capable of satisfying most of the needs of the media sector.
The big question is: where do you put the large files? The obvious place is close to the processing. That gives two options: the traditional route is a workstation with direct-attached storage; the more recent option is to put the media files in a data center and have remote access from a thin client or via KVM technology. This puts us on the road back to the original thin client/mainframe model, but with cloud computing replacing the mainframe.
Nothing Stays Still
As resolutions and frame rates continue to increase, the demands on edit workstations for processing power grow. Workstations can have tens of processor cores and multiple GPUs. However, does this represent efficient utilization of computing resources? This power is needed to render out a grade, but what about when the edit bay isn't in use?
Another issue with dispersed workstations—and this applies to office applications as well—is maintenance. Although there are tools to assist, keeping many workstations up to date with the current operating systems and applications software is a chore. Security is a problem with so many access points for insecure USB thumb drives and the like. In the post environment, there are additional tasks like licence management for plugins.
Post houses have traditionally put all the noisy chassis in a central equipment room along with shared storage and used KVM to remote to the edit bays. However, the equipment room still had racks of individually assigned workstations. The alternative is the data center approach. The applications run in a virtualized environment on racks of generic processors, possibly blades.
Maintenance becomes much simpler: software updates can be managed more easily, and hardware failures are dealt with via a blade swap. There are other advantages: the processing resources become a pool, with the levelling effect of the pool meaning better utilization of the hardware. Processor-intensive tasks like transcoding, which may not be time-sensitive, can be scheduled to utilize slack resources.
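The scheduling idea can be sketched with a simple priority queue: time-sensitive jobs always run first, and deferrable transcodes soak up whatever capacity is left. The job names and priority values are hypothetical, purely for illustration:

```python
import heapq

# Minimal sketch of pooled-resource scheduling: lower number = higher priority.
# Time-sensitive work (a grade render) runs before deferrable transcodes,
# which fill the slack when the pool is otherwise idle.

jobs = []  # min-heap ordered by (priority, submission order)

def submit(name: str, priority: int) -> None:
    heapq.heappush(jobs, (priority, len(jobs), name))

submit("render_grade", 0)       # time-sensitive: highest priority
submit("transcode_dailies", 9)  # deferrable: runs when resources are slack
submit("conform_episode", 1)

order = [heapq.heappop(jobs)[2] for _ in range(len(jobs))]
print(order)
```

Real pool schedulers add preemption, resource quotas, and deadlines, but the principle is the same: the pool, not the individual workstation, decides what runs when.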
Post in the Cloud
There are many approaches to the virtualization of applications in a cloud environment. The public cloud providers like AWS and Azure are one route, but for media businesses the most flexible and cost-effective route can be a hybrid of a private facility—possibly an on-premise data center—and the public cloud services.
Much depends on where the best place is to store the large media files, and where the users are located. It could be all in one building, the classic post house. However, many productions want near-set facilities, or they may be farming out tasks like VFX to specialist shops. The collaborative approach may involve sharing media files across several sites. Some may require original camera raw files, others may only need compressed copies for review and approval.
A spreadsheet can be used to figure the optimum solution, balancing bandwidth availability and cost, plus processing costs for the different options.
Not all of this is new. Premium motion picture production has hauled files around the world for a decade or more. What is more recent is the possibility of using cloud processing rather than returning the files to the production's primary post house for editing.
No-one jumps into this without asking one question: "Are my files going to be secure?" It is a very good question, as many past security breaches have shown. No sector is immune from hacks, but industries like banking have demonstrated that very high levels of security can be achieved with cloud computing. There is nothing as insecure as an edit workstation with a USB port or DVD burner.
Many vendors in the media technology sector now have cloud-native offerings of their applications, giving customers the flexibility to operate in classic mode on a workstation through to virtualized operation across a hybrid cloud/on-premise service.
Whether a motion picture production, wanting to spread post around locations and specialist FX houses, or a television co-production looking to collaborate across two or three countries, the virtualization of post-production applications gives producers a new flexibility as to how a complex production is managed.
Everything is in place, the applications and the services, and already some forward-looking post houses are offering post-production in the cloud. One example is DigitalFilm Tree (DFT), based in Los Angeles. Their ProStack platform allows productions, post vendors, and marketing departments secure access to any file they need. Through the use of the cloud, they can offer clients real-time collaboration with third parties, on any device, from anywhere in the world.
White Paper: The Power of Virtualization
In this White Paper sponsored by Avid, Guillaume Aubuchon, CIO of DigitalFilm Tree, gives an overview of "The Power of Virtualization". Guillaume describes how DFT has centralized computing resources and storage. DFT now runs multiple instances of Media Composer on multiple high-performance servers in a pooled processing environment. The paper looks at the advantages, as well as scalability and security.