Virtualization - Part 1

As progress marches us resolutely onwards to a future broadcast infrastructure that will almost certainly include a lot more software running on cloud-based infrastructure, this seems like a good moment to consider the nature of virtualization.


If you want to understand virtualization in a fundamental way, you don’t have to look further than the room (or garden, or orbiting space station) that you’re currently in. It might feel pretty solid to you, a sensation reinforced by pretty much everything (legal!) that you’ve done since you were born. Most people live their entire lives without questioning their understanding of reality. But what does that mean? At the very least, you can probably identify a few pragmatic rules. For example, solid means solid: you can’t pass your hand through a tabletop, just as you can’t walk through walls. We live according to a vast collection of cumulatively reinforced empirical laws of behavior, and we wouldn’t be able to play pool, drive a car or even lie peacefully on a sofa if these rules were even slightly flaky. Can you imagine being able to relax if, one time in every hundred, you fell through the sofa into an unfathomable void?

Everything described above is a perfectly valid and consistent type of reality. But what does it even mean to admit the possibility that there are other types of reality of which we’re seemingly completely unaware?

Virtualization isn’t something relatively new, swept in by modern technology. Instead, everything we do and everything we know is courtesy of some kind of virtualization. The concept of virtualization shouldn’t seem strange to us - in a sense, it’s simply more of what we’re already used to.

Donald Hoffman compared the way we see reality to an operating system’s desktop, where the familiar-looking objects on the surface are analogies for the complex and opaque processes that lie underneath. It makes sense for computers, and it makes sense for us. We’re all familiar with the idea of atoms as tiny, fundamental constituents of matter. In fact, they’re not even particularly fundamental. We now know that atoms can be split and that they’re made up of smaller particles: protons, neutrons, electrons and so on. And some of these can be split further into a kaleidoscope of weirdly named constituents so abstract it would almost beggar belief, were it not for strong experimental proof.

There’s absolutely no way that our brains can cope with all of that detail, and nor do they have to, because we have evolved a way to see the world through what are essentially proxies for reality. So, when I see a table, I don’t see a swirling sea of sub-atomic particles; instead, I perceive a wooden surface standing on metal legs. And I know - despite our more recently acquired knowledge that the table is, strictly speaking, mostly empty space - that my coffee cup won’t pass through it when I put it down after taking a sip.

Turning back to computers, the desktop analogy is so effective that it has opened up computing to the vast majority of the population that doesn’t write machine code. Even without a graphical user interface, operating systems like MS-DOS and Linux/UNIX are still massive abstractions from the hardware beneath. Today’s computers (even Apple’s Vision Pro) almost universally present a desktop paradigm in their UI.

This illustrates an incredibly important principle: virtualization (and the GUI is a prime example) can have a catalytic effect on usability. Before the GUI, computers were usable only by experts. If they had remained that inaccessible, they would still be niche, specialist machines, and that would have ruled out the vast majority of today’s computer users, who are not specialists (nor should they have to be). As a result of today’s profoundly accessible computers, practically every field of research and wider endeavor is enhanced by near-universal access to computing.

To see the proof of that, look no further than email. If you’re using Gmail or anything like it, then you are confidently using something that is massively virtualized. You might see what’s labelled an “inbox” on your computer, but what that corresponds to in the physical world is a few bytes stored on some kind of massive virtualized storage system somewhere - who knows where? - in the world. To the user, the details don’t matter because it just works and it’s easy to use.

There’s a pattern here that’s far from obvious (that’s the nature of virtualization!): virtualization occurs at all levels in the technology stack. It can be a programming language like C relative to machine code or an operating system written in C that gives users a graphical interface. And today, you can get into an electric car, speak your destination to it, and it will take you there. That’s, essentially, a virtual journey: you don’t need to understand the first thing about driving, how a car works, dynamics or the law as it applies to drivers.
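
To make that layering concrete, here is a deliberately trivial sketch (not from the original article) of how much a single high-level instruction hides. The file name is arbitrary; the point is the stack of machinery sitting beneath one line of Python.

```python
# One high-level call; every layer beneath it is handled for us.
with open("note.txt", "w") as f:        # arbitrary file name, purely illustrative
    f.write("Virtualization is layers all the way down.\n")

# Roughly what this one line sits on top of:
#   Python file object   ->  buffered I/O in the interpreter's runtime
#   Runtime              ->  open()/write() system calls
#   Kernel VFS           ->  filesystem driver
#   Filesystem           ->  block device and drive controller firmware
```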

So, wherever you look, you can find examples of virtualization at any level. Before we go on, it’s worth drawing the distinction between virtualization and abstraction. There’s actually very little difference, except, perhaps, that abstraction is a more general term. Specifically (although the distinction is a soft one), abstraction tends to hide complexity, while virtualization does the same thing while also making the abstracted view look like something else that is familiar. In practice, the two terms are so interwoven that they are largely interchangeable, and I’m going to treat them as such, except when we’re looking at specific applications.

Virtual machines (VMs) are typically entire, configured computers that are abstracted from specific hardware but behave in every other way like a real computer. This is quite tricky to achieve, and the companies that make virtual machine software do it extremely well. Here are some of the advantages:

Run Multiple Machines On A Single Computer

Running a virtual machine will always require some computing power (“compute”) just to enable the processes needed to support it, and “supporting” effectively means providing an emulation layer that makes the virtual machine think it’s a real computer. There’s also a hypervisor, which switches between multiple virtual machines in a way that makes sense to the actual, physical computer. But if the virtual instances aren’t too demanding, there’s no reason why you shouldn’t run several of them on a single physical computer. This is a huge advantage, because it breaks the hard link between the number of computer instances and the number of physical devices. Is it a panacea? No. You can’t expect miracles, especially where heavy-duty processing or lots of real-time I/O is called for.
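
As a rough, hedged illustration of the point above (and not something from the original article), this is how running several guests on one physical host might look from a small management script, assuming the libvirt Python bindings are installed and a local QEMU/KVM hypervisor is in use. The connection URI and the domain name “playout-vm-02” are hypothetical.

```python
# A minimal sketch using the libvirt Python bindings (libvirt-python).
# The URI and domain name are assumptions for illustration only.
import libvirt

conn = libvirt.open("qemu:///system")            # connect to the local hypervisor
try:
    # One physical host, several guest "computers" defined on it.
    for dom in conn.listAllDomains():
        state = "running" if dom.isActive() else "shut off"
        print(f"{dom.name():24s} {state}")

    # Start another guest alongside whatever is already running,
    # provided the host has CPU and memory to spare.
    guest = conn.lookupByName("playout-vm-02")   # hypothetical domain name
    if not guest.isActive():
        guest.create()                           # boots the VM
finally:
    conn.close()
```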

Easily Replace A VM That’s Failed

A VM can fail for any number of reasons, but in a high-pressure environment, instead of stopping production to figure out what went wrong (a new software update, for example), it is quicker simply to shut down the troublesome instance and start a replacement that was configured and stored for exactly this purpose before the problem hit. Remember - there’s no change to the physical environment.
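
A sketch of that “swap it out rather than debug it live” pattern, under the same libvirt assumptions as above; the domain names “ingest-vm-01” and “ingest-vm-01-standby” are hypothetical.

```python
# Shut down the misbehaving VM and bring up a pre-configured replacement.
# Nothing about the physical host changes.
import libvirt

conn = libvirt.open("qemu:///system")
try:
    broken = conn.lookupByName("ingest-vm-01")            # hypothetical: the failing instance
    if broken.isActive():
        broken.destroy()                                  # hard power-off of the VM only

    standby = conn.lookupByName("ingest-vm-01-standby")   # configured in advance for this moment
    if not standby.isActive():
        standby.create()                                  # bring the replacement up
finally:
    conn.close()
```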

Demo An Entire Set-up Without Having To Physically Configure A Computer

This is a niche use case, but it nicely illustrates the advantages of a VM. Companies that sell complex computer software or services can offer their customers a “canned” version of multiple set-ups. There’s no need to waste time configuring the demonstration at the customer’s premises: it’s already set up on one or more virtual machines. It’s quick, slick, and a godsend for weary tech salespeople.

The concept of virtualization has much more to it, and we’ll examine more examples in the next article.
