# The Potential Impact Of Quantum Computing

Quantum computing is still a developmental technology, but it has the potential to transform almost everything we currently assume about what computers can and can't do. When it hits the mainstream, what will it do to broadcast?

Legislation banned cigarette advertising on US radio and TV in 1970. The last TV cigarette commercial aired at 11:50 pm EST on January 1, 1971, 10 minutes before the ban became law. The ban made the broadcast industry's so-called 'license to print money' business model obsolete, and its game-changing impact on broadcasting was more powerful than that of any new technology. In fact, lower ad revenues positioned relatively cheap ¾” U-matic VCRs to replace expensive news film, quad machines and 2” tape in TV stations.

For over half of broadcast TV's history, vacuum tubes dominated electronic technology in stations and homes. Who remembers waiting for the TV to warm up? Until DTV, broadcast technology improvements were typically incremental. Everyone is interested in the latest new tech at NAB Shows, but nearly every successful new technology ever demonstrated at an NAB Show can now be done on a modern PC, tablet or smartphone for a relatively tiny investment.

One example is the Ampex Digital Optics (ADO) digital video effects (DVE) device, a huge industry hit that sold for $250,000 per channel in 1981. If you didn't have an ADO, you weren't in the post-production game. In 2002, I threw one in a dumpster. In 2023, nearly all new consumer computers and tablets can do more, and do it better, for about 1% of the original ADO price. 'Wow' has turned into 'what's the app for that?'

#### Generation Loss

Before lossless digital video recording and digital copying, analog video 'generation loss' was the most frustrating production and post obstacle in the industry because it restricted flexibility and creativity. Even the most expensive 2” quad machines introduced noise and other processing artifacts when their analog video output was recorded on another quad machine. Generation artifacts in a third-generation analog 2” quad dub are difficult to ignore. Less expensive analog VCR formats like U-matic and VHS generated so many artifacts when copied that copies beyond about the third generation were barely viewable.

Uncompressed digital video can be digitally copied again and again without generation loss, providing a new creative advantage for modern producers and video editors. However, multi-generation digital video can sometimes still be 'lossy.' Artifacts can occur with compression and/or conversion for IP encoding and between digital formats. The better the compression system and the higher the bitrate, the less likely the video will appear lossy when copied or viewed. As more facilities move to full IP, H.265 and NDI, lossy compression will become moot.

#### The Arrival Of Quantum Computers

Three regular NAB exhibitors, IBM, Google and Microsoft, are at various stages of quantum computer hardware and software development and production. Each is taking a different approach to quantum computing. The IBM Osprey is the largest quantum computer at 433 qubits. It uses a superconducting qubit design called a 'transmon,' made to reduce charge noise. Google Quantum AI uses Google's Sycamore chip, also based on superconducting qubits. Both share the gold chandelier form factor and must be kept at temperatures of about 15 millikelvin.

Microsoft Azure Quantum brings quantum computing to the cloud at three levels using quantum computers from multiple manufacturers. Microsoft's own hardware research pursues topological qubits, based on exotic states of matter called 'anyons,' for greater error resistance.

Quantum computers solve problems classical computers cannot, such as finding structure in vast amounts of data. Broadcasters are gathering such data in almost every area of capture, production and delivery. Who will be the first to show quantum computing at an NAB Show, and what will it do?

The cost of a quantum computer depends on its number of qubits. Most with two- or three-digit qubit counts cost millions and include installation and a full-service contract. SpinQ recently introduced a $5,000 portable quantum computer with two qubits, capable of operating at room temperature. Quantum computers aren't the next generation of supercomputers; they're something entirely different because they rely on subatomic quantum mechanics, just as nature does. Ones and zeros exist only in systems designed by man. Quantum systems change the rules of probability.

The building blocks of classical computers are bits. A classical bit is a binary switch that can only be set to zero or one. A classical bit can only be in one state at a time, so data is processed serially. A quantum computer processes data in parallel, rendering multiple solutions simultaneously. The building blocks of quantum computers are qubits. A qubit can be visualized as an arrow pointing into 3D space. If the arrow points straight up, its value is zero, also known as the ground state. If it points straight down, its value is one, also known as the excited state.

In the quantum world, the arrow is in a 'superposition' state when it points in any direction other than up or down. The answer will still be a zero or a one, but which one depends on a probability set by the direction of the arrow.

If the arrow points more upward, the output is more likely to be zero than one; more downward, more likely to be one than zero. If it points horizontally, exactly at the equator, the probability is 50-50.
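The arrow-direction rule above can be sketched in a few lines of Python. This is a minimal illustration, not a simulation of real hardware: it assumes the standard Bloch-sphere convention, where the probability of reading zero is cos²(θ/2) for an arrow tilted θ radians from straight up.

```python
import math

def measurement_probabilities(theta):
    """Probability of reading 0 or 1 from a qubit whose 'arrow'
    is tilted theta radians away from straight up.
    theta = 0    -> arrow up      -> always 0
    theta = pi   -> arrow down    -> always 1
    theta = pi/2 -> at the equator -> 50-50
    """
    p_zero = math.cos(theta / 2) ** 2
    return p_zero, 1.0 - p_zero

# Arrow straight up: certain zero.
print(measurement_probabilities(0))           # (1.0, 0.0)
# Arrow at the equator: a fair coin flip, roughly (0.5, 0.5).
print(measurement_probabilities(math.pi / 2))
```

Tilting the arrow anywhere between the poles smoothly trades probability between the two outcomes, which is exactly the 'more upward means more likely zero' behavior described above.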

In a classical computer the bits are independent of each other. In quantum computers the qubits can become 'entangled' with each other, meaning they become part of one large quantum state together. When entangled, independent probabilities no longer apply; what matters is the joint probability distribution. Two entangled qubits can produce four possible states: 00, 01, 10 or 11. If the direction arrow of either qubit changes, the probability distribution of the entire system changes. The qubits are no longer independent; they are all part of the same large state.
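A short sketch makes the joint-distribution idea concrete. Assuming the textbook convention that a two-qubit state is four amplitudes (one per outcome 00, 01, 10, 11) and each outcome's probability is its squared amplitude, the classic 'Bell state' shows entanglement at work:

```python
import math

# Two entangled qubits: four amplitudes, one per outcome 00, 01, 10, 11.
# This is the Bell state (|00> + |11>) / sqrt(2).
inv_sqrt2 = 1 / math.sqrt(2)
bell = [inv_sqrt2, 0.0, 0.0, inv_sqrt2]

# The probability of each outcome is the squared amplitude.
probs = {format(i, "02b"): amp ** 2 for i, amp in enumerate(bell)}
print(probs)  # 00 and 11 each come up half the time; 01 and 10 never do
```

No pair of independent coin flips can produce this distribution: the two qubits always agree, so measuring one instantly tells you the other, which is why their individual probabilities stop being meaningful.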

Scientifically, a qubit is described as a 'quantum wavefunction.' When qubits are entangled, their quantum wavefunctions are added together to create an overall wavefunction. The state of a qubit is controlled by microwave pulses of a particular frequency and duration. The right pulse puts a particular qubit into a dynamic superposition.

Adding the wavefunctions creates 'interference,' because some waves become stronger when added together (known as constructive interference) while other waves cancel each other out (known as destructive interference). Quantum computers use constructive interference to increase the probabilities of correct answers, and destructive interference to decrease the probabilities of incorrect answers. This work is done with quantum algorithms, which is the territory of physicists and computer scientists, and again beyond the scope of this story.
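The simplest demonstration of constructive and destructive interference is applying the same gate, the Hadamard, twice in a row. The sketch below hand-codes that one gate with plain arithmetic (an illustration, not a real quantum library):

```python
import math

H = 1 / math.sqrt(2)  # Hadamard scaling factor

def hadamard(amp0, amp1):
    """One Hadamard gate: mixes a qubit's two amplitudes."""
    return H * (amp0 + amp1), H * (amp0 - amp1)

# Start in state |0>: amplitudes (1, 0).
a0, a1 = hadamard(1.0, 0.0)  # equal superposition: about (0.707, 0.707)
# Apply it again: the two paths to |1> cancel (destructive interference)
# while the two paths to |0> reinforce (constructive interference).
b0, b1 = hadamard(a0, a1)
print(round(b0, 6), round(b1, 6))  # back to 1.0 and 0.0
```

Quantum algorithms choreograph many such cancellations at once so that wrong answers interfere away and the right answer is left standing.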

One qubit has a probability distribution over two states. Two qubits can have four states. Three qubits can have eight states, and the number continues doubling each time another qubit is added. A computer with N qubits can be in 2^N states.

Suffice it to say, a qubit (quantum bit) is the basic unit of quantum information. Qubits reduce the number of steps it takes to complete a computation. A 10-qubit quantum computer can store 2^10 (1,024) values in parallel. An equivalent modern PC performing similar tasks in series would take weeks. The PC 'equivalent' of a quantum system with 500 qubits would need more bits than there are atoms in the universe. Quantum computing is still in the experimental stage. Experts predict it is two to ten years away from being fully developed.
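The doubling described above is easy to verify, and it also shows why classical machines can't keep up. A small sketch of the 2^N growth:

```python
# Describing an n-qubit register classically takes 2**n amplitudes,
# so the classical memory cost doubles with every added qubit.
for n in (1, 2, 3, 10, 40):
    print(n, "qubits ->", 2 ** n, "amplitudes")

# At just 40 qubits, storing each amplitude as a 16-byte complex
# number would already need 2**40 * 16 bytes, i.e. 16 TiB of RAM.
```

By 500 qubits, 2^500 amplitudes exceeds any conceivable classical storage, which is the source of the 'more bits than atoms in the universe' comparison.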

Michio Kaku recently said, “Silicon Valley could become a rust belt unless they get on the bandwagon.” He predicts that Moore's Law and digital computers will soon hit their limits, and that we will need to move to the next level: the atom. When AI and ML start working at the quantum level, they could break encryption as we know it today and help unravel the secrets of the universe.

The problem with quantum computing is that its parallel processing power could break the most sophisticated encryption systems used for finance, banking and government work in mere moments. RSA encryption is the foundation of most public key cryptography in use today. It relies on the prime factors of huge numbers, and it is so hard to break that even the best-known classical algorithm, the General Number Field Sieve, would take 16 million years to crack it. A quantum computer could crack it in days, or even hours.
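To see why factoring protects RSA, consider the most naive classical attack, trial division. The toy sketch below factors a deliberately tiny RSA-style modulus; the point is that the work grows explosively with the size of the number, while Shor's quantum algorithm would find the factors in polynomial time:

```python
def trial_factor(n):
    """Naive classical factoring by trial division.
    Fine for tiny numbers; hopeless for the ~2048-bit
    moduli real RSA keys use.
    """
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f  # found the two prime factors
        f += 1
    return None  # n is prime

# A toy modulus: 3233 = 53 * 61. Real RSA moduli have ~600 digits.
print(trial_factor(3233))  # (53, 61)
```

Anyone who recovers the two primes can reconstruct the private key, which is why a machine that factors large numbers quickly breaks RSA outright.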

That’s not the kind of power to be in mistrustful hands. Scientists are working on post-quantum cryptography that can withstand quantum attacks. It's a dilemma, because quantum computers are only going to become more powerful.
