More AI Acronyms & Buzzwords Explained

So many buzzwords & acronyms and so little time! I thought this was a better title than Understanding the Differences Part 2.

In my first article, Understanding the Differences, I took a broad-stroke look at the high-level basics of AI, Machine Learning, Deep Learning and Cognitive Computing.

Now that we will be hearing a lot about Qubits (Quantum Computing), I thought it was worth digging a little deeper into what is, and is not, considered Artificial Intelligence. To be clear, Quantum Computing has NOTHING to do with artificial intelligence other than providing faster processing power, which ultimately helps the algorithms handle the data loads better.

There is Artificial Narrow Intelligence (ANI), Artificial General Intelligence (AGI), Artificial Neural Network (ANN) and Convolutional Neural Network (CNN) - not to be confused with CNN/Turner.

I will leave Artificial General Intelligence for last as it's currently the unobtainable, or "real", AI.

Artificial Narrow Intelligence (ANI) - is the most commonly referenced AI and is trained to perform a specific task or understand a specific area of knowledge (e.g. oncology, or a game such as Jeopardy). Each instance of ANI is dedicated and is unable to learn a different task or area of knowledge on its own. ANI is designed to constantly seek patterns in data, learn from experience and then select an appropriate response for a query or operation.

Machine Learning (ML) - is based on algorithms that parse data, learn from that data and make informed decisions based on what they have learned. Basic machine learning functions improve progressively, but still rely on human guidance. When an ML algorithm returns an inaccurate prediction, an engineer needs to step in and make adjustments. Machine learning involves a lot of programming and training data to support operational functions performed by camera operators, archivists, editors, master control technicians and broadcast engineers.
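To make "learning from data" concrete, here is a minimal sketch of the idea, not any particular product's implementation: the program is given example inputs and outputs, and repeatedly adjusts two parameters to reduce its prediction error (gradient descent on a simple line fit). The data values are made up for illustration.

```python
# Minimal machine learning sketch: fit y = w*x + b to example data.
# "Learning" here means repeatedly nudging w and b to reduce the error
# between the model's predictions and the known answers.

data = [(1, 3), (2, 5), (3, 7), (4, 9)]  # samples of y = 2x + 1

w, b = 0.0, 0.0  # initial guesses
lr = 0.01        # learning rate: how big each adjustment is

for _ in range(5000):
    # Gradient of the mean squared error with respect to w and b
    grad_w = sum(2 * (w * x + b - y) * x for x, y in data) / len(data)
    grad_b = sum(2 * (w * x + b - y) for x, y in data) / len(data)
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 2), round(b, 2))  # learned parameters approach 2 and 1
```

The "engineer steps in" part of ML maps onto choices like the learning rate and the number of iterations: pick them badly and the algorithm's predictions stay inaccurate until a human adjusts them.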

Deep Learning (DL) - the machine's algorithms can determine independently whether a prediction is accurate or not. DL structures its algorithms in layers to create an Artificial Neural Network (ANN) that can learn and make intelligent decisions on its own. ANN design attempts to model the biological neural network of the human brain. Unlike human-coded ML, DL machines continually learn to master complex and abstract aspects of a technical or creative process.

Deep learning algorithms continually analyze data, applying a logic structure similar to how a human would draw conclusions. Based on a constant flow of human queries, DL agents can generate accurate predictions nearly instantaneously. Machine learning algorithms, by contrast, perform a function with the data given to them and improve progressively.
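The "layers" idea is easy to show in code. This is a toy forward pass through a two-layer network with made-up weights, just to illustrate the structure: each neuron computes a weighted sum of its inputs plus a bias, squashed through an activation function, and each layer's outputs become the next layer's inputs.

```python
import math

def sigmoid(z):
    # Activation function: squashes any number into the range (0, 1)
    return 1 / (1 + math.exp(-z))

def layer(inputs, weights, biases):
    # Each neuron: weighted sum of all inputs, plus a bias, through sigmoid
    return [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

# A tiny network: 2 inputs -> 3 hidden neurons -> 1 output.
# These weights are arbitrary; training would adjust them.
hidden_w = [[0.5, -0.6], [0.1, 0.8], [-0.3, 0.2]]
hidden_b = [0.0, 0.1, -0.1]
out_w = [[1.2, -0.7, 0.4]]
out_b = [0.05]

x = [0.9, 0.2]                    # one input sample
hidden = layer(x, hidden_w, hidden_b)  # first layer
output = layer(hidden, out_w, out_b)   # second layer
print(output)
```

A real deep network stacks many such layers and, crucially, adjusts all the weights itself via backpropagation rather than having them hand-set as here.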

Cognitive Computing - is taking machine & deep learning to a higher plane. While machine learning is based on "trial and error", it does not apply "reasoning" to the data it creates. Cognitive computing applies a level of reasoning to the data and errors processed from machine learning. It looks for patterns in data and then compares those patterns to other patterns in a form of reasoning. Cognitive processes use existing knowledge and generate new knowledge.

Artificial General Intelligence (AGI) - is the “holy grail” of AI. It is the ability of a computer system to mimic human intuition, deductive reasoning and contextual understanding. We are still quite a way from that capability.

There are a few other terms like neural network or Artificial Neural Network (ANN), which is not the same as Convolutional Neural Network (CNN). An ANN is a computing environment mimicking the way our brain's synapses interact with each other as we process information and multi-task. A CNN applies neural networking to how visual information/data is processed.
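The operation that gives a CNN its name is the convolution: a small grid of weights (a kernel) slides across the image, and each output pixel is a weighted sum of the pixels beneath it. A toy sketch, using a hypothetical 3×5 image with a vertical edge in it:

```python
# Toy 2D convolution - the core operation a CNN applies to visual data.
# A small kernel slides over the image; each output value is a weighted
# sum of the pixels under the kernel.

image = [  # 3x5 image: dark on the left, bright on the right
    [0, 0, 0, 1, 1],
    [0, 0, 0, 1, 1],
    [0, 0, 0, 1, 1],
]

kernel = [  # a classic vertical-edge-detection kernel
    [1, 0, -1],
    [1, 0, -1],
    [1, 0, -1],
]

def convolve(img, k):
    kh, kw = len(k), len(k[0])
    out = []
    for i in range(len(img) - kh + 1):
        row = []
        for j in range(len(img[0]) - kw + 1):
            row.append(sum(k[a][b] * img[i + a][j + b]
                           for a in range(kh) for b in range(kw)))
        out.append(row)
    return out

print(convolve(image, kernel))  # -> [[0, -3, -3]]
```

The flat region produces 0 and the windows spanning the edge produce a strong response; a CNN learns many such kernels automatically, rather than using hand-chosen ones as here.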

An algorithm is a computer formula - a step-by-step procedure - that may be used to write AI code; however, not all algorithms are AI.

When discussing AI, you will hear the word Bayesian used quite a bit in reference to which AI technique is being used. Bayes' theorem is named after Reverend Thomas Bayes (1701–1761), who first used conditional probability to calculate limits on an unknown parameter. His work was published posthumously in 1763. Pretty cool.
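Bayes' theorem itself is a one-line formula: P(A|B) = P(B|A) × P(A) / P(B). A worked example with hypothetical numbers shows why it matters to AI systems that must update beliefs from evidence:

```python
# Bayes' theorem applied to a diagnostic-style question (numbers are
# hypothetical): a condition affects 1% of a population; a test detects
# it 95% of the time but also false-alarms on 5% of healthy cases.
# Given a positive result, how likely is the condition?

p_condition = 0.01          # prior: P(condition)
p_pos_given_cond = 0.95     # likelihood: P(positive | condition)
p_pos_given_no_cond = 0.05  # P(positive | no condition)

# Total probability of a positive result, over both possibilities
p_pos = (p_pos_given_cond * p_condition
         + p_pos_given_no_cond * (1 - p_condition))

# Posterior: P(condition | positive) via Bayes' theorem
p_cond_given_pos = p_pos_given_cond * p_condition / p_pos
print(round(p_cond_given_pos, 3))  # -> 0.161
```

Despite the "95% accurate" test, a positive result only implies about a 16% chance of the condition, because the condition is rare to begin with. Bayesian AI techniques apply exactly this kind of prior-plus-evidence update, just at much larger scale.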

Another AI technique is the inference engine. Typically, the inference engine is part of an expert system, coupled with a knowledge base. The inference engine applies logical rules to the knowledge base to deduce new information.
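A minimal sketch of that idea, with made-up rules and facts: the engine repeatedly applies "if these premises hold, conclude this" rules to the knowledge base until no new facts can be deduced (forward chaining).

```python
# Minimal forward-chaining inference engine sketch.
# Each rule is (set of premises, conclusion); the engine keeps applying
# rules to the fact base until nothing new can be deduced.

rules = [
    ({"has_feathers"}, "is_bird"),
    ({"is_bird", "can_fly"}, "can_migrate"),
]

facts = {"has_feathers", "can_fly"}  # the knowledge base

changed = True
while changed:
    changed = False
    for premises, conclusion in rules:
        # Fire the rule if all premises are known and the conclusion is new
        if premises <= facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print(sorted(facts))
```

Note the chaining: "is_bird" is deduced first, which then enables the second rule to deduce "can_migrate" - new information that was in neither the original facts nor any single rule.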

Then there is Knowledge Representation and Reasoning. Knowledge representation incorporates concepts about how humans solve problems, encoding that knowledge in algorithms so that complex applications become easier to design and build. The reasoning side applies logic to automate various kinds of reasoning, such as the application of rules and relationships to data sets.

These are the most common terms used when discussing artificial intelligence as the science it is. Asking Alexa or Google to play a song is not artificial intelligence. It's clever speech-to-text and then a search engine to find it. So many of the products and services claiming to use artificial intelligence are not. Automating processes is not AI. Having your refrigerator order groceries is not AI!
