So many buzzwords & acronyms and so little time! I thought this was a better title than Understanding the Differences Part 2.
In my first article, Understanding the Differences, I took a broad-stroke look at the high-level basics of AI, Machine Learning, Deep Learning and Cognitive Computing.
Now that we will be hearing a lot about Qubits (Quantum Computing), I thought it worth digging a little deeper into what is considered or discussed as Artificial Intelligence AND what is not. To be clear, Quantum Computing has NOTHING to do with artificial intelligence other than providing faster processing power, which ultimately helps the algorithms handle the data loads better.
There is Artificial Narrow Intelligence (ANI), Artificial General Intelligence (AGI), Artificial Neural Network (ANN) and Convolutional Neural Network (CNN) - not to be confused with CNN/Turner.
I will leave Artificial General Intelligence for last, as it's currently the unattainable or "real" AI.
Artificial Narrow Intelligence (ANI) - is the most commonly referenced AI and is trained to perform a specific task or understand a specific area of knowledge (oncology, or a game such as Jeopardy). Each instance of ANI is dedicated and is unable to learn a different task or area of knowledge on its own. ANI is designed to constantly seek patterns in data, learn from experience and then select an appropriate response to a query or operation.
Machine Learning (ML) - is based on algorithms that parse data, learn from that data and can make informed decisions based on what they have learned. Basic machine learning functions improve progressively, but still rely on human guidance. When an ML algorithm returns an inaccurate prediction, an engineer needs to step in and make adjustments. Machine learning involves a lot of programming and training data to support operational functions performed by camera operators, archivists, editors, master control technicians and broadcast engineers.
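To make "learning from data" concrete, here is a minimal sketch of the idea using ordinary least-squares regression, one of the simplest ML techniques. The data is hypothetical (made-up transmitter power vs. signal readings), but the pattern is the one described above: the algorithm is given examples, fits itself to them, and then makes an informed prediction on an input it has never seen.

```python
import numpy as np

# Hypothetical training data: input value vs. measured output.
# This is the "experience" the model learns from (roughly y = 2x).
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([2.1, 4.0, 6.2, 7.9, 10.1])

# Fit a line by ordinary least squares: the algorithm "learns" the
# slope and intercept that best explain the examples it was given.
A = np.hstack([X, np.ones_like(X)])  # add an intercept column
(slope, intercept), *_ = np.linalg.lstsq(A, y, rcond=None)

# An "informed decision" on an unseen input.
prediction = slope * 6.0 + intercept
print(round(slope, 2), round(prediction, 2))  # → 1.99 12.03
```

If the predictions came back inaccurate, a human would adjust the model or the data, exactly the kind of engineer-in-the-loop correction described above.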
Deep Learning (DL) - the machine's algorithms can determine independently whether a prediction is accurate or not. DL structures its algorithms in layers to create an Artificial Neural Network (ANN) that can learn and make intelligent decisions on its own. ANN design attempts to model the biological neural network of the human brain. Unlike human-coded ML, DL machines continually learn to master complex and abstract aspects of a technical or creative process.
Deep learning algorithms continually analyze data, applying a logic structure similar to how a human would draw conclusions. Based on a constant flow of human queries, DL agents can generate accurate predictions nearly instantaneously. Machine learning algorithms, by contrast, perform a function with the data given to them and improve progressively.
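The "layers" and self-correction described above can be sketched in a few lines. This is a toy two-layer neural network (the smallest structure that counts as layered) trained by backpropagation on the classic XOR problem; the sizes, seed and learning rate are arbitrary choices for illustration, not any particular product's design. The key point is the last comparison: the network measures its own error and reduces it without a human adjusting anything.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR, a pattern no single-layer model can capture.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 4 units -- a minimal layered ("deep") structure.
W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    h = sigmoid(X @ W1 + b1)    # hidden layer
    out = sigmoid(h @ W2 + b2)  # output layer
    return h, out

_, out0 = forward(X)
initial_loss = np.mean((out0 - y) ** 2)

lr = 1.0
for _ in range(5000):
    h, out = forward(X)
    # Backpropagation: the network judges its own error and adjusts.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out);  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);    b1 -= lr * d_h.sum(axis=0)

_, out_final = forward(X)
final_loss = np.mean((out_final - y) ** 2)
print(final_loss < initial_loss)  # the network improved on its own
```

Real deep learning systems stack many more layers and train on far more data, but the loop above — predict, measure error, adjust — is the same mechanism.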
Cognitive Computing - is taking machine and deep learning to a higher plane. While machine learning is based on "trial and error", it does not apply "reasoning" to the data it creates. Cognitive computing applies a level of reasoning to the data and errors processed by machine learning. It looks for patterns in data and then compares those patterns to other patterns in a form of reasoning. Cognitive processes use existing knowledge and generate new knowledge.
Artificial General Intelligence (AGI) - is the “holy grail” of AI. It is the ability of a computer system to mimic human intuition, deductive reasoning and contextual understanding. We are still quite a way from that capability.
There are a few other terms, like neural network or Artificial Neural Network (ANN), which is not the same as a Convolutional Neural Network (CNN). An ANN is a computer environment mimicking the way our brain synapses interact with each other as we process information and multi-task. A CNN applies neural networking to how visual information/data is processed.
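The operation that gives a CNN its name, convolution, is easy to show in isolation. The sketch below slides a small filter across a tiny made-up "image" and produces a feature map that lights up where a vertical edge appears. In a real CNN the filter values are learned from training data; here they are hand-picked purely for illustration.

```python
import numpy as np

# A tiny grayscale "image": a bright vertical bar on a dark background.
image = np.zeros((5, 5))
image[:, 2] = 1.0

# A hand-picked horizontal-gradient kernel; a real CNN would learn
# filters like this from example images rather than being given them.
kernel = np.array([[-1.0, 0.0, 1.0]] * 3)

def convolve2d(img, k):
    """Slide the kernel over the image and sum the overlaps."""
    kh, kw = k.shape
    oh, ow = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * k)
    return out

feature_map = convolve2d(image, kernel)
print(feature_map)  # strong responses on either side of the bar
```

Stacking many learned filters in layers, each feeding the next, is what turns this simple operation into a network that can recognize faces, logos or scene content.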
An algorithm is a step-by-step computational procedure that may be used to write AI code; however, not all algorithms are AI.
When discussing AI, you will hear the word Bayesian used quite a bit in reference to which AI technique is being used. Bayes' theorem is named after Reverend Thomas Bayes (1701–1761), who first used conditional probability to create an algorithm for calculating limits on an unknown parameter. His work was published posthumously in 1763. Pretty cool.
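Bayes' theorem itself fits in one line: P(H|E) = P(E|H) x P(H) / P(E). Here is a worked example with made-up numbers — a condition affecting 1% of a population and a test with a 99% true-positive rate and a 5% false-positive rate — showing how a prior belief is updated by new evidence.

```python
# Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E).
# All numbers below are hypothetical, chosen only to illustrate.
p_h = 0.01              # prior: 1% of people have the condition
p_e_given_h = 0.99      # P(positive test | condition)
p_e_given_not_h = 0.05  # P(positive test | no condition)

# Total probability of seeing a positive test at all.
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

# Posterior: belief in the condition AFTER a positive test.
posterior = p_e_given_h * p_h / p_e
print(round(posterior, 3))  # → 0.167
```

Note the counterintuitive result: even with a 99%-accurate test, a positive result only raises the probability to about 17%, because the condition is rare to begin with. That kind of principled belief-updating is what Bayesian AI techniques exploit.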
Another AI technique is the inference engine. Typically, the inference engine is part of an expert system, coupled with a knowledge base. The inference engine applies logical rules to the knowledge base to deduce new information.
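A minimal forward-chaining inference engine can be written in a few lines. The rules and facts below are invented for illustration, but the mechanism is the real one: keep applying rules to the knowledge base until no new facts can be deduced.

```python
# Each rule: (set of required facts, fact to conclude).
# The rule names and facts here are made up for illustration.
rules = [
    ({"has_feathers", "lays_eggs"}, "is_bird"),
    ({"is_bird", "cannot_fly"}, "is_flightless_bird"),
]

def forward_chain(facts, rules):
    """Apply rules repeatedly until no new facts can be deduced."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)  # deduce a new fact
                changed = True
    return facts

knowledge_base = {"has_feathers", "lays_eggs", "cannot_fly"}
print(forward_chain(knowledge_base, rules))
```

Notice the chaining: the second rule cannot fire until the first one has added "is_bird" — new information deduced from the knowledge base enables further deductions.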
Then there is Knowledge Representation and Reasoning. Knowledge representation incorporates concepts about how humans solve problems and encodes that knowledge in algorithms, in order to design applications that make complex systems easier to design and build. Knowledge representation and reasoning applies logic to automate various kinds of reasoning, such as the application of rules and relationships to data sets.
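As a small sketch of "applying relationships to data sets": below, knowledge is represented as parent-of pairs, and reasoning derives the transitive ancestor relationship that was never stated explicitly. The names are hypothetical placeholders.

```python
# Knowledge represented as (parent, child) pairs; the names are
# hypothetical. Reasoning will derive the ancestor relation from them.
parent_of = {("alice", "bob"), ("bob", "carol"), ("carol", "dave")}

def ancestors(facts):
    """Close the relation under transitivity:
    if A->B and B->C are known, conclude A->C."""
    result = set(facts)
    changed = True
    while changed:
        changed = False
        new = {(a, d) for (a, b) in result for (c, d) in result
               if b == c and (a, d) not in result}
        if new:
            result |= new
            changed = True
    return result

derived = ancestors(parent_of)
print(("alice", "dave") in derived)  # → True: deduced, never stated
```

The representation (pairs in a set) was chosen so the reasoning step (transitive closure) becomes a mechanical operation — which is exactly the point of knowledge representation: pick a structure that makes the reasoning easy to automate.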
These are the most common terms used when discussing artificial intelligence as the science it is. Asking Alexa or Google to play a song is not artificial intelligence; it's clever speech-to-text followed by a search engine. So many of the products and services claiming to use artificial intelligence are not. Automating processes is not AI. Having your refrigerator order groceries is not AI!