Artificial Intelligence is more than a single technology. In this article, we describe the many parts the term AI encompasses.
The history of computers has been a continuum of using technology to perform human tasks faster, more efficiently, and more accurately. In some instances, computers can accomplish tasks or calculations that are beyond human capability in any reasonable time frame.
Until recently, Artificial Intelligence (AI) was the catch-all term for every advanced computing accomplishment. Now we delineate Machine Learning (ML), Deep Learning (DL), Cognitive Computing (CC) and AI. Many believe there is AI involved in our daily lives when the home assistant takes our voice commands to order cat food or start the robotic vacuum cleaner. Recommendation engines use semantic algorithms to suggest which brand and color of clothing or other object to buy based on behavior patterns – this is NOT AI.
Watson playing Jeopardy is not AI, and neither is the autonomous vehicle (which is part of the issue with that technology). Let’s take a moment to explore the different high-order computing processes that are being called AI. This is similar to all facial tissues being called Kleenex, or all photocopying being called Xeroxing.
First there’s Machine Learning (ML): the most basic form of advanced computer “reasoning”. Let’s remember the famous chess matches between Garry Kasparov and IBM’s Deep Blue. First the machine was taught – read: programmed with – every chess move on record. Then, as it continued to play, when it made a mistake it was told to remember the error and not do that again. If only children behaved that well.
Machine Learning (ML)
Machine learning evolved from the study of pattern recognition and computational learning theory. Gartner claimed that in 2016 machine learning was at its peak of inflated expectations. Creating effective machine learning is a challenge because discovering patterns is hard and depends on the amount of training data available. Machine learning is based on a consistent level of input and on determining whether each output is correct or incorrect, reducing the variables until the system can discover data or produce the solution to a problem.
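As a rough illustration of this trial-and-error loop – a hypothetical sketch, not taken from any particular product – here is a tiny perceptron that learns the logical AND function by nudging its weights whenever it answers incorrectly. The data set, learning rate, and epoch count are all illustrative choices:

```python
# Minimal error-driven machine learning: a perceptron learning logical AND.

def train_perceptron(samples, epochs=20, lr=0.1):
    """Adjust weights whenever a prediction is wrong (trial and error)."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:
            pred = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
            error = label - pred          # 0 when correct, +/-1 when wrong
            w[0] += lr * error * x1       # nudge weights toward the answer
            w[1] += lr * error * x2
            b += lr * error               # remember the mistake in the bias
    return w, b

def predict(w, b, x1, x2):
    return 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0

# "Training data": every input/output pair for AND, labeled correct/incorrect.
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_data)
```

After a handful of passes over the data, the learned weights classify every input correctly – exactly the "remember the error and don’t repeat it" behavior described above, and nothing more.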
Deep Learning (DL)
Deep learning is essentially the next generation of machine learning. As computational power has increased exponentially, the ability to build layers of machine learning algorithms across multiple GPUs has enabled many concurrent processes. This is deep learning: using the computational power of high-performance computers to analyze massive amounts of data and discover patterns.
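The “layers” idea can be sketched in a few lines. The weights below are fixed, invented values purely for illustration; a real deep learning system learns millions of such weights from data, spread across GPUs:

```python
# Each layer transforms its input and feeds the next - the essence of "deep".

def relu(v):
    """A common activation: pass positives through, clamp negatives to zero."""
    return [max(0.0, x) for x in v]

def dense(inputs, weights, biases):
    """One fully connected layer: weighted sum of inputs plus a bias."""
    return [sum(w * x for w, x in zip(row, inputs)) + b
            for row, b in zip(weights, biases)]

def forward(x, layers):
    """Pass the input through each (weights, biases) layer in turn."""
    for weights, biases in layers:
        x = relu(dense(x, weights, biases))
    return x

# Two stacked layers: 3 inputs -> 2 hidden units -> 1 output.
layers = [
    ([[0.5, -0.2, 0.1], [0.3, 0.8, -0.5]], [0.0, 0.1]),
    ([[1.0, -1.0]], [0.0]),
]
out = forward([1.0, 2.0, 3.0], layers)
```

Stacking more of these layers, and training their weights on massive data sets, is what turns this toy forward pass into deep learning.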
Cognitive Computing (CC)
Cognitive computing takes machine and deep learning to a higher plane. While machine learning is based on “trial and error”, it does not apply “reasoning” to the data it creates. The word cognition is derived from the Latin cognitiō, from cognōscere, which is from co- (intensive) + nōscere (to learn). Cognition is further defined as “the mental act or process by which knowledge is acquired, including perception, intuition, and reasoning” and “the knowledge that results from such an act or process”.
Cognitive computing applies a level of reasoning to the data and errors produced by machine learning. Looking for patterns in data and then comparing those patterns to other patterns is a form of reasoning. Cognitive processes use existing knowledge and generate new knowledge.
What distinguishes cognitive computing from machine learning is the generation of new knowledge. Cognitive computing is used heavily in recognition technologies. The discussion of semantics, or of recommendations based on unstructured data sets, falls within the parameters of cognitive computing. Recommendation engines look at patterns, identify similar objects and make suggestions or recommendations to a user. Online shopping is an example of this. If a user looks at a car out of curiosity, there is a high probability they will start getting emails and notices on social media about automobiles, accessories, loans and all things car.
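A toy version of such a recommendation engine – with invented items and attribute tags – might score catalog entries by how many attributes they share with what the user just viewed:

```python
# Pattern-based recommendation: rank items by attribute overlap.

def jaccard(a, b):
    """Overlap between two attribute sets (Jaccard similarity, 0.0 to 1.0)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def recommend(viewed, catalog, top_n=2):
    """Suggest the catalog items most similar to the viewed item."""
    scored = [(jaccard(viewed["tags"], item["tags"]), item["name"])
              for item in catalog if item["name"] != viewed["name"]]
    scored.sort(reverse=True)
    return [name for _, name in scored[:top_n]]

# The user looked at a car out of curiosity...
viewed = {"name": "sedan", "tags": {"car", "four-door", "loan"}}
catalog = [
    {"name": "suv", "tags": {"car", "four-door", "accessories"}},
    {"name": "car-loan", "tags": {"car", "loan"}},
    {"name": "toaster", "tags": {"kitchen"}},
]
picks = recommend(viewed, catalog)
```

The engine then surfaces “all things car” and never the toaster – pattern matching and comparison, not reasoning.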
Semantic technology can also be considered a variant of cognitive computing. Semantic technology is the next generation of “fuzzy logic”. The human mind can analyze information that is not “black and white” or “digital, as in 1 or 0”. The “grey” space is the intuitive aspect of the human mind’s ability to analyze considerable amounts of disparate information before making an informed decision. Fuzzy logic, or semantic analysis, is based on the variables or additional parameters that should be considered in data analysis. Semantic algorithms look at comparable data points – or points that, while not meeting the exact query criteria, suggest the query itself may be imprecise – and attempt to “guess”, based on a looser filter parameter, at other results for the query.
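One simple way to build such a looser filter – a sketch with an arbitrary edit-distance threshold, not an algorithm from any specific product – is to accept matches that are within a few character edits of the query term, allowing for the possibility that the query itself is inaccurate:

```python
# A "looser filter": accept near-miss matches instead of exact ones.

def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cost = 0 if ca == cb else 1
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + cost))  # substitution
        prev = cur
    return prev[-1]

def fuzzy_query(term, corpus, max_distance=2):
    """Return entries within max_distance edits of the (possibly wrong) term."""
    return [word for word in corpus
            if edit_distance(term.lower(), word.lower()) <= max_distance]

corpus = ["Kleenex", "Xerox", "tissue", "tissues"]
matches = fuzzy_query("tisue", corpus)  # the query itself is misspelled
```

An exact filter would return nothing for the misspelled query; the looser parameter lets the system “guess” what was probably meant.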
Cognitive computing is “defined” as creating new knowledge from the evaluation of existing knowledge. Machine Learning is an iterative process to determine the answer to a problem, not a predictive answer based on unknowns. Cognitive computing analyzes data and generates new data in a predictive determination of what the next layer in the data set might be.
Still Not Artificial Intelligence
The human mind is a miraculous organ. It is capable of processing multiple types of information from multiple sources, at varying intensities of input, and producing an action that may require motor control, or an answer to a problem or question. We talk of intuition and inference. What about deductive reasoning or understanding intent? Can speech-to-text understand cynicism or sarcasm? What about facetiousness?
I was told that an instance of IBM Watson was asked to name the most prolific anti-war poets of the ’60s. Bob Dylan’s name was not among the answers. How could that be, you might ask? While anyone reading or listening can INFER the message from the construction of his phraseology, there are no explicitly anti-war words in his songs and poems. Recognizing that message takes inference and deductive reasoning. And that’s where Artificial Intelligence is still lacking.
Would the events at Facebook and Google, where their technology created its own language for inter-application communications, be representative of Artificial Intelligence? This definitely got the attention of programmers, who all of a sudden didn’t understand this new language. Is artificial intelligence the creation of a new software language to streamline communications between computer systems? It’s certainly a component of machine learning and applied cognitive computing, moving towards inferred knowledge. Was there reasoning involved as algorithms created their own algorithms to solve problems or generate new knowledge?
Interesting concepts to ponder.
Editor’s Note: Gary Olson has a book on IP technology, “Planning and Designing the IP Broadcast Facility – A New Puzzle to Solve”, which is available at bookstores and online.