Understanding the Differences – ML, DL, CC, AI
By Gary Olson (Guest Columnist) | November 28, 2017
The history of computing has been a continuum of using technology to perform human tasks faster, more efficiently and more accurately. In some instances, computers can accomplish tasks or calculations beyond human capability in a reasonable time frame.
Artificial Intelligence (AI) has become the catch-all term for every advanced computer accomplishment. Now we delineate Machine Learning, Deep Learning, Cognitive Computing and Artificial Intelligence. We all believe there is AI involved in our daily lives as the home assistant takes our voice commands to order cat food or start the robotic vacuum cleaner. Recommendation engines use semantic algorithms to suggest a brand and color of clothing, or some other product, based on behavioral patterns. This is NOT AI.
Watson playing Jeopardy is not AI, and neither is the autonomous vehicle (which is part of the issue with how this technology is discussed). Let’s take a moment to explore the different high-order computing processes that are being called AI. This is similar to all facial tissues being called Kleenex, or all copying being called Xerox.
First there’s Machine Learning. This is the most basic process in the field of advanced computer “reasoning.” Let’s remember the famous chess matches between Garry Kasparov and IBM’s Deep Blue. First the machine was taught: programmed with every chess move on record. Then, as it continued to play and made mistakes, it was told to remember each error and not repeat it. If only children behaved this way.
Machine learning evolved from the study of pattern recognition and computational learning theory. Gartner’s 2016 Hype Cycle placed machine learning at the peak of inflated expectations. Creating effective machine learning is hard because discovering patterns is hard, and success depends on the amount of training data available. Machine learning relies on a consistent stream of input and on labeling each outcome as correct or incorrect, progressively narrowing the variables as it works to discover patterns in data or produce the solution to a problem.
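That error-correction loop — try, be told right or wrong, adjust — can be sketched in a few lines. The following is a minimal, illustrative perceptron that learns the logical AND function from labeled training examples; the names and structure are my own for illustration, not from any particular library.

```python
# A minimal sketch of supervised machine learning: a perceptron that
# learns the logical AND function by adjusting its weights whenever it
# makes a mistake -- the "remember the error and don't repeat it" loop.

def train_perceptron(examples, epochs=20, lr=0.1):
    """Learn weights from (inputs, label) pairs by error correction."""
    w = [0.0, 0.0]   # one weight per input
    b = 0.0          # bias term
    for _ in range(epochs):
        for (x1, x2), label in examples:
            prediction = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
            error = label - prediction       # 0 when the guess was correct
            w[0] += lr * error * x1          # nudge the weights toward
            w[1] += lr * error * x2          # the correct answer
            b += lr * error
    return w, b

def predict(w, b, x1, x2):
    return 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0

# Training data: every input pattern with its correct label.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
```

Note how the quality of the result depends entirely on the labeled examples supplied — exactly the training-data dependency described above.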
Deep Learning is essentially the next generation of machine learning. As computational power has increased exponentially, the ability to build layers of machine learning algorithms across multiple GPU’s has enabled multiple concurrent processes. This is deep learning, using the computational power of high performance computers to analyze massive amounts of data and discover patterns.
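“Layers” here means that the output of one learned transformation feeds the input of the next. The sketch below stacks two such layers in plain Python; the weights are fixed and invented for illustration (in practice they are learned from data, with the heavy arithmetic parallelized across GPUs).

```python
# A toy illustration of stacked layers in deep learning. Weights here
# are made up; real networks learn them from massive amounts of data.

import math

def dense_layer(inputs, weights, bias):
    """One layer: weighted sums of the inputs passed through a nonlinearity."""
    outputs = []
    for w_row, b in zip(weights, bias):
        total = sum(w * x for w, x in zip(w_row, inputs)) + b
        outputs.append(1.0 / (1.0 + math.exp(-total)))  # sigmoid squashing
    return outputs

# Two stacked layers: 3 inputs -> 2 hidden units -> 1 output.
hidden = dense_layer([0.5, -1.0, 2.0],
                     weights=[[0.1, 0.2, 0.3], [-0.2, 0.4, 0.1]],
                     bias=[0.0, 0.1])
output = dense_layer(hidden, weights=[[0.7, -0.5]], bias=[0.2])
```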
Cognitive Computing is taking machine/deep learning to a higher plane. While machine learning is based on “trial and error,” it does not apply “reasoning” to the data it creates. The word cognition is derived from the Latin cognitiō that derives from cognōscere which is from co- (intensive) + nōscere (to learn). It is further defined as “the mental act or process by which knowledge is acquired, including perception, intuition, and reasoning” and “the knowledge that results from such an act or process.”
Cognitive computing applies a level of reasoning to the data and errors produced by machine learning. Looking for patterns in data and then comparing those patterns to other patterns is a form of reasoning. Cognitive processes use existing knowledge and generate new knowledge.
What delineates cognitive computing from machine learning is the generation of new knowledge. Cognitive computing is used heavily in recognition technologies. The discussion of semantics, or of recommendations based on unstructured data sets, falls within the parameters of cognitive computing. Recommendation engines look at patterns, identify similar objects and make suggestions or recommendations to a user. Online shopping is an example of this. If a user looks at a car out of curiosity, there is a high probability they will start getting emails and notices on social media about automobiles, accessories, loans and all things car.
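The pattern-matching at the heart of a recommendation engine can be sketched simply: represent the user’s behavior and each item as feature vectors, score items by similarity, and recommend the closest match. The item names and feature values below are invented for illustration.

```python
# A hedged sketch of pattern-based recommendation using cosine
# similarity. Features and item names are illustrative only.

import math

def cosine_similarity(a, b):
    """How closely two feature vectors point in the same direction (0..1)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Feature vectors: [car-related, clothing-related, electronics-related]
items = {
    "car loan offer": [0.9, 0.0, 0.1],
    "winter jacket":  [0.0, 1.0, 0.0],
    "dash camera":    [0.6, 0.0, 0.8],
}

user_interest = [1.0, 0.1, 0.2]  # browsed a car out of curiosity
best = max(items, key=lambda name: cosine_similarity(user_interest, items[name]))
```

One glance at a car page tilts the interest vector toward everything automotive — which is why the ads follow.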
Semantic technology can also be considered a variant of cognitive computing. Semantic technology is the next generation of “fuzzy logic.” The human mind can analyze information that is not “black and white,” or “digital as in 1 or 0.” The “grey” space is the intuitive aspect of the human mind’s ability to analyze considerable amounts of disparate information before making an informed decision. Fuzzy logic, or semantic analysis, is based on the variables and additional parameters that should be considered in data analysis. Semantic algorithms look at comparable data points, including those that do not meet the exact query criteria; they allow that the query itself might be inaccurate and attempt to “guess” at other results for the query based on looser filter parameters.
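Loosening the filter can be demonstrated with the Python standard library: instead of demanding an exact match, accept results whose similarity to the (possibly inaccurate) query clears a threshold. The catalog entries below are invented for illustration.

```python
# A sketch of a "looser filter": exact matching fails on a misspelled
# query, but a similarity threshold still surfaces the likely intent.
# Catalog entries are invented; difflib is from the standard library.

import difflib

catalog = ["blue denim jacket", "black leather jacket",
           "blue denim jeans", "red wool scarf"]

query = "blu denim jaket"  # misspelled -- no exact match exists

# Strict, "black and white" filtering finds nothing...
exact = [item for item in catalog if item == query]

# ...but a fuzzy filter (similarity ratio >= 0.6) guesses the intent.
fuzzy = difflib.get_close_matches(query, catalog, n=2, cutoff=0.6)
```

The `cutoff` parameter is the “grey” space: raise it toward 1.0 and the filter behaves like an exact match; lower it and the algorithm entertains ever looser guesses about what the user meant.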
Cognitive computing is “defined” as creating new knowledge from the evaluation of existing knowledge. Machine learning is an iterative process for determining the answer to a problem, not for predicting an answer from unknowns. Cognitive computing analyzes data and generates new data, making a predictive determination of what the next layer in the data set might be.
Still Not Artificial Intelligence.
The human mind is a miraculous organ. It can process multiple types of information from multiple sources at varying intensities of input, and produce an action that may require motor control, or an answer to a problem or question. We talk of intuition and inference. What about deductive reasoning or understanding intent? Can speech-to-text understand cynicism or sarcasm? What about facetiousness?
I was told that an instance of IBM Watson was asked to name the most prolific anti-war poets of the ’60s. Bob Dylan’s name was not among the answers. How could that be, you might ask? While anyone reading or listening can INFER an anti-war message from the construction of his phraseology, there are no explicitly anti-war words in his songs and poems. Recognizing that message is deductive reasoning. And that’s where Artificial Intelligence is still lacking.
Would the recent events at Facebook and Google, where their technology created its own code language for inter-application communications, be representative of Artificial Intelligence? This definitely got the attention of programmers, who suddenly couldn’t understand the new language. Is artificial intelligence the creation of a new software language to streamline communications between computer systems? It’s certainly a component of machine learning and applied cognitive computing, and a move toward inferential knowledge. Was there reasoning involved as algorithms created their own algorithms to solve problems or generate new knowledge?
Interesting concepts to ponder.
Gary Olson is an IP broadcast designer, technology architect, project executive, IP broadcast evangelist, author, trainer and pundit. Explore his book Planning and Designing the IP Broadcast Facility: A New Puzzle to Solve. View Gary on LinkedIn.