Artificial Intelligence and Natural Language Processing

Natural language is the simplest form of human communication, and the hardest for machines to understand.

By Praful Krishna  |  October 1, 2017


The first artificial neural networks were developed in the 1960s, when The Beatles and The Rolling Stones topped the charts. But it is only in the last 10 years that advances in computational power and big data have made them useful enough to pique the interest of the business community.

One technique in particular has proven itself spectacularly successful in getting the most out of neural networks: deep learning, the application of complex neural network architectures to model patterns in data. Thanks to deep learning, computers can now recognize objects in images and video with greater precision than the human eye, transcribe speech to text faster than a journalist, predict crop yield better than the USDA, and diagnose cancer more accurately than the world’s best physicians.

While deep learning has proven itself as a powerful learning system, the jury is out on whether it is capable of solving what is perhaps the next biggest outstanding challenge for artificial intelligence: Natural Language Processing.

Humans learn to process natural language efficiently and accurately from a very young age; in fact, it is one of the easiest skills we ever learn. For a machine, the reverse is true. But if we can get computers to analyze, understand, and derive meaning from human language in smart and useful ways, the possibilities are limitless. We could surface insights that would take even the most intelligent human more than a lifetime of reading and research to discover, and we could do so around the clock, free of fatigue, bias, and error.

In situations where there is a gigantic amount of annotated data available, deep learning algorithms can still be applied to take on Natural Language Processing. Google Translate, for instance, has used deep neural networks to come tantalizingly (or not-so-tantalizingly, if you work as a translator) close to human-like translation. But an AI must be trained before it can start producing results – and Google relies on a global army of gig economy workers and volunteers to clean and structure the data its networks train on. Just enabling Kazakh-English translation required two million human contributions.

Even if they can get their hands on enough data to tackle NLP with deep learning, it is rare that an organization can afford the army of subject matter experts needed to tag the training data. A worthwhile return on investment from applying deep learning to natural language is rarer still. A case in point is the recent announcement by MD Anderson that, despite a $62 million investment over three years to train IBM Watson, the system was not able to yield any results.

To democratize Natural Language Processing and make it affordable in mainstream business applications, we need machine learning built on a completely different thought process than deep neural networks currently embody. We need an algorithm that can, without human supervision, learn how to handle the probabilistic nature of natural language: interpreting facts, recognizing meaning, making decisions based on context, and adapting to the changing habits of its users. An algorithm, in other words, built specifically for Natural Language Processing.

With such an algorithm, it becomes economically viable to automate vast swathes of the tedious but complicated work white-collar professionals have become all too familiar with: redacting documents, ensuring compliance with stringent and ever-shifting regulations, Googling and synthesizing information into reports, digging up statistics from labyrinthine legacy databases, and answering the same questions over and over again. With such an algorithm, these professionals would be free to focus on the things that matter: creativity, judgment, and strategy.

Three tenets of Calibrated Quantum Mesh, the approach introduced below:

  • In nature, variables do not have absolute values; rather, they can take various forms (quantum states) at various odds.
  • Almost everything is interconnected with everything else, with varying levels of influence, forming an intricate mesh.
  • Patient consideration of all the available information, applied piece by piece, can calibrate the odds of the states such a mesh can take, leading to viable solutions (a toy sketch follows below).
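
Coseer has not published CQM’s internals, so as a purely illustrative reading of the three tenets, here is a minimal Python sketch: each variable holds odds over several candidate states, variables are linked into a mesh of weighted influences, and each piece of evidence, applied one at a time, recalibrates the odds and ripples through the mesh. Every name here (QuantumMesh, MeshVariable, observe) is our own invention, and the update rule is a simple stand-in for whatever calibration the real algorithm performs.

```python
from collections import defaultdict

class MeshVariable:
    """Tenet 1: a variable holds odds over several candidate states
    rather than one absolute value."""
    def __init__(self, name, states):
        self.name = name
        self.odds = {s: 1.0 / len(states) for s in states}  # uniform start

class QuantumMesh:
    """Hypothetical toy illustration, not Coseer's implementation."""
    def __init__(self):
        self.variables = {}
        self.influence = defaultdict(float)  # (a, b) -> influence weight

    def add_variable(self, name, states):
        self.variables[name] = MeshVariable(name, states)

    def connect(self, a, b, weight):
        # Tenet 2: variables interconnect with varying influence.
        self.influence[(a, b)] = weight
        self.influence[(b, a)] = weight

    def observe(self, name, state, strength=2.0):
        # Tenet 3: apply one piece of evidence, then let it ripple
        # through the mesh, recalibrating neighbors' odds.
        var = self.variables[name]
        var.odds[state] *= strength
        self._normalize(var)
        for (a, b), w in list(self.influence.items()):
            if a == name:
                neighbor = self.variables[b]
                if state in neighbor.odds:
                    # Matching state labels reinforce each other.
                    neighbor.odds[state] *= 1.0 + w * (strength - 1.0)
                    self._normalize(neighbor)

    @staticmethod
    def _normalize(var):
        total = sum(var.odds.values())
        for s in var.odds:
            var.odds[s] /= total

# Toy word-sense example: one piece of context evidence tilts the
# odds for the ambiguous word "bank" toward its riverbank sense.
mesh = QuantumMesh()
mesh.add_variable("bank", ["riverbank", "financial"])
mesh.add_variable("context", ["riverbank", "financial"])
mesh.connect("bank", "context", weight=0.8)
mesh.observe("context", "riverbank")
print(mesh.variables["bank"].odds)  # odds now favor "riverbank"
```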

We call such AI technology ‘cognitive’ because it must process language in a manner similar to the human brain. It must answer natural questions with natural answers, like a seasoned professional who knows their organization’s databases inside out. It must synthesize the important information from different sources into concise summaries and reports, and protect data security by intelligently redacting sensitive documents before they are shared with stakeholders.

In one significant respect, cognitive automation is very different from human capital: it provides much greater scale, running around the clock at a fraction of the cost of a human resource, without any manual or fatigue-related errors, cognitive bias, or risk.

This is what we at Coseer have developed: an AI algorithm called Calibrated Quantum Mesh, built to train on and process natural language without the need for pre-tagged data. It combines advanced data science techniques like cognitive calibration, which tiers training data into a hierarchy of reliability, with a design that closely follows human thought processes. CQM also enables Coseer to configure for a variety of problems across numerous contexts, ultimately making it a true cognitive agent.
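
Coseer describes cognitive calibration only at this level of detail, but the tiering idea can be made concrete with one more assumption-laden snippet that builds on the toy mesh above: evidence from a more reliable source tier moves the odds more. The tier names and weights below are invented for illustration.

```python
# Hypothetical reliability tiers; the real hierarchy is not public.
TIER_WEIGHTS = {"curated": 1.0, "enterprise_docs": 0.7, "open_web": 0.4}

def calibrated_strength(base_strength, tier):
    """Scale an observation's pull on the odds by its source's tier."""
    return 1.0 + (base_strength - 1.0) * TIER_WEIGHTS[tier]

# A fact scraped from the open web nudges the mesh less than a
# curated one would:
mesh.observe("context", "riverbank",
             strength=calibrated_strength(2.0, "open_web"))
```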

Only time will tell whether Coseer’s cognitive algorithm proves as groundbreaking in the field of artificial intelligence as The Beatles and The Rolling Stones were in that of music, but we’re already applying it to help organizations act and decide faster, optimize portfolios, react to market fluctuations, reduce labor costs, achieve greater scale, and uncover product and service innovations.

