The tech industry is always bustling with new technologies and innovations that transform business models, human lives and the global economy. It is therefore crucial that business organizations and policy leaders look ahead and identify the technology trends that will matter to them. Every new technology passes through a hype cycle in which it is perceived as a silver bullet for everything, and Artificial Intelligence (AI) has followed the same path. But beyond the hype and promises, we have reached a new level of machine cognition that has started to deliver real-life benefits. From warehouse robots to self-driving vehicles and chatbots to music composition, AI is making its impact across industries.
While artificial intelligence and its related fields continue to gain popularity, there is considerable confusion around the technological jargon. Artificial intelligence and cognitive computing are two of the hottest buzzwords, and they are often used interchangeably. However, the two terms are quite distinct in their objectives and approaches. A sound understanding of the contrast between these technologies is important when deciding which is best for your business or application.
Understanding Artificial Intelligence (AI)
AI can be defined as a set of technologies that enable machines to understand human intelligence, learn to imitate it and act accordingly. Though AI has experienced a real boom in recent times, its history began far earlier than you might think. In fact, the term “Artificial Intelligence” was first coined by professor John McCarthy at the Dartmouth Conference in 1956. Over the years, AI has experienced several booms and busts; among the highlights were the introduction of the world’s first chatbot, Eliza, in 1966, IBM Deep Blue’s victory over world chess champion Garry Kasparov in 1997, and IBM Watson’s Jeopardy! victory in 2011. In the following years, Siri, Google Now and Cortana hit the market. The exponential growth of data, increases in computing power and advances in deep neural networks are accelerating the pace of innovation in AI.
Artificial intelligence is an umbrella term for smart technologies that simulate human intelligence in machines. Machine learning (ML), natural language processing (NLP), deep learning and robotic process automation (RPA) are among the many technologies that fall under it. Machine learning is one of the most exciting subsets of AI: it enables humans to ‘teach’ machines by processing huge volumes of data, finding patterns in that data and using those patterns to make predictions. Over time, machines tweak their algorithms in response to the data they are exposed to. Today, machine learning services power traffic predictions, product recommendations, refined search engine results, medical diagnosis and more.
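The learn-from-examples loop described above can be sketched in a few lines. This is a minimal, illustrative 1-nearest-neighbour classifier in plain Python; the traffic data points and labels are invented for the example, not drawn from a real dataset.

```python
# Toy machine-learning sketch: "learn" patterns from labelled examples,
# then predict a label for new, unseen data (1-nearest-neighbour).

def distance(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def predict(training_data, query):
    """Return the label of the training example closest to `query`."""
    nearest = min(training_data, key=lambda example: distance(example[0], query))
    return nearest[1]

# Hypothetical labelled examples: ((hour_of_day, vehicles_per_hour), label)
training_data = [
    ((8.0, 900.0), "heavy"),
    ((14.0, 300.0), "light"),
    ((18.0, 850.0), "heavy"),
    ((23.0, 120.0), "light"),
]

print(predict(training_data, (9.0, 870.0)))  # -> heavy
```

Real machine-learning systems use far richer models and much larger datasets, but the principle is the same: the prediction comes from patterns in past data, not from hand-written rules.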
Understanding Cognitive Computing
Cognitive computing, a mashup of cognitive science and computer science, focuses on building systems that simulate human thought processes. Many regard cognitive computing as the third era of computing, after the tabulating systems of the early 1900s and the programmable systems that emerged around 1950. According to the Cognitive Computing Consortium, cognitive computing systems should have five key attributes: adaptive, interactive, iterative, stateful and contextual. IBM describes them as “systems that learn at scale, reason with purpose and interact with humans naturally.”
Cognitive computing uses self-learning algorithms to mimic human brain activity and can deal with conceptual and symbolic information rather than just raw data or sensor streams. Rather than relying on a single AI technology, cognitive systems draw on several disciplines, including machine learning, deep learning, neural networks, statistical analysis, natural language processing, contextual awareness, sentiment analysis, and speech and vision recognition. This enables cognitive systems to refine the way they learn and interpret data, so that they can apprehend new problems and derive the best possible solutions.
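To make one of those disciplines concrete, here is a toy sketch of lexicon-based sentiment analysis, one of the simplest NLP techniques a cognitive system might draw on. The word lists are invented for illustration; production systems use trained models and far richer lexicons.

```python
# Toy lexicon-based sentiment scorer: positive score > 0, negative < 0.
import string

POSITIVE = {"good", "great", "excellent", "helpful", "love"}
NEGATIVE = {"bad", "poor", "terrible", "useless", "hate"}

def sentiment(text):
    """Count positive words minus negative words in the text."""
    score = 0
    for word in text.lower().split():
        word = word.strip(string.punctuation)  # drop trailing commas, periods, etc.
        if word in POSITIVE:
            score += 1
        elif word in NEGATIVE:
            score -= 1
    return score

print(sentiment("the support was great and very helpful"))  # -> 2
print(sentiment("a terrible, useless experience"))           # -> -2
```

A cognitive system would combine signals like this with context, user state and other disciplines from the list above, rather than relying on any single one.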
The first cognitive computing platform, IBM Watson, was introduced to the world as a competitor on the Jeopardy! quiz show in 2011, where it went on a winning streak and ultimately won the game. Named after the company’s first CEO, Thomas J. Watson, IBM Watson is powered by 50 different cognitive technologies. Currently, many major technology companies, including Intel, Microsoft, Cisco and Google DeepMind, as well as promising start-ups, are working on cognitive computing platforms.
Understanding the difference
The difference between artificial intelligence solutions and cognitive computing lies in human interaction. In other words, AI is applied to automate processes, while cognitive computing augments human capabilities. An AI-based system doesn’t mimic human thought; it solves a complex problem using the best available algorithm. Cognitive computing, on the other hand, simulates human thought processes and assists humans in making informed decisions. Imagine a scenario in which a doctor is looking for the best course of treatment for a patient. Both an AI system and a cognitive system would search, process and interpret huge volumes of medical records and journal articles to find the best options. The AI system would select the most relevant treatment on behalf of the doctor, whereas the cognitive system would present all the necessary information and leave the final decision to the doctor.
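The automate-versus-augment contrast above can be sketched as two design patterns sharing one analysis step. The treatment names and evidence scores below are invented purely for illustration.

```python
# Schematic contrast: an AI-style system decides; a cognitive-style
# system ranks the evidence and leaves the decision to the human.

def rank_options(options):
    """Shared analysis step: sort candidate treatments by evidence score."""
    return sorted(options, key=lambda o: o["evidence_score"], reverse=True)

def ai_decide(options):
    """Automation: the system itself picks the single best answer."""
    return rank_options(options)[0]["name"]

def cognitive_assist(options):
    """Augmentation: present the full ranked evidence to the human expert."""
    return [(o["name"], o["evidence_score"]) for o in rank_options(options)]

treatments = [  # hypothetical candidates surfaced from the literature
    {"name": "treatment A", "evidence_score": 0.72},
    {"name": "treatment B", "evidence_score": 0.91},
    {"name": "treatment C", "evidence_score": 0.65},
]

print(ai_decide(treatments))         # -> treatment B  (system decides)
print(cognitive_assist(treatments))  # ranked list for the doctor to review
```

Note that both paths run the same underlying analysis; the difference is entirely in who acts on the result, which is exactly the distinction drawn above.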
In short, artificial intelligence is rooted in the idea that machines can make better decisions on behalf of human beings, while cognitive computing mimics human brain activity to help humans make better decisions themselves.
Implications for the future
According to a report published by the research firm MarketsandMarkets, the AI market is expected to grow to $190 billion by 2025. AI’s untapped potential could transform nearly every industry by optimizing processes and driving higher revenues and profits. Major applications of AI can be found in finance, security, healthcare, government, retail and manufacturing. Analytical Research Cognizance projects that the cognitive computing market will grow at a 31.0% revenue CAGR, reaching US$49.8 billion by 2023, up from US$9.85 billion in 2017. The major applications of cognitive computing are in healthcare, education and the industrial sector, among others.
Final Thoughts
Artificial intelligence and cognitive computing are ideal for interacting with an increasingly complex world in which people expect personalized information. Not only are these technologies powerful enough to solve problems deemed too complex for the average human brain, they also deliver significant productivity gains. AI and cognitive computing are similar in intent but differ in their degree of human interaction, and both technologies are set to advance rapidly across diverse industries and pave the way for a promising future.