
AI, Machine Learning, Deep Learning, and Neural Networks

(MIT Ray and Maria Stata Center, Jenny Fowter)


Artificial Intelligence: Fueling the Next Wave of the Digital Era


- Artificial Intelligence: The Science of Making Inanimate Objects Smart

From smartphones to chatbots, Artificial Intelligence (AI) is already ubiquitous in our digital lives. The momentum behind AI is building, thanks in part to the massive amounts of data that computers gather every day about our likes, our purchases, and our movements. And specialists in AI research use all that data to teach machines to learn and to predict what we want - or detest.

AI is a general term that refers to techniques that enable computers to mimic human behavior. It is an overarching term for a collection of technologies. AI deals with computer models and systems that perform human-like cognitive functions such as reasoning and learning. AI software is capable of learning from experience, which differentiates it from more conventional software that is preprogrammed and deterministic in nature. AI doesn't necessarily mean giving intelligence or consciousness to machines in the same way that a person is intelligent and conscious. It simply means the machine is able to solve a particular problem or class of problems. AI helps to solve problems by performing tasks that involve skills such as pattern recognition, prediction, optimization, and recommendation generation, based on data from videos, images, audio, numerical data, text, and more.

AI - together with Augmented Reality (AR), Virtual Reality (VR), 5G technology, and hyper-automation -  is considered one of the main enablers of the Internet of Senses (IoS), a megatrend in the next decade (from 2021 to 2030).


The AI Resurgence

AI and ML principles have been around for decades. AI's recent surge in popularity is a direct result of two factors. First, AI/ML algorithms are computationally intensive, and the availability of cloud computing has made it practical to run them at scale. Second, training AI/ML models requires massive amounts of data, and the availability of big data platforms and digital data has improved the effectiveness of AI/ML, making it better than humans in many applications.

The infrastructure's speed, availability, and sheer scale have enabled bolder algorithms to tackle more ambitious problems. Not only is the hardware faster, sometimes augmented by specialized arrays of processors (e.g., GPUs), it is also available in the form of cloud services. What used to run in specialized labs with access to supercomputers can now be deployed to the cloud at a fraction of the cost and with far less effort. This has democratized access to the hardware platforms needed to run AI, enabling a proliferation of start-ups. Furthermore, emerging open source technologies, such as Hadoop, allow speedier development of scaled AI technologies applied to large, distributed data sets.

Larger players are investing heavily in various AI technologies. These investments go beyond simple R&D extensions of existing products, and are often quite strategic in nature. Take, for example, IBM's scale of investment in Watson, or Google's investments in driverless cars, Deep Learning (i.e., DeepMind), and even Quantum Computing, which promises to significantly improve the efficiency of machine learning algorithms.


- The Future of Artificial Intelligence (AI)

Technology is transforming how humans and machines work together. People are relying on machines to help them make better informed decisions, expand reach and access, and increase safety and productivity. This new era of human-machine collaboration depends on trust and understanding - allowing each component of the team to do what it does best. The future of autonomy isn't human-less. It's human more. 

AI has exploded over the past few years, and especially since 2015. Much of that has to do with the wide availability of GPUs, which make parallel processing ever faster, cheaper, and more powerful. It also has to do with the simultaneous one-two punch of practically infinite storage and a flood of data of every stripe (the whole Big Data movement): images, text, transactions, mapping data, you name it.

AI has many applications in today's society. It is becoming essential because it can solve complex problems efficiently in multiple industries, such as healthcare, entertainment, finance, and education. AI is making our daily lives more comfortable and faster. AI-enabled technologies are already shifting how we communicate, how we work and play, and how we shop and care for our health. For businesses, AI has become an absolute imperative for creating and maintaining a competitive edge.

With AI penetrating our daily lives - there to stay, and making them easier - it is interesting to see how quickly it is developing and evolving, allowing different industries to advance. Science fiction is gradually becoming reality, with new technological developments emerging every day. Who knows what tomorrow brings?


- AI Is Evolving to Process the World Like Humans

AI is evolving all by itself. Researchers have created software that borrows concepts from Darwinian evolution, including “survival of the fittest,” to build AI programs that improve generation after generation without human input. AI offers broad technological capabilities that can be applied to all industries, profoundly transforming the world around us. 

As AI researchers work on developing and perfecting their machine learning and AI algorithms, the end goal is ultimately to recreate the human brain. The most perfect AI imaginable would be able to process the world around us through typical sensory input but leverage the storage and computing strengths of supercomputers. With that end goal in mind, it's not hard to understand the ways that AI is evolving as it continues to be developed.  

Deep learning AI is able to interpret patterns and derive conclusions. In essence, it is learning how to mimic the way that humans process the world around us. That said, from the outset, AIs generally need typical computer input, like coded data. Developing AIs that can process the world through audio, visual, and other sensory input is a much harder task.


- The Relationship Between AI, ML, DL, and Neural Networks

These three terms - Artificial Intelligence (AI), Machine Learning (ML), and Deep Learning (DL) - are critical to understand both on their own and in how they relate to each other: from a sales team explaining the services they provide, to the data scientists who must decide which of these model types to use. And while each of AI, ML, and DL has its own definition, data requirements, level of complexity, transparency, and limitations, what each means and how they relate depends entirely on the context in which you view them.


  • Artificial Intelligence (AI): Mimicking the intelligence or behavioral patterns of humans or other living beings. A machine exhibits AI when it can mimic human intelligence through the ability to predict, classify, learn, plan, reason, and/or perceive.
  • Machine Learning (ML): A technique by which a computer can "learn" from data without being explicitly programmed with rules. This approach is mainly based on training a model on data sets. Machine learning is a subset of AI that incorporates math and statistics in order to learn from the data itself and improve with experience.
  • Deep Learning (DL): A technique for performing machine learning inspired by our "brain's own network of neurons" - a network capable of adapting itself to new data. Deep learning is a subset of ML that uses neural networks to solve ever more complex challenges, such as image, audio, and video classification.
  • Neural Networks: A beautiful, biologically inspired programming paradigm that enables a computer to learn from observational data. Deep learning is a powerful set of techniques for learning in neural networks.


Machine Learning, although widely considered a form of AI, is designed to let machines learn from data rather than from explicit programming. Its practical use is to predict outcomes, in the same way we recognize a red octagonal sign with white letters and know to stop. AI, on the other hand, can determine the best course of action: how to stop, when, and so on. The difference, simply put: Machine Learning predicts; AI acts.
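The key point above - that ML infers a rule from data instead of being handed one - can be sketched in a few lines. This is a toy, hypothetical example (the data and the threshold-learning rule are made up for illustration, not a real library API):

```python
# Toy illustration: instead of hard-coding a decision rule, "learn" one
# from labeled examples. All names and data here are hypothetical.

def learn_threshold(examples):
    """Learn a 1-D decision threshold from (value, label) pairs.

    Picks the midpoint between the largest value labeled 0 and the
    smallest value labeled 1 (assumes the two classes are separable).
    """
    zeros = [x for x, y in examples if y == 0]
    ones = [x for x, y in examples if y == 1]
    return (max(zeros) + min(ones)) / 2

def predict(threshold, x):
    """Apply the learned rule to a new case."""
    return 1 if x >= threshold else 0

# Hypothetical training data: sign diameter in cm -> is it a stop sign?
training = [(20, 0), (25, 0), (30, 0), (60, 1), (70, 1), (75, 1)]
t = learn_threshold(training)   # 45.0 - this rule was inferred, not programmed
print(predict(t, 68))           # -> 1
```

The programmer never wrote "if diameter >= 45"; the data did. That, in miniature, is the predict side of "Machine Learning predicts; AI acts."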


[Artificial Intelligence System - Deloitte]

- The Rise of Machine Learning (ML)

Machine Learning (ML) is an interdisciplinary field that uses statistics, probability, and algorithms to learn from data and provide insights that can be used to build intelligent applications.

ML is a current application of AI. The technology is based on the idea that we should really just be able to give machines access to data, and let them learn for themselves. Machine learning is a technique in which we train a software model using data. The model learns from the training cases and then we can use the trained model to make predictions for new data cases.  
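The train-then-predict workflow described above can be made concrete with the simplest possible model: one-variable least-squares regression, written in plain Python. The data set and variable names are invented purely for illustration:

```python
# A minimal sketch of "train a model on cases, then predict new cases",
# using one-variable ordinary least squares (pure Python, illustrative only).

def train(xs, ys):
    """Fit y ~ a*x + b by ordinary least squares."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    a = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    b = mean_y - a * mean_x
    return a, b

def predict(model, x):
    """Use the trained model on a new, unseen case."""
    a, b = model
    return a * x + b

# Hypothetical training cases: hours studied -> exam score
model = train([1, 2, 3, 4], [52, 54, 56, 58])
print(predict(model, 5))  # -> 60.0
```

The model "learns" the slope and intercept from the training cases; prediction is then just applying what was learned, which is the pattern every ML system follows at whatever scale.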

ML provides the foundation for Artificial Intelligence (AI). Two important breakthroughs led to the emergence of ML as the vehicle driving AI development forward at its current pace. One was the realization that, rather than teaching computers everything they need to know about the world and how to carry out tasks, it might be possible to teach them to learn for themselves. The second was the emergence of the Internet, and the huge increase in the amount of digital information being generated, stored, and made available for analysis. Once these innovations were in place, engineers realized that rather than teaching computers and machines how to do everything, it would be far more efficient to code them to learn like human beings, and then plug them into the Internet to give them access to all of the information in the world.

ML is concerned with the scientific study, exploration, design, analysis, and applications of algorithms that learn concepts, predictive models, behaviors, action policies, etc. from observation, inference, and experimentation and the characterization of the precise conditions under which classes of concepts and behaviors are learnable. Learning algorithms can also be used to model aspects of human and animal learning. Machine learning integrates and builds on advances in algorithms and data structures, statistical inference, information theory, signal processing as well as insights drawn from neural, behavioral, and cognitive sciences. 


- Deep Learning (DL)

Deep learning (DL) uses artificial neural networks (ANNs) to perform sophisticated computations on large amounts of data. It is a type of machine learning that works based on the structure and function of the human brain. DL algorithms train machines by learning from examples. Industries such as health care, eCommerce, entertainment, and advertising commonly use deep learning. 

While DL algorithms feature self-learning representations, they depend on ANNs that mirror the way the brain computes information. During training, the algorithms use unknown elements in the input distribution to extract features, group objects, and discover useful data patterns. This occurs at multiple levels of abstraction, with the algorithms building up the models layer by layer.

DL models make use of several algorithms. While no one network is considered perfect, some algorithms are better suited to perform specific tasks. To choose the right ones, it’s good to gain a solid understanding of all primary algorithms. 

DL is the hot topic of the day because it aims to simulate human thinking, and it is getting lots of attention for good reason: it is achieving results that were not possible before. In deep learning, a computer model learns to perform classification tasks directly from images, text, or sound. DL models can achieve state-of-the-art accuracy, sometimes exceeding human-level performance. Models are trained using large sets of labeled data and neural network architectures that contain many layers.

DL is basically ML on steroids: it allows the crunching of vast amounts of data with improved accuracy. Being more powerful, it also requires considerably more computing power. The algorithms can determine on their own (without the intervention of an engineer) whether a prediction is accurate. Think, for example, of providing an algorithm with thousands of images and videos of cats and dogs. It can look at whether the animal has whiskers, paws, or a furry tail, and use what it has learned to predict whether new data fed into the system is more likely to be a cat or a dog.
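The cat-versus-dog idea can be sketched with a single artificial neuron (a perceptron) trained on made-up feature vectors. Real deep learning works from raw pixels through many layers; the two features, the data, and the labels below are all hypothetical, chosen only to show learning from labeled examples:

```python
# A toy perceptron trained on invented (whisker_length_cm, weight_kg)
# features to separate "cat" (1) from "dog" (0). Purely illustrative:
# real DL uses raw pixels and many layers, not two hand-picked features.

def train_perceptron(data, epochs=20, lr=0.1):
    """Classic perceptron rule: nudge weights toward misclassified cases."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, label in data:
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = label - pred          # 0 when correct, +/-1 when wrong
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

def classify(model, x):
    w, b = model
    return "cat" if w[0] * x[0] + w[1] * x[1] + b > 0 else "dog"

# Invented examples: cats with long whiskers and low weight, dogs the opposite
data = [((6.0, 4.0), 1), ((7.0, 5.0), 1), ((2.0, 20.0), 0), ((1.0, 30.0), 0)]
model = train_perceptron(data)
print(classify(model, (6.5, 4.5)))  # -> cat
```

No engineer wrote a cat rule; the weights settle into one from the examples, which is the self-correcting behavior the paragraph above describes, reduced to its smallest possible case.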


- Neural Networks

A neural network is a series of algorithms that endeavors to recognize underlying relationships in a set of data through a process that mimics the way the human brain operates. In this sense, neural networks refer to systems of neurons, either organic or artificial in nature. Neural networks can adapt to changing input, so the network generates the best possible result without the output criteria needing to be redesigned.

Neural networks are a set of algorithms, modeled loosely after the human brain, that are designed to recognize patterns. They interpret sensory data through a kind of machine perception, labeling or clustering raw input. The patterns they recognize are numerical, contained in vectors, into which all real-world data, be it images, sound, text or time series, must be translated.

Neural networks help us cluster and classify. You can think of them as a clustering and classification layer on top of the data you store and manage. They help to group unlabeled data according to similarities among the example inputs, and they classify data when they have a labeled dataset to train on. Neural networks can also extract features that are fed to other algorithms for clustering and classification; so you can think of deep neural networks as components of larger machine-learning applications involving algorithms for reinforcement learning, classification and regression.

Neural networks and deep learning currently provide the best solutions to many problems in image recognition, speech recognition, and natural language processing.



[More to come ...]
