
AI, Machine Learning, Deep Learning, and Neural Networks

[Image: MIT Ray and Maria Stata Center, photo by Jenny Fowter]

 

Artificial Intelligence: Fueling the Next Wave of the Digital Era

 
 

- Overview

Artificial Intelligence (AI) is not a single subject; it has sub-fields such as Learning (Machine Learning & Deep Learning), Communication using Natural Language Processing (NLP), Knowledge Representation & Reasoning, Problem Solving, and Uncertain Knowledge & Reasoning.

AI is a field of computer science dedicated to solving cognitive problems commonly associated with human intelligence, such as learning, content creation, and image recognition. Modern organizations collect vast amounts of data from different sources such as smart sensors, human-generated content, monitoring tools, and system logs. The goal of AI is to create self-learning systems that derive meaning from data. AI can then apply this knowledge to solve new problems in a human-like way.

For example, AI technology can respond meaningfully to human conversations, create original images and text, and make decisions based on real-time data inputs.

Your organization can integrate AI capabilities into applications to optimize business processes, improve customer experience, and accelerate innovation.

Please refer to the sections below for more information.

 

- The Internet of Sensing (IoS)

In the past few years, the field of artificial intelligence (AI) has made significant progress in almost all of its standard sub-fields, including vision, speech recognition and generation, natural language processing (understanding and generation), image and video generation, multi-agent systems, robotics, planning, decision making, and the integration of vision and motion control.

In addition, breakthrough applications have emerged in many fields, such as gaming, medical diagnosis, logistics, autonomous driving, language translation, and interactive personal assistants.

AI, together with Augmented Reality (AR), Virtual Reality (VR), 5G technology, and hyperautomation, is considered one of the main drivers of the Internet of Sensing (IoS), which is expected to grow significantly over the next decade.

The Internet of Sensing (IoS) is a set of technologies that extends our senses beyond our bodies. It allows us to experience the world around us through multiple enhanced senses, including vision, hearing, touch, and smell.

The Internet of Things (IoT) connects the digital and physical worlds. It uses sensors to monitor physical conditions such as temperature, motion, or other environmental changes. An actuator then receives the signal from the sensor and responds to the change.
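
To make the sense-and-respond flow concrete, here is a minimal sketch in Python; the temperature sensor, threshold, and fan actuator are simulated stand-ins (a real deployment would read from hardware or an IoT platform SDK):

    import random
    import time

    def read_temperature_sensor() -> float:
        """Simulated sensor: return a temperature reading in degrees Celsius."""
        return random.uniform(15.0, 35.0)

    def set_fan(on: bool) -> None:
        """Simulated actuator: switch a cooling fan on or off."""
        print(f"Fan {'ON' if on else 'OFF'}")

    THRESHOLD_C = 28.0  # assumed comfort threshold

    # Basic sense -> decide -> actuate loop
    for _ in range(5):
        temperature = read_temperature_sensor()
        print(f"Sensor reading: {temperature:.1f} C")
        set_fan(temperature > THRESHOLD_C)
        time.sleep(0.1)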

  

- AI: The Science of Making Inanimate Objects Smart

From smartphones to chatbots, AI is already ubiquitous in our digital lives. The momentum behind AI is building, in part because computers can collect vast amounts of data about our everyday preferences, purchases, and activities. Researchers use all this data to train machine learning (ML) models that predict what we want and what we dislike.

AI is a general term referring to technology that enables computers to mimic human behavior. It is an umbrella term for a group of technologies. AI deals with computer models and systems that perform human-like cognitive functions, such as reasoning and learning. AI software is able to learn from experience, distinguishing it from more traditional pre-programmed and deterministic software. 

AI does not necessarily mean giving machines intelligence or consciousness in the same way that humans are intelligent and conscious. It simply means that the machine is able to solve a specific problem or class of problems. 

AI helps solve problems by performing tasks involving skills such as pattern recognition, prediction, optimization, and recommendation generation based on data such as video, images, audio, numbers, text, and more. 

 

- The AI Resurgence

AI and ML principles have been around for decades. The recent popularity of AI is a direct result of two factors. First, AI/ML algorithms are computationally intensive, and the availability of cloud computing has made it practical to run them at scale. Second, training AI/ML models requires a lot of data, and the availability of big data platforms and digital data has increased the effectiveness of AI/ML, allowing it to match or exceed human performance in many applications.

The speed, availability, and sheer scale of this infrastructure enable bolder algorithms to tackle more ambitious problems. Not only is the hardware faster, sometimes enhanced with specialized processor arrays such as GPUs, it is also available as a cloud service. What used to run in specialized labs with access to supercomputers can now be deployed in the cloud at little cost and with far less effort.

This has democratized access to the hardware platforms needed to run AI, allowing startups to proliferate. Additionally, emerging open source technologies, such as Hadoop, allow for faster development of scaled AI techniques applied to large and distributed datasets. 

Larger players are investing heavily in various AI technologies. These investments go beyond simple R&D expansion of existing products and are often strategic. For example, IBM's investment in Watson and Google's investments in driverless cars, deep learning (through DeepMind), and even quantum computing promise to significantly improve the efficiency of machine learning algorithms.

 

- The Future of AI

AI technologies are already changing how we communicate, how we work and play, and how we shop and manage our health. For businesses, AI has become an absolute necessity for creating and maintaining a competitive advantage.

As AI permeates our daily lives and aims to make our lives easier, it will be interesting to see how quickly it develops and evolves, enabling different industries to evolve. Science fiction is slowly becoming a reality as new technological developments appear every day. Who knows what tomorrow will bring?

AI is expected to have a significant impact on the future, with the potential to improve industries, create new jobs, and increase economic growth:

  • Economic growth: AI could increase the world's GDP by 14% by 2030. It could also create new products, services, and industries.
  • Improved industries: AI could improve healthcare, manufacturing, customer service, and other industries. It could also lead to higher-quality experiences for customers and workers.
  • New jobs: AI-driven automation could change the job market, creating new positions and skills.
  • Augmented human capabilities: AI could help humans thrive in their fields by automating repetitive tasks and streamlining workflows.
  • Personalized learning: AI-powered tutoring systems could tailor instruction to individual learning needs.
  • Scientific discovery: AI could help scientists advance their work by extracting data from imagery and performing other tedious tasks.
  • Video creation: AI could be used to create short-form videos for TikTok, video lessons, and corporate presentations.


However, AI also faces challenges, including increased regulation, data privacy concerns, and worries over job losses. If AI falls into the wrong hands, it could be used to expose people's personal information, spread misinformation, and perpetuate social inequalities.

 

- AI Is Evolving to Process the World Like Humans

AI is beginning to develop itself. Researchers have created software that draws on concepts from Darwin's theory of evolution, including "survival of the fittest," to build AI programs that improve from generation to generation without human input. AI offers a wide range of technological capabilities that can be applied across all industries, profoundly changing the world around us.

As AI researchers work to develop and improve their machine learning and AI algorithms, the ultimate goal is to rebuild the human brain. The most perfect AI imaginable would be able to process the world around us through typical sensory input, while leveraging the storage and computing power of supercomputers. 

With this ultimate goal in mind, it is not hard to see the direction in which artificial intelligence will continue to evolve.

Deep learning AI is able to interpret patterns and draw conclusions. Essentially, it is learning how to mimic the way humans process the world around us. That said, today's AI still typically requires conventional computer input, such as encoded data. Developing AI that can process the world through audio, visual, and other sensory input is a daunting task.

 

- The Relationship Between AI, ML, DL, and Neural Networks

Both machine learning (ML) and deep learning (DL) are subsets of AI, but the terms are often used interchangeably. ML is the largest sub-field of AI, and virtually all AI-based products or services on the market would not be possible without ML or DL.

Both technologies were introduced decades ago, but only in the past few years have their applications come into widespread use. Some even argue that AI may be the last invention humans will ever need to make. Understanding these three terms -- AI, ML, and DL -- and how they relate to one another matters for everyone, from sales teams explaining the services they provide to data scientists deciding which type of model to use.

While AI, ML, and DL each have their own definitions, data requirements, levels of sophistication, transparency, and limitations, how they are defined and how they relate to each other depends largely on the context in which you view them.

  • Artificial Intelligence (AI): Imitating the intelligence or behavioral patterns of humans or other biological entities; machines mimic human intelligence through prediction, classification, learning, planning, reasoning, and/or perception.
  • Machine Learning (ML): A technique in which computers "learn" from data rather than following explicitly programmed rules. This approach is primarily based on training a model on a dataset. Machine learning is a subset of artificial intelligence that combines mathematics and statistics to learn from the data itself and improve with experience.
  • Deep Learning (DL): A technique for performing machine learning inspired by the brain's own network of neurons, using networks that can adapt to new data. Deep learning is a subset of ML that uses neural networks to solve increasingly complex challenges, such as image, audio, and video classification.
  • Neural Networks: A biology-inspired programming paradigm that enables computers to learn from observational data; deep learning is a powerful set of techniques for learning in such networks.

 

Machine learning, while widely considered a form of AI, aims to let machines learn from data rather than from explicit programming. Its typical use is to predict outcomes, much as we recognize a red octagonal sign with white letters and know to stop.

AI, on the other hand, can determine the best course of action for how to stop, when to stop, etc. Simply put, the difference is: machine learning predicts, artificial intelligence acts. 
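
As a minimal illustration of that distinction, the Python sketch below separates the prediction step from the decision step; the classifier is a hypothetical, hard-coded stand-in for a trained model:

    def classify_sign(image) -> str:
        """Stand-in for a trained ML model; in practice this would call model.predict(image)."""
        return "stop_sign"  # hard-coded prediction for illustration

    def decide_action(label: str, speed_kmh: float) -> str:
        """Decision logic layered on top of the prediction: the 'acting' part."""
        if label == "stop_sign" and speed_kmh > 0:
            return "brake and stop before the line"
        return "continue"

    prediction = classify_sign(image=None)              # ML: predict what is in the scene
    action = decide_action(prediction, speed_kmh=40)    # AI system: choose what to do
    print(prediction, "->", action)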

 

[Figure: AI vs ML vs DL vs Neural Networks]

- The Rise of Machine Learning (ML)

Machine learning (ML) is an interdisciplinary field that uses statistics, probability, and algorithms to learn from data and provide insights that can be used to build intelligent applications.

ML is the most prominent application of AI today. The technology is based on the idea that we should be able to give machines access to data and let them learn from it on their own.

ML is a technique that uses data to train software models. The model learns from training cases, and we can then use the trained model to make predictions on new data cases. 
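
As a minimal sketch of this train-then-predict workflow, the following Python example uses scikit-learn (assumed to be installed) and its built-in iris dataset; the model choice and parameters are illustrative, not a recommendation:

    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # Training cases: labeled examples the model learns from.
    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0)

    # Fit the model on the training data.
    model = LogisticRegression(max_iter=1000)
    model.fit(X_train, y_train)

    # Use the trained model to make predictions on new data cases.
    print("Predictions:", model.predict(X_test[:5]))
    print("Accuracy:", model.score(X_test, y_test))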

ML provides the foundation for AI. Two important breakthroughs have led to the emergence of machine learning, which is advancing AI at the current rate. 

One of them is the realization that instead of teaching a computer everything it needs to understand the world and how to perform tasks, it is better to teach it to teach itself. The second is the advent of the Internet and the enormous growth in the amount of digital information that is generated, stored, and available for analysis. 

Once these innovations were in place, engineers realized that instead of teaching computers and machines how to do everything, they could write code to make them think like humans, and then connect them to the internet, giving them access to all the information in the world. 

ML is concerned with the scientific study, design, analysis, and application of algorithms that learn concepts, predictive models, behaviors, and strategies of action from observation, reasoning, and experimentation, as well as the characterization of the precise conditions under which classes of concepts and behaviors can be learned.

Learning algorithms can also be used to model various aspects of human and animal learning. ML integrates and builds on advances in algorithms and data structures, statistical inference, information theory, signal processing, and insights gained from neural, behavioral, and cognitive sciences.

 

- Deep Learning (DL)

Deep learning (DL) uses artificial neural networks (ANNs) to perform complex computations on large amounts of data. It is a form of machine learning based on the structure and function of the human brain. DL algorithms train machines by learning from examples. Industries such as healthcare, e-commerce, entertainment, and advertising commonly use deep learning.

Deep learning algorithms learn their own representations, relying on artificial neural networks that mirror the way the brain computes information. During training, the algorithm uses unknown elements in the input distribution to extract features, group objects, and discover useful patterns in the data. Much like teaching a machine to learn on its own, this happens at multiple levels, with algorithms building progressively richer models.

DL models use a variety of algorithms. While no network is considered perfect, certain algorithms are better suited to perform specific tasks. To choose the right algorithm, it is best to have a solid understanding of all major algorithms. 

DL is a hot topic these days because it aims to simulate the human mind, and it is achieving results that were not possible before. In deep learning, computer models learn to perform classification tasks directly from images, text, or sound.

DL models can achieve state-of-the-art accuracy, sometimes exceeding human-level performance. A model is trained using a large amount of labeled data and a neural network architecture with many layers.

DL is basically ML on steroids that allows for more accurate processing of large amounts of data. Since it is more powerful, it also requires more computing power. Algorithms can determine on their own (without engineer intervention) whether predictions are accurate. 

For example, consider feeding an algorithm thousands of images and videos of cats and dogs. It can see if an animal has whiskers, claws or a furry tail, and uses learning to predict whether new data fed into the system is more likely to be a cat or a dog.
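
A minimal sketch of such a multi-layer model is shown below, using TensorFlow/Keras (assumed to be installed) for a binary cat-vs-dog image classifier; the input shape and layer sizes are illustrative, not tuned:

    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(128, 128, 3)),        # RGB images
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),    # probability of "dog" vs. "cat"
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.summary()

    # Training would use a large labeled dataset, for example:
    # model.fit(train_images, train_labels, epochs=10, validation_split=0.1)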

 

- Neural Networks

Neural networks are a family of algorithms that strive to identify potential relationships in a set of data by simulating the way the human brain works. In this sense, a neural network refers to a system of neurons, whether organic or artificial. Neural networks can adapt to changing inputs; thus the network can produce optimal results without redesigning the output criteria. 

Neural networks are a set of algorithms, loosely modeled on the human brain, designed to recognize patterns. They interpret sensory data through a kind of machine perception, labeling or clustering of raw input. The patterns they recognize are numerical and contained in vectors, and all real-world data, whether images, sounds, text, or time series, must be converted into vectors. 

Neural networks help us with clustering and classification. You can think of them as a layer of clustering and classification on top of the data you store and manage. They help to group unlabeled data based on similarity between example inputs and to classify data when trained on labeled datasets.

Neural networks can also extract features that are provided to other algorithms for clustering and classification; therefore, you can think of deep neural networks as components of larger ML applications involving reinforcement learning, classification, and regression algorithms.
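
As a minimal sketch of this idea, the Python example below (using scikit-learn, assumed installed, and its small digits dataset) trains a multi-layer network and then reuses its first hidden layer's activations as features for k-means clustering; the layer sizes are illustrative:

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.datasets import load_digits
    from sklearn.neural_network import MLPClassifier

    X, y = load_digits(return_X_y=True)   # 8x8 images flattened into 64-dimensional vectors

    # Train a small multi-layer network on the labeled data.
    net = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=300, random_state=0)
    net.fit(X, y)

    # Reuse the first hidden layer's activations (ReLU) as extracted features.
    hidden_features = np.maximum(0, X @ net.coefs_[0] + net.intercepts_[0])

    # Feed the extracted features to a separate clustering algorithm.
    clusters = KMeans(n_clusters=10, random_state=0, n_init=10).fit_predict(hidden_features)
    print("Cluster assignments for the first 10 digits:", clusters[:10])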

Neural networks and deep learning currently provide the best solutions for many problems in image recognition, speech recognition, and natural language processing.

 

- Is the AI Bubble Bursting?

Some say that the AI bubble is about to burst, while others say that the long-term prospects for AI are strong. 

The AI bubble is a term used to describe the potential for an unsustainable market perception of the value of AI companies, which may not be justified by the underlying economic reality. Some factors that could contribute to an AI bubble burst include:

  • Unsustainable valuations: AI companies that are overvalued may not have the earnings or growth potential to justify their high valuations.
  • Lack of profitable revenue streams: Many AI companies have not been able to show significant revenue increases from their AI investments.
  • Regulatory challenges: AI companies may face regulatory challenges.
  • Public distrust: The general public may be becoming more distrustful of AI.
  • Difficulty making money: Businesses may be finding it difficult to make money from AI.
  • Poor data quality: Some AI projects may fail due to poor data quality.
  • Inadequate risk controls: Some AI projects may fail due to inadequate risk controls.
  • Unclear business value: Some AI projects may fail due to unclear business value.
  • Escalating costs: Some AI projects may fail due to escalating costs.

 

 
