
Neural Networks in the Brain and AI

 
 

- Overview

Neural networks are a type of artificial intelligence (AI) modeled on the structure of the human brain. They are made up of artificial neurons, or nodes, that are connected in layers and pass data to one another. Neural networks are used in many AI applications, including speech recognition, image recognition, and medical image analysis.

Neural networks learn by analyzing training examples, such as labeled images. They use the training data to improve their accuracy over time.

Neural networks are typically organized into three layers:

  • Input layer: Receives data from the outside world, such as images, sound, or text
  • Hidden layers: Analyze the data from the input layer and perform complex processes like pattern recognition
  • Output layer: Provides the final result of the data processing


Each connection between nodes has a weight that determines how much influence one node has on another.
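
As a rough sketch of how these layers and weights fit together (using NumPy, with made-up layer sizes and random weights; not any specific library's API), a tiny feedforward pass might look like this:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up layer sizes: 4 input nodes, 5 hidden nodes, 2 output nodes.
n_in, n_hidden, n_out = 4, 5, 2

# Every connection between two nodes has a weight; a whole layer's
# weights fit in one matrix (rows: source nodes, columns: target nodes).
W1 = rng.normal(size=(n_in, n_hidden))
W2 = rng.normal(size=(n_hidden, n_out))

def forward(x):
    """Pass data from the input layer through the hidden layer to the output layer."""
    hidden = np.tanh(x @ W1)  # hidden layer: weighted sums plus a nonlinearity
    return hidden @ W2        # output layer: the final result

x = rng.normal(size=n_in)     # stand-in for outside-world data (image, sound, text)
print(forward(x))             # two output values
```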

Practical applications include voice assistants, image and facial recognition, online translation services, and search engines.

Researchers in neuroscience use neural networks to model how the brain performs tasks. They hope that these models can help them understand how the brain works.

 

- The Brain's Neural Network vs. Artificial Neural Networks

A neural network is a network that receives input, processes it, and produces an output while learning from the data it acquires. 

The brain's neural network is made up of billions of nerve cells called neurons. Neurons communicate using electrical signals. Dendrites are tree-like structures on neurons that receive signals from other neurons and pass them through the cell body. Signals are then transmitted along axons to other neurons. This process plays out across the billions of neurons in the brain, forming a massive system that ultimately gives rise to thoughts, emotions, and desires.

In contrast, neural networks in artificial intelligence use simplified mathematical models that simulate only certain parts of neurons, such as dendrites, cell bodies, or axons. An artificial network is more specialized for specific tasks because it cannot create or destroy connections between neurons, and it ignores signal timing.

 
 

- Size

Our brain consists of about 86 billion neurons, while the number of "neurons" in an AI network is typically on the order of 10 to 1,000. On its face that is a substantial gap, but artificial neurons are more powerful in certain respects.

Perceptrons, the predecessors of artificial neurons, work in a linear fashion: they take inputs on their "dendrites" and generate outputs on their "axon branches." Several perceptrons sit in a single layer of a perceptron network but are not interconnected.

Deep neural networks, in contrast, consist of input neurons, output neurons, and hidden-layer neurons in between. Each layer is usually fully connected to the next, meaning that an artificial neuron typically has as many connections as there are neurons in the preceding and following layers combined.
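
The perceptron's linear weighted-sum behavior is easy to illustrate. The following is a minimal sketch; the weights and bias are made-up values chosen so that the unit implements a logical AND:

```python
import numpy as np

def perceptron(x, w, b):
    """Weighted sum of the inputs ('dendrites'), thresholded
    to a binary output ('axon branch')."""
    return 1 if np.dot(w, x) + b > 0 else 0

# Made-up weights and bias that happen to implement a logical AND.
w, b = np.array([1.0, 1.0]), -1.5
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, "->", perceptron(np.array(x), w, b))
```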

 

- Speed

Biological neurons usually fire about 200 times a second, and their signals travel at speeds that depend on the type of nerve impulse, ranging from 0.6 m/s up to 120 m/s. In artificial neurons, information is instead carried by continuous floating-point values: the synaptic weights, which express the strength or amplitude of the connection between two nodes.

Raw computation speed carries no information in itself; it only makes a model's execution and training faster. Artificial neurons also do not experience 'fatigue.' An artificial neural network can be understood as a collection of matrix operations and derivative calculations, and these can be highly optimized for vector processors and accelerated using GPUs (graphics processing units) or dedicated hardware.
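
To illustrate why this matters, the following NumPy sketch (with arbitrary batch and layer sizes) pushes a whole batch of inputs through one fully connected layer as a single matrix multiplication, exactly the kind of operation that vector processors and GPUs accelerate:

```python
import time
import numpy as np

rng = np.random.default_rng(1)

# Arbitrary sizes: a batch of 1,024 inputs through a 512-to-512 layer.
batch, n_in, n_out = 1024, 512, 512
X = rng.normal(size=(batch, n_in))
W = rng.normal(size=(n_in, n_out))

start = time.perf_counter()
H = np.maximum(X @ W, 0.0)  # all weighted sums at once, plus a ReLU
elapsed = time.perf_counter() - start
print(f"{batch} examples through a {n_in}x{n_out} layer in {elapsed:.4f} s")
```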

 


- Fault Tolerance

Biological neural networks are fault-tolerant by virtue of their topology: information is stored redundantly, so minor failures do not result in memory loss, and the brain can recover and heal to some extent. Artificial neural networks are not designed for fault tolerance or self-regeneration, even when they are distributed across asynchronous computing nodes.

 

- Learning

It is still a mystery how the brain learns and how redundant connections store and retrieve information. Fibers in the brain grow and reach out to connect to other neurons, neuroplasticity allows new connections to form and brain areas to shift and change function, and synapses are strengthened or weakened according to their importance. When we learn, we build on information that is already stored in the brain.

Our knowledge deepens through repetition and sleep, and tasks that once required focused attention can be performed automatically once mastered. Artificial neural networks, on the other hand, have a predefined model in which no further neurons or connections can be added or removed.

During training, only the weights of the connections can change. Networks begin with random weight values and attempt to reach a point where further weight changes would no longer improve performance.
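
A minimal sketch of that idea, using a toy one-weight model and made-up data (gradient descent on a mean squared error; not any specific library's training loop):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy training data: noisy examples of y = 3 * x (a made-up task).
x = rng.normal(size=100)
y = 3.0 * x + 0.1 * rng.normal(size=100)

w = rng.normal()   # training starts from a random weight value
lr = 0.1           # step size for each weight change

for step in range(200):
    pred = w * x                          # forward pass with the current weight
    grad = np.mean(2.0 * (pred - y) * x)  # derivative of the mean squared error
    w -= lr * grad                        # adjust the weight, not the structure

print(f"learned weight: {w:.3f} (true value: 3.0)")
```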

 

 

[More to come ...]
 