NEURAL NETWORK OVERVIEW

In this section, I will share my notes on learning neural networks with TensorFlow and Keras. We start with how a neural network is formed and its basic mechanism, and after that we will learn how to install TensorFlow and Keras on your laptop. I will build all of the tutorials on my MacBook Pro, which runs macOS Monterey version 12.3.1.


INTRODUCTION

The human brain is a complex system made up of billions of neurons, and it reveals new mysteries with each new discovery. Efforts to mimic the structure and function of the human brain gave rise to a new field of study known as Deep Learning. Artificial Neural Networks, also known simply as Neural Networks, are a component of Artificial Intelligence inspired by the neural networks of the human brain. With hundreds of applications in everyday life, the field has grown at an exponential rate in recent years. Its applications range from spell check to machine translation to facial recognition, and they can be found everywhere in the real world.


ARTIFICIAL NEURAL NETWORK VS. BIOLOGICAL NEURAL NETWORK (STRUCTURE)

The structure of artificial neural networks is similar to that of biological neural networks and is intended to mimic the neural networks of the human brain. The human brain is a network of billions of densely connected neurons that is extremely complex, nonlinear, and contains trillions of synapses. A biological neuron is made up of dendrites, an axon, a cell body (soma), synapses, and a nucleus. Dendrites are in charge of receiving input from other neurons, while axons are in charge of transmitting information from one neuron to another. Electrochemical signaling underpins the molecular and biological machinery of neural networks; neurons fire electrical impulses only when specific conditions are met. Some of the brain's neural structure is present at birth, while other parts develop through learning, particularly in the early stages of life, to adapt to the environment (new inputs).

From the figure above: Artificial Neural Networks are composed of layers upon layers of connected input and output units known as neurons. A perceptron is a single-layer neural network, while an artificial neural network may contain multiple hidden layers. An artificial neuron is made up of input units (receptors), connection weights, a summing function, a computation (activation), and output units (effectors). Although biological neurons are slower than silicon logic gates, their massive interconnection compensates for the slower rate. A connection's weight value represents the strength of the connection between two neurons. To map aggregations of input stimuli to a desired output function, weights are randomly initialized and then adjusted using an optimization algorithm.
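As a rough sketch of this idea (plain NumPy, not tied to any particular library), a single artificial neuron can be written as a weighted sum of its inputs passed through an activation function. The input values, the random weight initialization, and the choice of a sigmoid activation below are illustrative assumptions, not a fixed recipe.

    import numpy as np

    def sigmoid(z):
        # Squashes the weighted sum into the range (0, 1)
        return 1.0 / (1.0 + np.exp(-z))

    # Three illustrative input signals (e.g. features of one example)
    inputs = np.array([0.5, -1.2, 3.0])

    # Weights are typically initialized at random and later adjusted
    # by an optimization algorithm during training
    rng = np.random.default_rng(seed=0)
    weights = rng.normal(size=3)
    bias = 0.0

    # Summing function: weighted sum of the inputs plus a bias
    z = np.dot(weights, inputs) + bias

    # Computation: apply the activation function to produce the output
    output = sigmoid(z)
    print(output)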

The Perceptron, Feed-Forward Neural Network, Multilayer Perceptron, Convolutional Neural Network, Radial Basis Function Neural Network, Recurrent Neural Network, Long Short-Term Memory (LSTM), Sequence-to-Sequence models, and Modular Neural Networks are all examples of neural network architectures; a sketch of one of them follows below. In addition, the learning algorithms can be supervised, unsupervised, or reinforcement methods.
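To make the multilayer perceptron concrete, here is a minimal sketch of how such a network might be defined with the Keras Sequential API (installation and details come later in these notes). The layer sizes, the four input features, and the binary-classification setup are assumptions made purely for illustration.

    from tensorflow import keras

    # A small multilayer perceptron: an input layer, two hidden layers,
    # and a single sigmoid output unit for binary classification
    model = keras.Sequential([
        keras.Input(shape=(4,)),                      # four input features (assumed)
        keras.layers.Dense(8, activation="relu"),     # first hidden layer
        keras.layers.Dense(8, activation="relu"),     # second hidden layer
        keras.layers.Dense(1, activation="sigmoid"),  # output layer
    ])

    # Supervised learning setup: the optimizer adjusts the weights to
    # minimize the loss on labelled examples
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    model.summary()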


BIOLOGICAL NEURAL NETWORK

Dendrites are nerve fibers that carry electrical signals into the cell body. The cell body then computes a nonlinear function of these inputs, and the cell fires only when it has received a sufficient amount of input. The function's output is sent to other neurons via the axon, a single nerve fiber leaving the cell. The signal is received by a dendrite of another neuron, and the process continues. A synapse is the point of contact between a neuron's axon and the dendrite of another neuron; it controls a chemical connection whose strength (weight) influences the receiving cell's input. Because each neuron has several dendrites, it receives input from a large number of neurons, while its single axon passes the output along as input to the dendrites of many other neurons. Because of this large number of synapses, biological neural networks can tolerate ambiguity in the data.


ARTIFICIAL NEURAL NETWORK

Artificial Neural Networks function in a similar way to their biological counterparts. From the figure below: they can be thought of as weighted directed graphs, with neurons acting as nodes and connections between neurons acting as weighted edges. A neuron's processing element receives many signals (both from other neurons and as input signals from the external world). Signals may be modified at the receiving synapse, and the weighted inputs are summed at the processing element. If the sum crosses a threshold, the result becomes input to other neurons (or output to the outside world), and the cycle begins again.
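The sketch below (plain NumPy, with made-up weights and a simple step threshold) illustrates this flow for a tiny two-layer network: weighted inputs are summed at each node, compared against a threshold, and the resulting outputs become the inputs of the next layer.

    import numpy as np

    def step(z, threshold=0.0):
        # Fire (output 1) only when the weighted sum crosses the threshold
        return (z > threshold).astype(float)

    # External input signals to the network (assumed values)
    x = np.array([1.0, 0.5, -0.3])

    # Weighted edges of the directed graph, one matrix per layer
    # (values here are arbitrary, for illustration only)
    W1 = np.array([[0.2, -0.5, 0.8],
                   [1.0,  0.3, -0.7]])   # 3 inputs -> 2 hidden neurons
    W2 = np.array([[0.6, -1.1]])         # 2 hidden neurons -> 1 output neuron

    # Each layer sums its weighted inputs and applies the threshold;
    # the outputs then serve as inputs to the next layer
    hidden = step(W1 @ x)
    output = step(W2 @ hidden)
    print(hidden, output)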

The weights usually represent the strength of the interconnection between neurons. From the figure below: the activation function is a transfer function used to obtain the desired output for the problem at hand. In the case of a binary classifier, say the desired output is zero or one; the activation function could then be the sigmoid function. The identity (linear) function, the binary and bipolar sigmoid functions, the hyperbolic tangent, and ReLU are just a few of the activation functions available. Through a learning process, an artificial neural network is tailored to a specific task such as binary classification, multi-class classification, or pattern recognition. In both biological and artificial networks, learning adjusts the weights of the synaptic connections.
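As a quick illustrative sketch (plain NumPy, not tied to any particular framework), a few of these activation functions can be written directly; the sample input values are arbitrary.

    import numpy as np

    def sigmoid(z):
        # Binary sigmoid: output in (0, 1), useful for binary classification
        return 1.0 / (1.0 + np.exp(-z))

    def tanh(z):
        # Hyperbolic tangent: a bipolar sigmoid with output in (-1, 1)
        return np.tanh(z)

    def relu(z):
        # ReLU: passes positive values through, clips negatives to zero
        return np.maximum(0.0, z)

    z = np.array([-2.0, 0.0, 2.0])
    print(sigmoid(z), tanh(z), relu(z))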

An artificial neural network typically contains far fewer neurons than the roughly 86 billion neurons and trillions of synapses of the biological neural network. When comparing their abilities, there are a number of other factors to consider, such as the number of layers of neurons and the data. Compared to biological neural networks, artificial neural networks require a lot of computational power and dissipate a lot of heat. Artificial neural networks continue to improve as more of the biological neural network's mysteries are revealed, and each discovery seems to uncover yet more of them. Even though the artificial neural network is inspired by the biological one, the functions it performs are more mathematical than biological.