Neural Network
• A neural network is a computational model inspired by the structure and function of the
human brain
• Neural networks consist of interconnected layers of artificial neurons that process data
through weighted connections and non-linear activation functions. Models, loss functions,
and optimizers work together to train the network to accurately learn patterns from data.
Components of a Neural Network
1. Neurons / Perceptron
• Artificial neural networks are modeled on natural biological systems.
• Artificial neurons in deep learning models mirror the parts of a biological neuron:
• Inputs - Dendrites
• Weights/Summing - Cell body
• Activation function - Axon
• Output - Synapse
• Each neuron receives weighted inputs, applies an activation function, and passes output to the
next layer.
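A single artificial neuron can be sketched in a few lines of plain Python. This is a minimal illustration, not a library implementation; the weights and bias values below are arbitrary and chosen only for the example.

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: weighted sum of inputs plus bias,
    passed through a sigmoid activation."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid squashes z into (0, 1)

# Three inputs with illustrative weights and a bias
out = neuron([1.0, 2.0, 3.0], [0.2, -0.5, 0.1], bias=0.4)
print(round(out, 4))
```

In a real network this computation is vectorized across many neurons at once, but the arithmetic per neuron is exactly this: multiply, sum, add bias, activate.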
2. Weights and Biases:
Weights:
Parameters that determine the importance of each input to a neuron.
Adjusted during training to minimize the loss function.
Bias:
A parameter added to the weighted sum of inputs before applying the activation function.
Helps the model to better fit the data.
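The role of the bias can be seen with a toy threshold neuron (a sketch, not part of any library): with weight 1 and no bias the neuron fires for any non-negative input, while a bias of -1.5 shifts that threshold to 1.5.

```python
def step(z):
    # Step activation: fire (1) when z >= 0, stay silent (0) otherwise
    return 1 if z >= 0 else 0

def fire(x, w, b):
    # Weighted input plus bias, then the step activation
    return step(x * w + b)

# With w = 1 and b = -1.5, the neuron only fires for x >= 1.5
print([fire(x, 1.0, -1.5) for x in [0, 1, 2, 3]])  # [0, 0, 1, 1]
```

Training adjusts both the weights and the bias, so the model can place decision boundaries wherever the data requires rather than forcing them through the origin.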
3. Activation Functions:
Introduce non-linearity into the network, allowing it to learn complex patterns.
Common functions include ReLU (Rectified Linear Unit), Sigmoid, Tanh, and Softmax.
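The four common activation functions above can be written directly from their standard definitions, using only Python's `math` module:

```python
import math

def relu(z):
    # ReLU: zero for negative inputs, identity for positive inputs
    return max(0.0, z)

def sigmoid(z):
    # Sigmoid: squashes any real number into (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def tanh(z):
    # Tanh: squashes any real number into (-1, 1)
    return math.tanh(z)

def softmax(zs):
    # Softmax: turns a vector of scores into probabilities that sum to 1
    m = max(zs)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in zs]
    total = sum(exps)
    return [e / total for e in exps]

print(relu(-2.0), relu(3.0))          # 0.0 3.0
print(round(sigmoid(0.0), 2))         # 0.5
print(round(sum(softmax([1.0, 2.0, 3.0])), 2))  # 1.0
```

ReLU is the usual default for hidden layers; sigmoid and softmax are typically reserved for output layers in binary and multi-class classification, respectively.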
Architecture of a Neural Network
• Input values are passed through the network's hidden layers until they reach the output layer.
• The hidden layers of a neural net transform the data step by step, gradually capturing its relationship with the target variable.
• Each connection carries a weight; a neuron multiplies each of its inputs by the corresponding weight and sums the results. This happens at every layer.
• The weighted sum, plus a bias term, is passed as input to the activation function.
• The activation function takes this biased weighted sum and decides whether, and how strongly, the neuron should fire.
• The output layer produces the predicted output.
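The steps above can be sketched as a forward pass through a tiny network with 2 inputs, one hidden layer of 3 neurons, and 1 output neuron. All weights and biases here are arbitrary illustrative values; in practice they would be learned during training.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def layer(inputs, weights, biases):
    """One layer: for each neuron, weighted sum of inputs plus bias,
    then the activation function."""
    return [sigmoid(sum(x * w for x, w in zip(inputs, ws)) + b)
            for ws, b in zip(weights, biases)]

x = [0.5, -1.0]                                 # input values
W1 = [[0.1, 0.4], [-0.3, 0.2], [0.5, -0.6]]     # hidden layer: 3 neurons, 2 weights each
b1 = [0.0, 0.1, -0.2]
W2 = [[0.7, -0.5, 0.2]]                         # output layer: 1 neuron, 3 weights
b2 = [0.05]

hidden = layer(x, W1, b1)    # inputs flow into the hidden layer...
output = layer(hidden, W2, b2)  # ...and the hidden activations into the output layer
print(output)
```

Each layer repeats the same neuron computation on the previous layer's outputs, which is all a feedforward pass is.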
Types of Neural Networks:
• Single-layer networks (Perceptron): Simple models where inputs are directly
connected to the output with weights.
• Multi-layer networks (Feedforward Neural Networks): More complex architectures
involving hidden layers between input and output.
Logical Computations with Neurons
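A classic demonstration is that a single threshold neuron with suitable weights and bias can compute basic logic gates. The sketch below implements AND and OR with the same perceptron function, differing only in the bias:

```python
def perceptron(inputs, weights, bias):
    """Threshold unit: fires (1) when the weighted sum plus bias is >= 0."""
    return 1 if sum(x * w for x, w in zip(inputs, weights)) + bias >= 0 else 0

# AND: fires only when both inputs are 1 (sum must reach 2 to beat bias -1.5)
AND = lambda a, b: perceptron([a, b], [1, 1], bias=-1.5)
# OR: fires when at least one input is 1 (sum of 1 beats bias -0.5)
OR = lambda a, b: perceptron([a, b], [1, 1], bias=-0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, AND(a, b), OR(a, b))
```

XOR, famously, cannot be computed by any single-layer perceptron; it requires a hidden layer, which is one motivation for multi-layer networks.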
Challenges of Neural Networks:
• Overfitting
• Underfitting
• Vanishing/Exploding Gradients
• Hyperparameter Tuning
• High Computational Cost
• Large Data Requirements
• Lack of Interpretability
• Optimization Issues (Local Minima/Saddle Points)
• Bias in Data
• Poor Generalization