Single Layer Perceptron
P.Vishnu Ganesh,
Assistant Professor,
Department of AI&ML,
GMRIT.
Introduction
A Perceptron is a neural network unit that performs computations on the input
data to detect features.
It connects artificial neurons that act as simple logic gates with binary
outputs.
An artificial neuron computes a mathematical function; its node, inputs,
weights, and output correspond to the cell nucleus, dendrites, synapses,
and axon of a biological neuron, respectively.
Biological Neuron
Relationship between Biological and Artificial Neuron
Artificial Neuron
An artificial neuron is a mathematical function based on a
model of biological neurons, where each neuron takes
inputs, weighs them separately, sums them up and
passes this sum through a nonlinear function to produce
output.
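A minimal Python sketch of this computation, assuming a unit step function as the nonlinearity (the function and variable names here are illustrative, not from any particular library):

def step(z, threshold=0.0):
    # Unit step activation: fire (output 1) only when z reaches the threshold
    return 1 if z >= threshold else 0

def artificial_neuron(inputs, weights, bias=0.0, threshold=0.0):
    # Weigh each input, sum the weighted inputs together with the bias,
    # and pass the sum through the nonlinear step function
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return step(z, threshold)

# Example with two inputs
print(artificial_neuron([1, 0], weights=[1.2, 0.6], bias=0.2, threshold=1))  # prints 1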
Researchers Warren McCulloch and Walter Pitts published the first concept
of a simplified brain cell in 1943. This was called the
McCulloch-Pitts (MCP) neuron.
Perceptron
Perceptron was introduced by Frank Rosenblatt in 1957.
He proposed a Perceptron learning rule based on the
original MCP neuron.
A Perceptron is an algorithm for supervised learning of
binary classifiers.
This algorithm enables the neuron to learn by processing the elements of
the training set one at a time.
Perceptron
Training “AND” Gate using Perceptron
W1=1.2
W2=0.6
THRESHOLD=1
LEARNING RATE n=0.5
BIAS=0.2
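As a cross-check, a short Python sketch of the forward pass with these initial values, assuming the neuron outputs 1 when the weighted sum plus bias is at least the threshold; it reproduces the Epoch 1 rows below:

W1, W2, BIAS, THRESHOLD = 1.2, 0.6, 0.2, 1

def predict(x1, x2):
    # Weighted sum plus bias, thresholded to a binary output
    total = x1 * W1 + x2 * W2 + BIAS
    return total, 1 if total >= THRESHOLD else 0

for x1, x2, target in [(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 1)]:
    total, out = predict(x1, x2)
    status = "OPTIMIZE" if out != target else "NO CHANGE"
    print(x1, x2, round(total, 2), out, status)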
EPOCH 1
X1  X2  Ytar  W1   W2   Yest                                ERR  STATUS
0   0   0     1.2  0.6  (0*1.2 + 0*0.6) + 0.2 = 0.2 -> 0    0    NO CHANGE
0   1   0     1.2  0.6  (0*1.2 + 1*0.6) + 0.2 = 0.8 -> 0    0    NO CHANGE
1   0   0     1.2  0.6  (1*1.2 + 0*0.6) + 0.2 = 1.4 -> 1    1    OPTIMIZE
1   1   1     1.2  0.6  (1*1.2 + 1*0.6) + 0.2 = 2.0 -> 1    0    NO CHANGE
Perceptron with error
Weight Modification using Perceptron
W-new = W-old + n(T - O)Xi
For the misclassified sample X1 = 1, X2 = 0 (target T = 0, output O = 1):
W1 = 1.2 + 0.5(0 - 1)(1) = 0.7
W2 = 0.6 + 0.5(0 - 1)(0) = 0.6
So, the updated weights are W1 = 0.7, W2 = 0.6.
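A minimal Python sketch of this update, applied to the misclassified sample (1, 0) from Epoch 1; the helper name update_weights is illustrative:

def update_weights(weights, inputs, target, output, learning_rate=0.5):
    # Perceptron rule: W-new = W-old + n * (T - O) * Xi, applied per weight
    return [w + learning_rate * (target - output) * x
            for w, x in zip(weights, inputs)]

# Misclassified AND sample from Epoch 1: X = (1, 0), target T = 0, output O = 1
print(update_weights([1.2, 0.6], inputs=[1, 0], target=0, output=1))  # [0.7, 0.6]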
EPOCH 2
X1  X2  Ytar  W1   W2   Yest                                ERR  STATUS
0   0   0     0.7  0.6  (0*0.7 + 0*0.6) + 0.2 = 0.2 -> 0    0    NO CHANGE
0   1   0     0.7  0.6  (0*0.7 + 1*0.6) + 0.2 = 0.8 -> 0    0    NO CHANGE
1   0   0     0.7  0.6  (1*0.7 + 0*0.6) + 0.2 = 0.9 -> 0    0    NO CHANGE
1   1   1     0.7  0.6  (1*0.7 + 1*0.6) + 0.2 = 1.5 -> 1    0    NO CHANGE
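A quick check, under the same threshold rule assumed earlier, that the updated weights now reproduce the AND truth table, which is why Epoch 2 ends with no further changes:

W1, W2, BIAS, THRESHOLD = 0.7, 0.6, 0.2, 1

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    out = 1 if x1 * W1 + x2 * W2 + BIAS >= THRESHOLD else 0
    print(x1, x2, out)  # the output column matches x1 AND x2 for all four rows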
Perceptron with 2 inputs
Training “OR” Gate using Perceptron
W1=0.6
W2=0.6
THRESHOLD=1
LEARNING RATE n=0.5
BIAS=0.2
EPOCH 1
X1  X2  Ytar  W1   W2   Yest                                ERR  STATUS
0   0   0     0.6  0.6  (0*0.6 + 0*0.6) + 0.2 = 0.2 -> 0    0    NO CHANGE
0   1   1     0.6  0.6  (0*0.6 + 1*0.6) + 0.2 = 0.8 -> 0    1    OPTIMIZE
1   0   1     0.6  0.6  (1*0.6 + 0*0.6) + 0.2 = 0.8 -> 0    1    OPTIMIZE
1   1   1     0.6  0.6  (1*0.6 + 1*0.6) + 0.2 = 1.4 -> 1    0    NO CHANGE
Weight Modification using Perceptron
W-new = W-old + n(T - O)Xi
For the first misclassified sample X1 = 0, X2 = 1 (target T = 1, output O = 0):
W1 = 0.6 + 0.5(1 - 0)(0) = 0.6
W2 = 0.6 + 0.5(1 - 0)(1) = 1.1
So, the updated weights are W1 = 0.6, W2 = 1.1.
EPOCH 2
X1  X2  Ytar  W1   W2   Yest                                ERR  STATUS
0   0   0     0.6  1.1  (0*0.6 + 0*1.1) + 0.2 = 0.2 -> 0    0    NO CHANGE
0   1   1     0.6  1.1  (0*0.6 + 1*1.1) + 0.2 = 1.3 -> 1    0    NO CHANGE
1   0   1     0.6  1.1  (1*0.6 + 0*1.1) + 0.2 = 0.8 -> 0    1    OPTIMIZE
1   1   1     0.6  1.1  (1*0.6 + 1*1.1) + 0.2 = 1.9 -> 1    0    NO CHANGE
Weight Modification using Perceptron
W-new = W-old + n(T - O)Xi
For the misclassified sample X1 = 1, X2 = 0 (target T = 1, output O = 0):
W1 = 0.6 + 0.5(1 - 0)(1) = 1.1
W2 = 1.1 + 0.5(1 - 0)(0) = 1.1
So, the updated weights are W1 = 1.1, W2 = 1.1.
EPOCH 3
X1  X2  Ytar  W1   W2   Yest                                ERR  STATUS
0   0   0     1.1  1.1  (0*1.1 + 0*1.1) + 0.2 = 0.2 -> 0    0    NO CHANGE
0   1   1     1.1  1.1  (0*1.1 + 1*1.1) + 0.2 = 1.3 -> 1    0    NO CHANGE
1   0   1     1.1  1.1  (1*1.1 + 0*1.1) + 0.2 = 1.3 -> 1    0    NO CHANGE
1   1   1     1.1  1.1  (1*1.1 + 1*1.1) + 0.2 = 2.4 -> 1    0    NO CHANGE
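For completeness, a Python training-loop sketch that reproduces the OR trace above. One convention is assumed to match these slides: weights are held fixed for a whole epoch and then updated once using the first misclassified sample; a textbook perceptron would instead update immediately after every misclassified sample.

def train(samples, w, bias=0.2, threshold=1, lr=0.5, max_epochs=10):
    # Two-input perceptron trained with one weight update per epoch (slide convention)
    for epoch in range(1, max_epochs + 1):
        first_error = None
        for x1, x2, target in samples:
            out = 1 if x1 * w[0] + x2 * w[1] + bias >= threshold else 0
            if out != target and first_error is None:
                first_error = (x1, x2, target, out)
        if first_error is None:
            return w, epoch                    # a clean epoch: training has converged
        x1, x2, target, out = first_error
        w = [round(w[0] + lr * (target - out) * x1, 4),   # rounding only keeps the
             round(w[1] + lr * (target - out) * x2, 4)]   # printed weights tidy
    return w, max_epochs

OR_SAMPLES = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 1)]
print(train(OR_SAMPLES, [0.6, 0.6]))  # ([1.1, 1.1], 3): converged in Epoch 3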
THANK YOU