PROBLEM DESCRIPTION: Calculate the output of a simple neuron
Contents
● Define neuron parameters
● Define input vector
● Calculate neuron output
● Plot neuron output over the range of inputs
Define neuron parameters
close all, clear all, clc, format compact
% Neuron weights
w = [4 -2]
% Neuron bias
b = -3
% Activation function
func = 'tansig'
% func = 'purelin'
% func = 'hardlim'
% func = 'logsig'
w =
4 -2
b =
-3
func =
tansig
Define input vector
p = [2 3]
p =
2 3
Calculate neuron output
activation_potential = p*w'+b
neuron_output = feval(func, activation_potential)
activation_potential =
-1
neuron_output =
-0.7616
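As a cross-check, the same result can be reproduced without the toolbox call, since tansig is mathematically the hyperbolic tangent (a minimal sketch; variable names n and a are only illustrative):
% cross-check without the toolbox: tansig(n) = tanh(n)
n = p*w' + b        % 2*4 + 3*(-2) - 3 = -1
a = tanh(n)         % tanh(-1) = -0.7616, same as feval('tansig',n)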
Plot neuron output over the range of inputs
[p1,p2] = meshgrid(-10:.25:10);
z = feval(func, [p1(:) p2(:)]*w'+b );
z = reshape(z,length(p1),length(p2));
plot3(p1,p2,z)
grid on
xlabel('Input 1')
ylabel('Input 2')
zlabel('Neuron output')
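The commented-out alternatives listed with the neuron parameters (purelin, hardlim, logsig) can be compared on one input range. The sketch below is optional and assumes the toolbox transfer functions are on the path:
% compare the available transfer functions over a range of activation potentials
n = -5:.1:5;
figure
plot(n, purelin(n), n, hardlim(n), n, logsig(n), n, tansig(n))
legend('purelin','hardlim','logsig','tansig','location','best')
xlabel('Activation potential'), ylabel('Neuron output'), grid on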
PROBLEM DESCRIPTION: Create and view custom neural networks
Contents
● Define one sample: inputs and outputs
● Define a custom network
● Define topology and transfer function
● Configure network
● Train net and calculate neuron output
Define one sample: inputs and outputs
close all, clear all, clc, format compact
inputs = [1:6]' % input vector (6-dimensional pattern)
outputs = [1 2]' % corresponding target output vector
inputs =
1
2
3
4
5
6
outputs =
1
2
Define a custom network
% create network
net = network( ...
1, ... % numInputs, number of inputs,
2, ... % numLayers, number of layers
[1; 0], ... % biasConnect, numLayers-by-1 Boolean vector,
[1; 0], ... % inputConnect, numLayers-by-numInputs Boolean matrix,
[0 0; 1 0], ... % layerConnect, numLayers-by-numLayers Boolean matrix
[0 1] ... % outputConnect, 1-by-numLayers Boolean vector
);
% view network structure
view(net);
Define topology and transfer function
% number of hidden layer neurons
net.layers{1}.size = 5;
% hidden layer transfer function
net.layers{1}.transferFcn = 'logsig';
view(net);
Configure network
net = configure(net,inputs,outputs);
view(net);
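A quick sanity check (a sketch; these are standard network object properties) confirms that configure has picked up the layer sizes from the data:
% sanity check: configure should have set the sizes from the data
net.inputs{1}.size   % 6, the dimension of the input pattern
net.layers{2}.size   % 2, the dimension of the target vector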
Train net and calculate neuron output
% initial network response without training
initial_output = net(inputs)
% network training
net.trainFcn = 'trainlm';
net.performFcn = 'mse';
net = train(net,inputs,outputs);
% network response after training
final_output = net(inputs)
initial_output =
0
0
final_output =
1.0000
2.0000
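The fit can also be quantified with the configured performance function. A minimal sketch using the toolbox perform function (the variable name training_error is only illustrative):
% quantify the fit with the configured performance function (mse)
training_error = perform(net, outputs, final_output)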
PROBLEM DESCRIPTION: Four clusters of data (A, B, C, D) are defined in a 2-dimensional input space. The (A,C) and (B,D) cluster pairs represent the XOR classification problem. The task is to define a neural network that solves the XOR problem.
Contents
● Define 4 clusters of input data
● Define output coding for XOR problem
● Prepare inputs & outputs for network training
● Create and train a multilayer perceptron
● Plot targets and network response to see how well the network learns the data
● Plot classification result for the complete input space
Define 4 clusters of input data
close all, clear all, clc, format compact
% number of samples of each class
K = 100;
% define 4 clusters of input data
q = .6; % offset of classes
A = [rand(1,K)-q; rand(1,K)+q];
B = [rand(1,K)+q; rand(1,K)+q];
C = [rand(1,K)+q; rand(1,K)-q];
D = [rand(1,K)-q; rand(1,K)-q];
% plot clusters
figure(1)
plot(A(1,:),A(2,:),'k+')
hold on
grid on
plot(B(1,:),B(2,:),'bd')
plot(C(1,:),C(2,:),'k+')
plot(D(1,:),D(2,:),'bd')
% text labels for clusters
text(.5-q,.5+2*q,'Class A')
text(.5+q,.5+2*q,'Class B')
text(.5+q,.5-2*q,'Class A')
text(.5-q,.5-2*q,'Class B')
Define output coding for XOR problem
% encode clusters a and c as one class, and b and d as another class
a = -1; % a | b
c = -1; % -------
b = 1; % d | c
d = 1; %
Prepare inputs & outputs for network training
% define inputs (combine samples from all four classes)
P = [A B C D];
% define targets
T = [ repmat(a,1,length(A)) repmat(b,1,length(B)) ...
      repmat(c,1,length(C)) repmat(d,1,length(D)) ];
% view inputs | outputs
%[P' T']
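A quick dimension check (a sketch; the expected sizes follow from K = 100 samples per cluster) confirms that inputs and targets line up:
% sanity check on dimensions (4 clusters * K samples each)
size(P)   % 2 x 400: two input coordinates per sample
size(T)   % 1 x 400: one target value per sample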
Create and train a multilayer perceptron
% create a neural network
net = feedforwardnet([5 3]);
% train net
net.divideParam.trainRatio = 1; % training set [%]
net.divideParam.valRatio = 0; % validation set [%]
net.divideParam.testRatio = 0; % test set [%]
% train a neural network
[net,tr,Y,E] = train(net,P,T);
% show network
view(net)
Plot targets and network response to see how well the network learns the data
figure(2)
plot(T','linewidth',2)
hold on
plot(Y','r--')
grid on
legend('Targets','Network response','location','best')
ylim([-1.25 1.25])
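A simple accuracy figure can be obtained by thresholding the network output at zero (a sketch; the variable names predicted and accuracy are only illustrative):
% classification accuracy on the training set
% (sign of the output recovers the -1/+1 class coding)
predicted = sign(Y);
accuracy = mean(predicted == T)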
Plot classification result for the complete input space
% generate a grid
span = -1:.005:2;
[P1,P2] = meshgrid(span,span);
pp = [P1(:) P2(:)]';
% simulate neural network on a grid
aa = net(pp);
% translate output into [-1,1]
%aa = -1 + 2*(aa>0);
% plot classification regions
figure(1)
mesh(P1,P2,reshape(aa,length(span),length(span))-5);
colormap cool
view(2)
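The decision boundary itself can also be drawn by contouring the network output at level zero (a sketch reusing the grid and clusters defined above; figure(3) is a new window):
% draw the decision boundary: network output equal to zero
figure(3)
contour(P1,P2,reshape(aa,length(span),length(span)),[0 0],'k','linewidth',2)
hold on
plot(A(1,:),A(2,:),'k+',B(1,:),B(2,:),'bd')
plot(C(1,:),C(2,:),'k+',D(1,:),D(2,:),'bd')
grid on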