Lecture13 ANFIS

The document provides an overview of Adaptive Neuro-Fuzzy Inference Systems (ANFIS). It discusses ANFIS architectures based on both Mamdani and Sugeno fuzzy inference systems. For Mamdani-based ANFIS, it describes the 5 layers - input, fuzzification, rule, output membership, and defuzzification layers. For Sugeno-based ANFIS, it outlines the layers including fuzzification, rule, normalization, and defuzzification. ANFIS allows fuzzy systems to learn from data using a hybrid learning procedure, combining least-squares and backpropagation. This enables automatic fuzzy rule generation and tuning of membership functions from numerical data.

ICS 581: Advanced Artificial Intelligence

Lecture 13

Adaptive Neuro-Fuzzy
Inference Systems (ANFIS)

Dr. Emad A. A. El-Sebakhy

Term 061
Meeting Time: 6:30 -7:45
Location: Building 22, Room 132
Fuzzy Sets

Sets with fuzzy boundaries

A = Set of tall people

[Figure: membership degree vs. height (cm). The crisp set A jumps from 0 to 1 at 170 cm; the fuzzy set A rises gradually, e.g. reaching degree .5 at 170 cm and .9 at 180 cm.]
2020-02-27 2
Membership Functions (MFs)

About MFs
 Subjective measures
 Not probability functions

[Figure: three membership functions for "tall" plotted against height (cm). At 180 cm, "tall" in Taiwan has degree .8, "tall" in the US degree .5, and "tall" in the NBA degree .1.]
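The subjectivity of MFs can be made concrete with a short sketch (in Python, for brevity). The sigmoid shape, centers, and width below are illustrative assumptions, not values from the slide; they are chosen so a 180 cm person is clearly tall in Taiwan, borderline tall in the US, and far from tall by NBA standards.

```python
import math

def tall(height_cm, center, width):
    """Sigmoid membership function: degree rises smoothly around `center`."""
    return 1.0 / (1.0 + math.exp(-(height_cm - center) / width))

# Hypothetical centers for three populations (assumed, not from the lecture)
contexts = {"Taiwan": 170, "US": 180, "NBA": 200}
h = 180
degrees = {name: tall(h, c, 5.0) for name, c in contexts.items()}
# The same 180 cm person gets a different membership degree under each MF.
```

The point is that all three functions are legitimate MFs for "tall"; which one is right depends on the population being modeled, not on probability.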

Fuzzy If-Then Rules

• Mamdani style
If pressure is high then volume is small
[Figure: antecedent MF "high" and consequent MF "small"]

• Sugeno style
If speed is medium then resistance = 5*speed

[Figure: antecedent MF "medium"; the consequent is the crisp function resistance = 5*speed]

Fuzzy Inference System (FIS)

If speed is low then resistance = 2


If speed is medium then resistance = 4*speed
If speed is high then resistance = 8*speed

[Figure: MFs low, medium, high over speed; at speed = 2 the membership degrees are .3 (low), .8 (medium), and .1 (high).]

Rule 1: w1 = .3; r1 = 2
Rule 2: w2 = .8; r2 = 4*2 = 8
Rule 3: w3 = .1; r3 = 8*2 = 16

Resistance = Σ(wi*ri) / Σwi = (.3*2 + .8*8 + .1*16) / (.3 + .8 + .1) ≈ 7.17
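The weighted-average defuzzification on this slide can be checked directly; a minimal sketch in Python:

```python
# Three-rule Sugeno-style FIS evaluated at speed = 2.
# Firing strengths w_i are read off the MF plot, as on the slide.
speed = 2
rules = [
    (0.3, 2),          # Rule 1: low    -> resistance = 2
    (0.8, 4 * speed),  # Rule 2: medium -> resistance = 4*speed
    (0.1, 8 * speed),  # Rule 3: high   -> resistance = 8*speed
]
# Resistance = sum(wi * ri) / sum(wi)
resistance = sum(w * r for w, r in rules) / sum(w for w, _ in rules)
# (0.3*2 + 0.8*8 + 0.1*16) / 1.2 = 8.6 / 1.2
```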

ANFIS: Mamdani’s model
 Layer 1: input layer
 Layer 2: input membership (fuzzification) layer
  Neurons represent the fuzzy sets used in the rule antecedents and determine the membership degree of the input.
  Activation fn: the membership fn.
 Layer 3: fuzzy rule layer
  Each neuron corresponds to a single fuzzy rule.
  Conjunction of the rule antecedents: product.
  Output: the firing strength of fuzzy rule Ri, where Ri = Aj × Bk.
  The weights between layer 3 and layer 4: the normalized degrees of confidence (a.k.a. certainty factors) of the corresponding fuzzy rules. They are adjusted during training.
 Layer 4: output membership layer
  Disjunction of the rule outputs, implemented as a sum: Ci = Rj ∨ Rk = Rj + Rk, the integrated firing strength of the fuzzy rule neurons Rj and Rk.
  Activation fn: the output membership fn.
 Layer 5: defuzzification layer
  Each neuron represents a single output.
  E.g., the centroid method.
ANFIS: Mamdani’s model

 Learning
  Various learning algorithms may be applied, e.g. backpropagation.
  Adjustment of weights and modification of the input/output membership functions.
 When sum-product composition and centroid defuzzification are adopted, a corresponding ANFIS can be constructed easily.
 Extra complexity with max-min composition brings no better learning capability or approximation power.
 More complicated than Sugeno ANFIS or Tsukamoto ANFIS.
ANFIS: Mamdani’s model

[Figure: Mamdani ANFIS network architecture]
First-Order Sugeno FIS

• Rule base
If X is A1 and Y is B1 then Z = p1*x + q1*y + r1
If X is A2 and Y is B2 then Z = p2*x + q2*y + r2
• Fuzzy reasoning
[Figure: two-rule first-order Sugeno inference. Inputs x = 3, y = 2 are matched against MFs A1, B1 and A2, B2, giving firing strengths w1 and w2; the rule outputs are z1 = p1*x + q1*y + r1 and z2 = p2*x + q2*y + r2, combined as z = (w1*z1 + w2*z2) / (w1 + w2).]
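To make the two-rule reasoning concrete, here is a small sketch at (x, y) = (3, 2). The Gaussian membership functions, their centers/widths, the min T-norm, and the consequent parameters are all illustrative assumptions, not values from the figure.

```python
import math

def gauss(x, c, sigma):
    """Gaussian membership function centered at c with width sigma."""
    return math.exp(-((x - c) ** 2) / (2 * sigma ** 2))

x, y = 3, 2
# Firing strengths: AND of the two antecedent grades (min T-norm here)
w1 = min(gauss(x, 2, 2), gauss(y, 1, 2))   # A1(x) AND B1(y)
w2 = min(gauss(x, 5, 2), gauss(y, 4, 2))   # A2(x) AND B2(y)
# First-order consequents z_i = p_i*x + q_i*y + r_i (coefficients assumed)
z1 = 1.0 * x + 0.5 * y + 1.0
z2 = 0.5 * x + 1.0 * y + 2.0
# Weighted average of the rule outputs
z = (w1 * z1 + w2 * z2) / (w1 + w2)
```

Whatever the firing strengths are, z always lands between the two rule outputs z1 and z2; that is the defining property of the weighted average.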
Adaptive Networks

[Figure: an adaptive network mapping inputs x and y to output z.]

 Architecture:
 Feedforward networks with diff. node functions
 Squares: nodes with parameters
 Circles: nodes without parameters
 Goal:
 To achieve an I/O mapping specified by training data
 Basic training method:
 Backpropagation or steepest descent

Derivative-Based Optimization

 Based on first derivatives:


 Steepest descent
 Conjugate gradient method
 Gauss-Newton method
 Levenberg-Marquardt method
 And many others
 Based on second derivatives:
 Newton method
 And many others

Fuzzy Modeling

[Diagram: inputs x1, ..., xn feed both an unknown target system (output y) and a fuzzy inference system (output y*); the FIS is tuned so that y* matches y.]

• Given desired i/o pairs (training data set) of the form


(x1, ..., xn; y), construct a FIS to match the i/o pairs

• Two steps in fuzzy modeling


structure identification --- input selection, MF numbers
parameter identification --- optimal parameters

Neuro-Fuzzy Modeling

Basic approach of ANFIS:

[Diagram: adaptive networks generalize both neural networks and fuzzy inference systems; each of the two is a specialization. ANFIS sits at their intersection: a fuzzy inference system realized as an adaptive network.]
ANFIS

• Fuzzy reasoning

[Figure: two-rule fuzzy reasoning for inputs x and y. MFs A1, B1 give firing strength w1 and MFs A2, B2 give w2; the rule outputs are z1 = p1*x + q1*y + r1 and z2 = p2*x + q2*y + r2, combined as z = (w1*z1 + w2*z2) / (w1 + w2).]

• ANFIS (Adaptive Neuro-Fuzzy Inference System)

[Figure: the same computation drawn as a layered network. MF nodes A1, A2 (for x) and B1, B2 (for y) feed product (Π) nodes that compute w1*z1 and w2*z2; sum (Σ) nodes accumulate Σ wi*zi and Σ wi, and a division node outputs z = Σ wi*zi / Σ wi.]
Four-Rule ANFIS

• Input space partitioning

[Figure: the x axis is partitioned by MFs A1, A2 and the y axis by B1, B2, yielding a 2×2 grid of overlapping fuzzy regions, one rule per region.]

• ANFIS (Adaptive Neuro-Fuzzy Inference System)

[Figure: the four-rule ANFIS network. Four product (Π) nodes compute the firing strengths w1 ... w4 and the weighted outputs w1*z1 ... w4*z4; sum (Σ) nodes accumulate Σ wi*zi and Σ wi, and a division node outputs z = Σ wi*zi / Σ wi.]
Neuro Fuzzy System

 A neuro-fuzzy system is capable of identifying bad rules in prior/existing knowledge supplied by a domain expert.
  E.g., a 5-rule neuro-fuzzy system for the XOR operation:
  Use backpropagation to adjust the weights and to modify the input/output membership fns.
  Continue training until the error (e.g., the sum of squared errors) falls below a threshold, e.g. 0.001.
  Rule 2 turns out to be false and is removed.
Neuro Fuzzy System
 A neuro-fuzzy system can also automatically generate a complete set of fuzzy if-then rules, given input-output linguistic values.
  Extract fuzzy rules directly from numerical data.
  E.g., an 8-rule neuro-fuzzy system for the XOR operation: 2 × 2 × 2 = 8 candidate rules.
  Set the initial weights between layers 3 and 4 to 0.5.
  After training, eliminate all rules whose certainty factors are less than some sufficiently small number, e.g. 0.1.
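The pruning step above can be sketched in a few lines; the certainty factors below are fabricated for illustration (in a real run they would be the trained layer 3-4 weights).

```python
# After training, rules whose certainty factor stays below a small
# threshold are deemed unsupported by the data and removed.
threshold = 0.1
certainty = {"R1": 0.78, "R2": 0.02, "R3": 0.65, "R4": 0.04,
             "R5": 0.91, "R6": 0.03, "R7": 0.72, "R8": 0.05}
kept = {rule: c for rule, c in certainty.items() if c >= threshold}
# The 8 candidate rules shrink to the subset the data actually supports.
```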
ANFIS Architecture: Sugeno’s ANFIS

 Assume that FIS has 2 inputs x, y and one output z.


 Sugeno’s ANFIS:
 Rule 1: If x is A1 and y is B1, then f1 = p1x + q1y + r1.
 Rule 2: If x is A2 and y is B2, then f2 = p2x + q2y + r2.
ANFIS Architecture: Sugeno’s ANFIS

 Layer 1: fuzzification layer
  Every node i in layer 1 is an adaptive node with a node function
  O1,i = μAi(x) for i = 1, 2, or O1,i = μB(i-2)(y) for i = 3, 4 : the membership grade of a fuzzy set A1, A2, B1, B2.
  Parameters in this layer: premise (or antecedent) parameters.
 Layer 2: rule layer
  A fixed node labeled Π whose output is the product of all the incoming signals:
  O2,i = wi = μAi(x) · μBi(y) for i = 1, 2 : the firing strength of a rule.
 Layer 3: normalization layer
  A fixed node labeled N.
  The i-th node calculates the ratio of the i-th rule’s firing strength to the sum of all rules’ firing
strengths: O3,i = w̄i = wi / (w1 + w2) for i = 1, 2.
  Outputs of this layer are called normalized firing strengths.
 Layer 4: defuzzification layer
  An adaptive node with a node fn O4,i = w̄i fi = w̄i (pi x + qi y + ri) for i = 1, 2,
where w̄i is a normalized firing strength from layer 3 and {pi, qi, ri} is the parameter set of
this node: the consequent parameters.
 Layer 5: summation neuron
  A fixed node which computes the overall output as the summation of all incoming signals:
  Overall output = O5,1 = ∑ w̄i fi = ∑ wi fi / ∑ wi.
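The five layers above can be traced end to end in a short sketch. The generalized bell MF matches the form used later in the lecture; the specific bell parameters and consequent coefficients are illustrative assumptions.

```python
def bell(x, a, b, c):
    """Generalized bell MF: 1 / (1 + |(x - c)/a|^(2b))."""
    return 1.0 / (1.0 + abs((x - c) / a) ** (2 * b))

def anfis_forward(x, y, premise, consequent):
    # Layer 1: fuzzification -- membership grades of x and y
    muA = [bell(x, *p) for p in premise["A"]]
    muB = [bell(y, *p) for p in premise["B"]]
    # Layer 2: rule layer -- product T-norm gives firing strengths w_i
    w = [muA[i] * muB[i] for i in range(2)]
    # Layer 3: normalization -- w_bar_i = w_i / (w_1 + w_2)
    wbar = [wi / sum(w) for wi in w]
    # Layer 4: defuzzification -- weighted rule outputs f_i = p*x + q*y + r
    f = [p * x + q * y + r for (p, q, r) in consequent]
    # Layer 5: summation -- overall output = sum of w_bar_i * f_i
    return sum(wb * fi for wb, fi in zip(wbar, f))

# Illustrative parameters (a, b, c) per MF and (p, q, r) per rule -- assumed:
premise = {"A": [(2, 2, 2), (2, 2, 8)], "B": [(2, 2, 2), (2, 2, 8)]}
consequent = [(1.0, 1.0, 0.0), (2.0, 2.0, 1.0)]
z = anfis_forward(3.0, 3.0, premise, consequent)
```

At (3, 3) the first rule fires almost exclusively (its MFs are centered at 2), so the output stays close to that rule's consequent f1 = 6.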
ANFIS Architecture: Sugeno’s ANFIS

 How does an ANFIS learn?
  A hybrid learning algorithm: the least-squares estimator + the gradient descent method.
  Forward pass: adjustment of the consequent parameters pi, qi, ri.
  Rule consequent parameters are identified by the least-squares estimator.
  Find the least-squares estimate k* of k = [r1 p1 q1 r2 p2 q2 ... rn pn qn] that minimizes
the error e = |Od - O|, with cost E = e^2 / 2 = (Od - O)^2 / 2.
  The consequent parameters are adjusted while the antecedent parameters remain fixed.
  Backward pass: adjustment of the antecedent parameters.
  The antecedent parameters are tuned while the consequent parameters are kept fixed.
  E.g., bell activation fn: [1 + ((x - a)/c)^(2b)]^(-1).
A correction Δa applied to parameter a: a = a + Δa,
where Δa = -η ∂E/∂a (η is the learning rate).
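The forward pass hinges on one observation: with the antecedent parameters frozen, the ANFIS output is linear in the consequent parameters, so a single least-squares solve identifies them. A self-contained sketch (the samples, firing strengths, and "true" parameters below are fabricated for illustration):

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            fac = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= fac * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# One input x, two rules with consequents f_i = p_i*x + r_i.
# Output = wbar1*(p1*x + r1) + wbar2*(p2*x + r2): linear in k = [p1, r1, p2, r2],
# with regressor row [wbar1*x, wbar1, wbar2*x, wbar2] per sample.
true_k = [1.0, 1.0, 2.0, 0.5]                      # parameters to recover
data = [(0.0, 0.9, 0.1), (1.0, 0.7, 0.3), (2.0, 0.4, 0.6),
        (3.0, 0.2, 0.8), (4.0, 0.1, 0.9)]          # (x, wbar1, wbar2)
Phi = [[w1 * x, w1, w2 * x, w2] for x, w1, w2 in data]
t = [sum(row[j] * true_k[j] for j in range(4)) for row in Phi]  # targets
# Normal equations: (Phi^T Phi) k = Phi^T t
PtP = [[sum(r[i] * r[j] for r in Phi) for j in range(4)] for i in range(4)]
Ptt = [sum(Phi[s][i] * t[s] for s in range(len(Phi))) for i in range(4)]
k = solve(PtP, Ptt)   # least-squares estimate of the consequent parameters
```

Because the targets were generated from `true_k`, the one-shot least-squares solve recovers the consequent parameters exactly (up to rounding), which is why the forward pass needs no iteration.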
ANFIS Architecture: Sugeno’s ANFIS

 The structure of the network is not unique.


ANFIS Architecture: Tsukamoto ANFIS

 Tsukamoto ANFIS:
ANFIS Architecture
 Improvement: a 2-input first-order Sugeno fuzzy model with 9 rules.
 The 2-dimensional input space is partitioned into 9 overlapping fuzzy regions, each of which is governed by a fuzzy if-then rule.
 I.e., the premise part of a rule defines a fuzzy region, while the consequent part specifies the output within the region.
Questions?
Automatics and Code Design:
FUZZY LOGIC Systems GUI

 Fuzzy Logic Systems Package facilitates


the development of fuzzy-logic systems
using:
 graphical user interface (GUI) tools
 command line functionality

 The package can be used for building


 Fuzzy Logic Expert Systems
 Adaptive Neuro-Fuzzy Inference Systems
(ANFIS)
Graphical User Interface (GUI)

There are five primary GUI tools for building, editing, and observing fuzzy inference systems in the Fuzzy Logic package:

1. Fuzzy Inference System (FIS) Editor


2. Membership Function Editor
3. Rule Editor
4. Rule Viewer
5. Surface Viewer
Application: Employer Salary Raise Fuzzy Logic Model

• Employer Salary Raise Model


•Extension Principle:
– one to one
– many to one
– n-D Cartesian product to y
Fuzzy Logic Model for Employer Salary Raise

• COMMON SENSE RULES

1. If teaching quality is bad, raise is low.
2. If teaching quality is good, raise is average.
3. If teaching quality is excellent, raise is generous.
4. If research level is bad, raise is low.
5. If research level is excellent, raise is generous.

• COMBINED RULES
1. If teaching is poor or research is poor, raise is low.
2. If teaching is good, raise is average.
3. If teaching or research is excellent, raise is excellent.

Generic Fuzzy Logic Code for Teacher Salary Raise Model

• Naive model: base salary raise + work performance; base + development and research performance; i.e., base + 80% development (teaching) and 20% research.

%Establish constants
Teach_Ratio = 0.8;
Lo_Raise = 0.01; Avg_Raise = 0.05; Hi_Raise = 0.1;
Raise_Range = Hi_Raise - Lo_Raise;
Bad_Teach = 0; OK_Teach = 3; Good_Teach = 7; Great_Teach = 10;
Teach_Range = Great_Teach - Bad_Teach;
Bad_Res = 0; Great_Res = 10;
Res_Range = Great_Res - Bad_Res;

%If teaching is poor or research is poor, raise is low
if teaching < OK_Teach
    raise = ((Avg_Raise - Lo_Raise)/(OK_Teach - Bad_Teach)*teaching + Lo_Raise)*Teach_Ratio ...
            + (1 - Teach_Ratio)*(Raise_Range/Res_Range*research + Lo_Raise);
%If teaching is good, raise is average
elseif teaching < Good_Teach
    raise = Avg_Raise*Teach_Ratio ...
            + (1 - Teach_Ratio)*(Raise_Range/Res_Range*research + Lo_Raise);
%If teaching or research is excellent, raise is excellent
else
    raise = ((Hi_Raise - Avg_Raise)/(Great_Teach - Good_Teach)*(teaching - Good_Teach) + Avg_Raise)*Teach_Ratio ...
            + (1 - Teach_Ratio)*(Raise_Range/Res_Range*research + Lo_Raise);
end
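As a quick sanity check of the naive model, here is the same piecewise computation ported to Python (the function name `raise_pct` is my own; the constants mirror the MATLAB script, with teaching and research scored on 0-10):

```python
# Naive salary-raise model: 80% teaching component + 20% research component.
TEACH_RATIO = 0.8
LO, AVG, HI = 0.01, 0.05, 0.1
BAD_T, OK_T, GOOD_T, GREAT_T = 0, 3, 7, 10
BAD_R, GREAT_R = 0, 10
RAISE_RANGE = HI - LO

def raise_pct(teaching, research):
    # Research contributes linearly regardless of the teaching band.
    res_part = (1 - TEACH_RATIO) * (RAISE_RANGE / (GREAT_R - BAD_R) * research + LO)
    if teaching < OK_T:        # poor teaching -> low raise
        t_part = (AVG - LO) / (OK_T - BAD_T) * teaching + LO
    elif teaching < GOOD_T:    # good teaching -> average raise
        t_part = AVG
    else:                      # excellent teaching -> generous raise
        t_part = (HI - AVG) / (GREAT_T - GOOD_T) * (teaching - GOOD_T) + AVG
    return t_part * TEACH_RATIO + res_part
```

The model is piecewise linear: it spans 1% at the bottom (0, 0) to 10% at the top (10, 10), but the hard `if` boundaries are exactly what the fuzzy version smooths away.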
Fuzzy Logic Model for Employer Salary Raise

Fuzzy Inference
System Editor

Rule Editor
Fuzzy Logic Model for Employer Salary Raise

Membership
function
Editor
Fuzzy Logic Model for Employer Salary Raise

Rule
Viewer

Surface
Viewer
Fuzzy Logic Model for Employer Salary Raise

1. If teaching is poor or research is poor, raise is low.
2. If teaching is good, raise is average.
3. If teaching or research is excellent, raise is excellent.

(Antecedent terms are interpreted: poor, good, excellent.
Consequent terms are assigned: low, average, generous.)

IF-THEN RULES
if x is A then y is B
if teaching = good => raise = average
BINARY LOGIC: p --> q
FUZZY LOGIC: 0.5 p --> 0.5 q
%=====================================
% Initialization parameters for Fuzzy
%=====================================
% Number of membership functions assigned to an input variable
numMFs = 2;
% Type of the membership function: 'gbellmf', 'gaussmf', 'trapmf', or 'trimf'
MFTypes = 'gaussmf';
% Number of training epochs
epoch_n = 20;

% Grid partition (the genfis1 default is 5 MFs of type 'gbellmf')
in_fis  = genfis1([OT_tr_XN, OT_tr_YN(:,k)], numMFs, MFTypes);
out_fis = anfis([OT_tr_XN OT_tr_YN(:,k)], in_fis, epoch_n);

figure('name', ['Initial plots for the Membership functions for Y', ...
                num2str(k)], 'NumberTitle', 'off');
for i = 1:col_X
    [x, mf] = plotmf(in_fis, 'input', i);
    subplot(col_X, 1, i);
    plot(x, mf);
    xlabel(['input ', num2str(i)]);
end

yHat  = evalfis(OT_tr_XN, out_fis)';   % fitted outputs on the training set
yHats = evalfis(OT_ts_XN, out_fis)';   % predicted outputs on the test set
ANFIS for Classifying Salt data

[Figures: initial (left) vs. final (right) membership functions of the two input variables, Temperature (T), spanning roughly 10-35, and water activity (Aw), spanning roughly 0.945-0.985. Training reshapes the initial MFs.]
ANFIS for Classifying Salt data
[Figure: final membership functions of the input variables Temperature (T) and water activity (Aw).]
ANFIS for Classifying Salt data
[Figures: 3-D bar plots of the training confusion matrix (counts up to about 100) and the test confusion matrix (counts up to about 30) over the two classes.]
ANFIS for Classifying Salt data
 CCR (training) = 0.94
  Conf_Mat_training =
    41   6
     2  77

 CCR (test) = 0.91
  Conf_Mat_test =
    24   5
     0  24

 Computational time: 2.59375 s


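The reported correct-classification rates follow directly from the confusion matrices above (CCR = correctly classified / total, i.e. the trace over the grand sum):

```python
def ccr(conf):
    """Correct-classification rate from a square confusion matrix."""
    total = sum(sum(row) for row in conf)
    correct = sum(conf[i][i] for i in range(len(conf)))
    return correct / total

train = [[41, 6], [2, 77]]   # training confusion matrix from the slide
test  = [[24, 5], [0, 24]]   # test confusion matrix from the slide
# ccr(train) = 118/126, ccr(test) = 48/53
```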
ANFIS for Classifying Salt data

[Figure: mean of CCR (roughly 0.55-0.95) vs. standard deviation of CCR (roughly 0.02-0.22) for five classifiers: Reg, KNN, PNN, FFN, and ANFIS.]
References and WWW Resources
 References:
  J.-S. R. Jang, C.-T. Sun and E. Mizutani, "Neuro-Fuzzy and Soft Computing", Prentice Hall, 1996.
  J.-S. R. Jang and C.-T. Sun, "Neuro-Fuzzy Modeling and Control", Proceedings of the IEEE, March 1995.
  J.-S. R. Jang, "ANFIS: Adaptive-Network-based Fuzzy Inference Systems", IEEE Trans. on Systems, Man, and Cybernetics, May 1993.

