274 - Soft Computing LECTURE NOTES
SC – GA - Introduction
• Introduction
Solving problems means looking for the solution that is best among others. Unlike older AI systems, GAs do not break easily even if the inputs are changed slightly or reasonable noise is present.
1.1 Optimization
• In the panel design, we want to limit the weight and put a constraint
on its shape.
• Optimization Methods
Optimization aims at designs with
− low cost,
− high performance,
− low loss.
Optimization methods are broadly divided into Linear Programming and Non-Linear Programming. Each of these methods is briefly discussed below, indicating the nature of the problems to which it is more applicable.
■ Linear Programming
− the optimal solution is the one that minimizes (or maximizes) the
objective function.
Enumerative search goes through every point (one point at a time) related to the function's domain space. At each point, all possible solutions are generated and tested to find the optimum solution. It is easy to implement but usually requires significant computation. In the field of artificial intelligence, the enumerative methods are subdivided into two categories:
The two other search methodologies, the Classical and the Enumerative methods, are first briefly explained. Later the Stochastic methods are discussed in detail. All these methods belong to Non-Linear Search Optimization.
[Fig. Stochastic methods : Evolutionary algorithms, Genetic algorithms, Simulated annealing.]
• Indirect methods :
■ Enumerative Search
Here the search goes through every point (one point at a time) related to
the function's domain space.
− At each point, all possible solutions are generated and tested to find
the optimum solution.
• Informed methods :
The next slide shows the taxonomy of enumerative search in the AI domain.
− there are many control structures for search; the depth-first search and the breadth-first search are two common control strategies.
[Fig. Taxonomy of Enumerative Search.
Uninformed search : Depth-First Search, Breadth-First Search, Depth-Limited Search (imposes a fixed depth limit), Iterative Deepening DFS, Generate-and-Test.
Informed search : Hill Climbing, Best-First Search (priority queue ordered by h(n)), A* Search (priority queue ordered by g(n) + h(n)), AO* Search, Problem Reduction, Constraint Satisfaction, Means-End Analysis.]
• Stochastic Search
The stochastic search techniques are grouped into two major subclasses :
− Evolutionary algorithms, and
− Simulated annealing.
In the evolutionary algorithms,
− the search evolves throughout generations, improving the
features of potential solutions by means of biologically inspired
operations.
[Fig. Taxonomy of Search Optimization techniques.
− Classical methods : Indirect method and Direct method (e.g., Newton, Fibonacci);
− Enumerative methods : Uninformed search and Informed search;
− Stochastic methods : Evolutionary algorithms, namely Genetic Programming and Genetic Algorithms.]
- Genetic Programming
Development History
EC = GP + ES + EP + GA
i.e., Evolutionary Computing comprises Genetic Programming, Evolution Strategies, Evolutionary Programming, and Genetic Algorithms.
Possible settings for a trait (e.g. blue, brown) are called alleles.
Each gene has its own position in the chromosome called its locus.
When two organisms mate they share their genes; the resultant
offspring may end up having half the genes from one parent and half
from the other. This process is called recombination (cross over) .
The newly created offspring can then be mutated. Mutation means that
elements of the DNA are slightly changed. These changes are mainly
caused by errors in copying genes from parents.
The fitness of an organism is measured by the success of the organism in
its life (survival).
[Fig. General scheme of an evolutionary algorithm : Initialization → Population → (parent selection) Parents → Recombination → Mutation → Offspring → Survivor selection → Population, until Termination.]
Pseudo-Code
BEGIN
   INITIALISE population with random candidate solutions;
   EVALUATE each candidate;
   REPEAT UNTIL (termination condition) is satisfied DO
      ■ SELECT parents;
      ■ RECOMBINE pairs of parents;
      ■ MUTATE the resulting offspring;
      ■ EVALUATE new candidates;
      ■ SELECT individuals for the next generation;
   OD
END.
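The pseudo-code above can be turned into a runnable program. The sketch below is illustrative rather than from the notes: the function and parameter names (`run_ga`, `pop_size`, and so on) are my own, and the fitness function f(x) = x^2 over the integers 0..31 anticipates the worked example used later in these notes.

```python
import random

def run_ga(fitness, n_bits=5, pop_size=4, generations=20,
           crossover_rate=0.9, mutation_rate=0.001, seed=0):
    """Minimal generational GA over fixed-length bit strings."""
    rng = random.Random(seed)

    def decode(bits):
        # Binary string s(n-1) .. s0 to integer.
        return int("".join(map(str, bits)), 2)

    # INITIALISE population with random candidate solutions.
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=lambda c: fitness(decode(c)))

    for _ in range(generations):
        scores = [fitness(decode(c)) for c in pop]
        best = max(pop + [best], key=lambda c: fitness(decode(c)))
        total = sum(scores) or 1

        # SELECT parents: roulette wheel, proportional to fitness.
        def pick():
            r = rng.uniform(0, total)
            acc = 0.0
            for c, s in zip(pop, scores):
                acc += s
                if acc >= r:
                    return c
            return pop[-1]

        # RECOMBINE pairs of parents, then MUTATE the offspring.
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            if rng.random() < crossover_rate:
                cut = rng.randint(1, n_bits - 1)   # one-point crossover
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            else:
                c1, c2 = p1[:], p2[:]
            for c in (c1, c2):
                for i in range(n_bits):
                    if rng.random() < mutation_rate:
                        c[i] = 1 - c[i]            # flip-bit mutation
            nxt += [c1, c2]
        pop = nxt[:pop_size]

    best = max(pop + [best], key=lambda c: fitness(decode(c)))
    return decode(best)

# Maximize f(x) = x^2 over the integers 0..31, as in the notes' example.
best_x = run_ga(lambda x: x * x)
```

The loop mirrors the pseudo-code steps: evaluate, select parents, recombine, mutate, then replace the generation.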
• Search Space
In solving problems, some solution will be the best among others. The
space of all feasible solutions (among which the desired solution resides)
is called the search space.
− Each possible solution can be "marked" by its value (or fitness) for the
problem.
− Looking for a solution is then equal to looking for some extreme value
(minimum or maximum) in the search space.
− At times the search space may be well defined, but usually only a few
points in the search space are known.
- Working Principles
The genetic algorithm begins with a set of solutions (represented by
chromosomes) called the population.
− Solutions from one population are taken and used to form a new
population. This is motivated by the possibility that the new population
will be better than the old one.
• [Test] If the end condition is satisfied, stop, and return the best
solution in the current population.
• [Loop] Go to step 2.
[Fig. Genetic algorithm flowchart : Start → Seed population → Assign fitness to each individual → Survival of the fittest (selection) → Apply crossover operator and insert new individuals into the population → Apply mutation operator to offspring → Terminate? No : repeat the cycle; Yes : Finish.]
SC – GA - Encoding
◊ Encoding
Example :
A Gene represents some data (eye color, hair color, sight, etc.).
Chromosome 1 : 1101100100110110
Chromosome 2 : 1101111000011110
− There are many other ways of encoding, e.g., encoding values as integers
or real numbers, or as permutations, and so on.
− The virtue of an encoding method depends on the problem to be worked on.
■ Binary Encoding
Chromosome 1: 101100101100101011100101
Chromosome 2: 111111100000110000011111
− This encoding is often not natural for many problems and sometimes
corrections must be made after crossover and/or mutation.
Example 1 : Integers encoded as 4-bit binary strings:
  2 → 0010     8 → 1000    14 → 1110
  4 → 0100    10 → 1010
  5 → 0101    11 → 1011
Example 2 :
Every variable will have both upper and lower limits as XiL ≤ Xi ≤ XiU.
Because a 4-bit string can represent integers from 0 to 15, the strings
(0000 0000) and (1111 1111) would represent the points (X1L, X2L) and
(X1U, X2U) respectively.
The decoded value of a binary string Sn-1 . . . S3 S2 S1 S0 is
  Σ (k = 0 to n-1) 2^k Sk
Consider a 4-bit string (0111); its decoded value is
  2^3 x 0 + 2^2 x 1 + 2^1 x 1 + 2^0 x 1 = 7
The corresponding value of the variable is obtained by the linear mapping
            (XiU − XiL)
  Xi = XiL + ----------- x (decoded value of string)
            (2^ni − 1)
For example, for a variable Xi with XiL = 2 and XiU = 17, find what value
the 4-bit string Xi = (1010) would represent. First get the decoded value for
  Si = 1010 = 2^3 x 1 + 2^2 x 0 + 2^1 x 1 + 2^0 x 0 = 10 , then
             (17 − 2)
  Xi = 2 + ----------- x 10 = 12
            (2^4 − 1)
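The decoding and mapping steps above can be expressed directly in code. This is an illustrative sketch; the function names are my own.

```python
def decoded_value(bits: str) -> int:
    """Decoded value of a binary string S = s(n-1) ... s1 s0: sum of 2^k * s_k."""
    return sum((2 ** k) * int(s) for k, s in enumerate(reversed(bits)))

def map_to_variable(bits: str, x_low: float, x_high: float) -> float:
    """Linear mapping Xi = XiL + (XiU - XiL) / (2^ni - 1) * decoded value."""
    n = len(bits)
    return x_low + (x_high - x_low) / (2 ** n - 1) * decoded_value(bits)

print(decoded_value("0111"))           # 7
print(map_to_variable("1010", 2, 17))  # 12.0
```

Both outputs reproduce the worked example in the text.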
- Value Encoding
The value encoding can be used in problems where values such as real
numbers are used. Use of binary encoding for this type of problem would
be difficult.
Examples :
Chromosome B ABDJEIFJDHDIERJFDLDFLFEGT
- Permutation Encoding
Chromosome A 153264798
Chromosome B 856723149
Examples :
There are eight queens. Find a way to place them on a chess board so
that no two queens attack each other. Here, encoding describes the
position of a queen on each row.
• Tree Encoding
Example :
[Fig. Example trees :
Chromosome A : ( + x ( / 5 y ) ) , i.e., the expression x + 5/y;
Chromosome B : ( do_until step wall ) , i.e., the command "do until step, wall".]
Note : Tree encoding is good for evolving programs. The programming
language LISP is often used. Programs in LISP can easily be parsed as
trees, so crossover and mutation are relatively easy.
SC – GA - Operators
Genetic operators are analogous to those which occur in the natural world:
− Reproduction (or Selection);
− Crossover (or Recombination); and
− Mutation.
− Population size says how many chromosomes are in the population (in one
generation).
− If there are only a few chromosomes, then the GA would have few possibilities
to perform crossover and only a small part of the search space is explored.
− Research shows that after some limit, it is not useful to increase population
size, because it does not help in solving the problem faster. The population
size depends on the type of encoding and the problem.
Many reproduction operators exist, and they all essentially do the same thing.
They pick from the current population the strings of above-average fitness and
insert multiple copies of them in the mating pool in a probabilistic manner.
− Roulette wheel selection,
− Rank selection,
− Boltzmann selection,
− Tournament selection,
− Steady state selection.
The Roulette wheel and Boltzmann selection methods are illustrated next.
• Example of Selection
The problem is to maximize the function f(x) = x^2 with x in the integer
interval [0, 31], i.e., x = 0, 1, . . . , 30, 31.
Here n is the number of individuals in the population, i.e., the population
size; n = 4, and n * pi is the expected count of string i in the next generation.
In fitness-proportionate selection :
[Fig. Roulette wheel with slices proportional to the individuals' fitness values, e.g. 5%, 8%, 8%, 8%, 9%, 13%, 17%, 20%, 20%.]
− the fitness of the individuals is displayed in proportion on the wheel; a spin
of the wheel would choose the 5th individual (the largest slice) more often
than the other individuals.
The probability of selecting the i-th string is
  pi = Fi / ( Σ (j = 1 to n) Fj ) , where Fi is the fitness of string i,
and selection makes copies of the i-th string in proportion to pi. With N = 5,
  Cumulative Probability5 = Σ (i = 1 to 5) pi
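A sketch of roulette wheel selection implementing pi = Fi / ΣFj; the helper name `roulette_wheel_select` is my own, and the sampling loop at the bottom is only there to illustrate the proportional behaviour.

```python
import random

def roulette_wheel_select(population, fitnesses, rng=random):
    """Pick one individual with probability p_i = F_i / sum(F_j)."""
    total = sum(fitnesses)
    r = rng.uniform(0, total)
    cumulative = 0.0
    for individual, f in zip(population, fitnesses):
        cumulative += f
        if cumulative >= r:
            return individual
    return population[-1]  # guard against floating point round-off

# Four strings for f(x) = x^2, x in [0, 31], as in the notes:
pop = [13, 24, 8, 19]
fit = [x * x for x in pop]  # 169, 576, 64, 361
picked = [roulette_wheel_select(pop, fit, random.Random(i)) for i in range(1000)]
# 24 has p = 576/1170, about 0.49, so it should be picked most often.
```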
• Boltzmann Selection
3.2 Crossover
− the others are Two Point, Uniform, Arithmetic, and Heuristic crossovers.
The operators are selected based on the way chromosomes are encoded.
• One-Point Crossover
Parent 1 11011|00100110110
Parent 2 11011|11000011110
Offspring 1 11011|11000011110
Offspring 2 11011|00100110110
► Two-Point Crossover
Parent 1 11011|0010011|0110
Parent 2 11011|1100001|1110
Offspring 1 11011|1100001|0110
Offspring 2 11011|0010011|1110
► Uniform Crossover
Parent 1 1 1 0 1 1 0 0 1 0 0 1 1 0 1 1 0
Parent 2 1 1 0 1 1 1 1 0 0 0 0 1 1 1 1 0
If the mixing ratio is approximately 0.5, then half of the genes in the
offspring will come from parent 1 and the other half from parent 2.
The possible set of offspring after uniform crossover would be:
Offspring 1 11 12 02 11 11 12 12 02 01 01 02 11 12 11 11 02
Offspring 2 12 11 01 12 12 01 01 11 02 02 11 12 01 12 12 01
Note: The subscripts indicate which parent the gene came from.
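The three crossover operators described above (one-point, two-point, uniform) can be sketched for bit strings as follows. The function names are illustrative, and the cut points are passed in explicitly so the notes' examples can be reproduced.

```python
import random

def one_point(p1: str, p2: str, cut: int):
    """Swap the tails after position `cut`."""
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]

def two_point(p1: str, p2: str, a: int, b: int):
    """Swap the middle segment between positions `a` and `b`."""
    return p1[:a] + p2[a:b] + p1[b:], p2[:a] + p1[a:b] + p2[b:]

def uniform(p1: str, p2: str, rng=random):
    """Each gene position comes from either parent with probability 0.5."""
    pairs = [(g1, g2) if rng.random() < 0.5 else (g2, g1)
             for g1, g2 in zip(p1, p2)]
    return "".join(a for a, _ in pairs), "".join(b for _, b in pairs)

# One-point crossover on the notes' parents, cut after bit 5:
o1, o2 = one_point("1101100100110110", "1101111000011110", 5)
# Heads are kept, tails are swapped.
```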
• Arithmetic
Arithmetic crossover produces offspring as linear combinations of the parents:
  Offspring1 = a · Parent1 + (1 − a) · Parent2
  Offspring2 = (1 − a) · Parent1 + a · Parent2
Assuming the weighting factor a = 0.7 and applying these equations, we get
two resulting offspring.
• Heuristic
Heuristic crossover uses the fitness values of the two parent chromosomes
to determine the direction of the search:
  Offspring1 = BestParent + r · (BestParent − WorstParent)
  Offspring2 = BestParent
where r is a random number between 0 and 1.
3.3 Mutation
Mutation alters one or more gene values in a chromosome from its initial
state. This can result in entirely new gene values being added to the gene
pool. With the new gene values, the genetic algorithm may be able to
arrive at a better solution than was previously possible.
The operators are selected based on the way chromosomes are encoded .
■ Flip Bit
The mutation operator simply inverts the value of the chosen gene. i.e. 0
goes to 1 and 1 goes to 0.
Original offspring 1  1101111000011110
Original offspring 2  1101100100110110
Mutated offspring 1   1100111000011110
Mutated offspring 2   1101101100110100
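A sketch of flip-bit mutation as described above: each gene is inverted independently with a small probability. The function name and probability value are illustrative.

```python
import random

def flip_bit_mutation(chromosome: str, p_mut: float, rng=random) -> str:
    """Invert each gene (0 -> 1, 1 -> 0) independently with probability p_mut."""
    return "".join(
        ("1" if g == "0" else "0") if rng.random() < p_mut else g
        for g in chromosome
    )

offspring = "1101111000011110"
mutated = flip_bit_mutation(offspring, p_mut=0.05, rng=random.Random(7))
```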
• Boundary
The mutation operator replaces the value of the chosen gene with either
the upper or lower bound for that gene (chosen randomly).
This mutation operator can only be used for integer and float genes.
• Non-Uniform
The mutation operator increases the probability that the amount of
the mutation will be close to 0 as the generation number increases. This
mutation operator prevents the population from stagnating in the early
stages of the evolution then allows the genetic algorithm to fine tune the
solution in the later stages of evolution.
This mutation operator can only be used for integer and float genes.
• Uniform
The mutation operator replaces the value of the chosen gene with a
uniform random value selected between the user-specified upper and
lower bounds for that gene.
This mutation operator can only be used for integer and float genes.
• Gaussian
The mutation operator adds a Gaussian-distributed random value to the
chosen gene; if the new gene value falls outside the user-specified bounds
for that gene, it is clipped.
This mutation operator can only be used for integer and float genes.
Examples to demonstrate and explain : Random population, Fitness, Selection,
Crossover, Mutation, and Accepting.
Example 1 :
Maximize the function f(x) = x^2 over the range of integers from 0 . . . 31.
(A brute-force method would evaluate f for each x in turn, repeating until
x = 31; the GA instead proceeds through the steps below.)
1. Devise a means to represent a solution to the problem :
2. Devise a heuristic for evaluating the fitness of any particular solution :
The function f(x) is simple, so it is easy to use the f(x) value itself to rate
the fitness of a solution; else we might have considered a simpler heuristic
that would more or less serve the same purpose.
3. Coding - Binary and the String length :
GAs often process binary representations of solutions. This works well
because crossover and mutation can be clearly defined for binary solutions.
A binary string of length 5 can represent 32 numbers (0 to 31).
4. Randomly generate a set of solutions :
Here, the four strings 01101, 11000, 01000, 10011 are generated at random.
5. Evaluate the fitness of each member of the population :
The decoded values are 01101 → 13; 11000 → 24; 01000 → 8; 10011 → 19,
and their fitness values f(x) = x^2 are 13 → 169; 24 → 576; 8 → 64; 19 → 361.
(For this population : maximum fitness 576, selection probability 0.49, expected count 1.97.)
SC – GA - Examples
6. Produce a new generation of solutions by picking from the existing pool
of solutions with a preference for solutions which are better suited than
others :
We divide the range into four bins, sized according to the relative fitness of
the solutions which they represent. By generating 4 uniform (0, 1) random
values and seeing which bin they fall into, we pick the four strings that will
form the basis for the next generation.
7. Randomly pair the members of the new generation :
The random number generator decides for us to mate the first two strings
together and the second two strings together.
8. Within each pair, swap parts of the members' solutions to create
offspring which are a mixture of the parents :
For the first pair of strings : 01101 , 11000 (crossover after bit 4) :
  0110|1 ⇒ 01100
  1100|0 ⇒ 11001
For the second pair of strings : 11000 , 10011 (crossover after bit 2) :
  11|000 ⇒ 11011
  10|011 ⇒ 10000
9. Randomly mutate a very small fraction of genes in the population :
With a typical small per-bit mutation probability, it happens that none of
the bits in our population are mutated.
10. Go back and re-evaluate the fitness of the population (new generation) :
(Total fitness 1754; the probabilities sum to 1.000 and the expected counts to 4.000.)
Observe that :
■ The population 01101, 11000, 01000, 10011 has become 01100, 11001,
11011, 10000.
■ The total fitness has gone from 1170 to 1754 in a single generation.
■ The algorithm has already come up with the string 11011 (i.e. x = 27) as
a possible solution.
Example 2 : Two bar pendulum.
Consider two bars of lengths ℓ1 and ℓ2, carrying weights W1 and W2, hinged
at A. Let a horizontal force P act at the free end C, and let θ1 and θ2 be the
angles of the two bars.
Fig. Two bar pendulum
Find : Equilibrium configuration of the system.
Solution : Since there are two unknowns θ1 and θ2, a 4-bit binary coding is
used for each angle. With XL = 0 and XU = 90,
              XU − XL     90 − 0
  Accuracy = --------- = -------- = 6
              2^4 − 1       15
Hence, the binary coding and the corresponding angles Xi are given as
              XiU − XiL
  Xi = XiL + ----------- Si , where Si is the decoded value of the i-th chromosome.
               2^4 − 1
E.g., the 6th chromosome binary code (0101) has the decoded value
  Si = 0101 = 2^3 x 0 + 2^2 x 1 + 2^1 x 0 + 2^0 x 1 = 5
and the corresponding angle
            90 − 0
  Xi = 0 + -------- x 5 = 30
              15
The binary coding and the angles are given in the table below.
  #    Si    Xi      #    Si    Xi
  1   0000    0      9   1000   48
  2   0001    6     10   1001   54
  3   0010   12     11   1010   60
  4   0011   18     12   1011   66
  5   0100   24     13   1100   72
  6   0101   30     14   1101   78
  7   0110   36     15   1110   84
  8   0111   42     16   1111   90
The total potential of the system is
  f(θ1, θ2) = − P (ℓ1 sinθ1 + ℓ2 sinθ2) − (W1 ℓ1/2) cosθ1 − W2 [(ℓ2/2) cosθ2 + ℓ1 cosθ1]   (Eq. 1)
θ1 and θ2 lie between 0 and 90 degrees, both inclusive, i.e.,
0 ≤ θ1, θ2 ≤ 90 (Eq. 3)
Since the objective function is negative, instead of minimizing the function f
let us maximize −f = f ′. The maximum value of f ′ is 8 when θ1 and θ2 are zero.
First, randomly generate a population of 8 individuals, each an 8-bit string
(4 bits for θ1 and 4 bits for θ2); e.g., individual 1 = 0000 0000 gives
θ1 = 0, θ2 = 0. These angles and the corresponding fitness function values
are then tabulated.
− GA begins with a population of random strings.
− If the termination criteria are not met, the population is iteratively operated
upon by the three operators and evaluated until the termination criteria are met.
Hybrid Systems
Integration of NN, FL, and GA
What is Hybridization ?
− Auxiliary hybrid system : one technology calls the other technology
as a subroutine;
− Fuzzy logic addresses the imprecision or vagueness in input and output;
− Genetic algorithms are inspired by biological evolution and can systemize
random search to reach an optimum.
• Introduction :
Fuzzy logic, Neural networks and Genetic algorithms are soft computing
methods which are inspired by biological computational processes and nature's
problem solving strategies.
Neural Networks (NNs) are highly simplified models of the human nervous system
which mimic our ability to adapt to circumstances and learn from past experience.
Neural network systems are represented by different architectures, like single and
multilayer feed-forward networks. The networks offer back propagation,
generalization, associative memory and adaptive resonance theory.
Each of these technologies has provided efficient solutions to a wide range of
problems belonging to different domains. However, each of these technologies
has its own advantages and disadvantages.
− Sequential hybrid system : the technologies are used in a pipelining
fashion; thus, one technology's output becomes another technology's
input, and it goes on. However, this is one of the weakest forms of
hybridization, since an integrated combination of the technologies is not
present.
1.2 Neural Networks, Fuzzy Logic, and Genetic Algorithms Hybrids
Hybrid systems combine two or more of these distinct technologies, so that
the merits of one technology can offset the demerits of the other.
■ Neuro-Fuzzy Hybrid
Neural Networks :
− Demerits : the network is trained by minimizing least squares errors;
the training time required is quite large; the training data has to be
chosen over the entire range where the variables are expected to change.
Fuzzy logic :
− Merits : Fuzzy logic systems address the imprecision of inputs and
outputs, defined by fuzzy sets, and allow greater flexibility in formulating
detailed system descriptions.
extend the capabilities of the systems beyond either of these two technologies
applied individually. The integrated systems have turned out to be useful in :
- Neuro-Genetic Hybrids
Neural Networks : can learn various tasks from examples, classify
phenomena and model nonlinear relationships.
■ Fuzzy-Genetic Hybrids
The fuzzy systems, like (feed-forward) NNs, are universal approximators in
the sense that they exhibit the capability to approximate general nonlinear
functions to any desired degree of accuracy.
The adjustments of system parameters called for in the process, so that
the system output matches the training data, have been tackled using
GAs. Several parameters with which a fuzzy system is involved, like the
input/output variables and the membership functions that define the fuzzy
system, have been optimized using GAs.
Neural networks (NNs) are adaptive systems that change their structure based
on external or internal information that flows through the network. Neural
networks solve problems by self-learning and self-organization.
The steps involved are:
− The pattern of activation arriving at the output layer is compared with the
target output pattern to calculate the error.
Limitations of BPN :
− A BPN can recognize patterns similar to those it has learnt, but does not
have the ability to recognize new patterns.
Genetic Algorithms (GAs) are adaptive search and optimization algorithms that
mimic the principles of nature.
− The BPN determines its weights based on a gradient search technique and
hence may get stuck in a local optimum.
− GAs do not guarantee finding the global optimum solution, but are good
at quickly finding good, acceptable solutions.
The GA based techniques for determining weights in a BPN are explained next.
2.1 GA based techniques for determining weights in a BPN
− low fit individuals are kept out from reproduction and so die, while
highly fit individuals reproduce and the offspring resemble their ancestors;
− a fitness function is formulated.
All these aspects of GAs for determining the weights of a BPN are illustrated
in the next few slides.
SC – Hybrid Systems – GA based BPN
• Coding
The weights of the BPN are encoded as genes in the chromosomes.
Example :
[Fig. BPN with a 2 – 2 – 2 configuration : input-to-hidden weights Wij and
hidden-to-output weights Vij.]
− number of weights to be determined = (2 + 2) x 2 = 8;
− each weight is a real number, coded as a gene of 5 digits;
− so the string S representing a chromosome is 8 x 5 = 40 in length;
− choose a population of 40 chromosomes.
• Weight Extraction
Let xkd+1 , xkd+2 , . . . , x(k+1)d represent the k-th gene (k ≥ 0) in a
chromosome, where d is the number of digits per gene. The weight extracted
from the gene is

         xkd+2 10^(d-2) + xkd+3 10^(d-3) + . . . + x(k+1)d
  wk = + ------------------------------------------------- , if 5 ≤ xkd+1 ≤ 9
                           10^(d-2)

         xkd+2 10^(d-2) + xkd+3 10^(d-3) + . . . + x(k+1)d
  wk = − ------------------------------------------------- , if 0 ≤ xkd+1 < 5
                           10^(d-2)

The chromosomes are stated in the Fig. The weights extracted from the genes are:
■ Gene 0 : 84321 ; since 5 ≤ x1 = 8 ≤ 9, the weight extracted is positive:
           4 x 10^3 + 3 x 10^2 + 2 x 10 + 1
  W0 = + --------------------------------- = + 4.321
                       10^3
■ Gene 1 : 46234 ; since 0 ≤ x6 = 4 < 5, the weight extracted is negative:
           6 x 10^3 + 2 x 10^2 + 3 x 10 + 4
  W1 = − --------------------------------- = − 6.234
                       10^3
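The weight-extraction rule above can be written as a short routine. This is a sketch assuming 5-digit genes as in the example; the function names are my own.

```python
def extract_weight(gene: str) -> float:
    """Decode one d-digit gene into a weight.

    The first digit determines the sign (>= 5 gives +, < 5 gives -);
    the remaining d-1 digits form the magnitude, divided by 10^(d-2).
    """
    d = len(gene)
    sign = 1.0 if int(gene[0]) >= 5 else -1.0
    magnitude = int(gene[1:]) / 10 ** (d - 2)
    return sign * magnitude

def extract_weights(chromosome: str, d: int = 5):
    """Split a chromosome into d-digit genes and decode each one."""
    return [extract_weight(chromosome[k:k + d])
            for k in range(0, len(chromosome), d)]

print(extract_weight("84321"))  # +4.321  (first digit 8 >= 5)
print(extract_weight("46234"))  # -6.234  (first digit 4 < 5)
```

Both values reproduce the Gene 0 and Gene 1 examples above.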
■ Fitness Function :
Example :
Consider a training set of input vectors (I11, I21) with target outputs (T11, T21).
Let w01, w02, . . . , w040 be the weight sets extracted from the initial
population of chromosomes, using the weight-extraction equation above.
Let o01, o02 be the calculated outputs of the BPN for a given input instance.
Compute Fitness F1 : the fitness for the chromosome C01 is given by
  F1 = 1 / E ,
where E is the root mean square error over the training instances.
Algorithm :
{
− Keeping w i as a fixed weight setting, train the BPN for the N input instances;
− Calculate the error E i for each of the input instances using the formula
    E i = Σ j ( T ji − O ji )^2 , where O i is the output vector calculated by the BPN;
− Find the root mean square error E of the errors over the N instances,
    i.e., E = ( ( Σ i E i ) / N )^(1/2) ;
− Calculate the fitness value F i for each of the individual strings of the
  population as F i = 1 / E ;
}
Thus the fitness values F i for all the chromosomes in the initial population
are computed. The population size is p = 40, so F i , i = 1, 2, . . . , 40
are computed.
[Initial population of chromosomes C01, C02, . . . , C040 → extracted weight
sets w01, w02, . . . , w040 → training the BPN → compute fitness F i = 1/E →
fitness values.]
Fig. Computation of Fitness values for the population
• Reproduction of Offspring
In reproduction, individuals are copied into the mating pool in proportion to
their fitness, i.e., the best fit individuals have multiple copies while the
worst fit individuals die off.
Having formed the mating pool, select parent pairs at random. The chromosomes
of the respective pairs are combined using the crossover operator, as shown
in the figure below.
[Fig. Parent chromosomes Pa (parts A | B) and Pb exchange segments to give
offspring Oa and Ob (parts B | A).]
Example :
[Fig. Selection of parent chromosomes : among the chromosomes C01 . . . C0k
with fitness values F1 . . . Fk, the individual with the minimum fitness value
Fmin is replaced by the individual with the maximum fitness value Fmax.]
Here, a sample "Selection of Parents" for the "Two Point Crossover" operator
is shown. The crossover points of the chromosomes are randomly chosen for
each parent pair, as shown in the Fig. below.
[Fig. Selected parent pairs with randomly chosen crossover points; the genes
between the crossover points are exchanged, producing the new population P1.]
• Convergence
Example :
− the best individuals are replicated, and reproduction is carried out using
the two-point crossover operator to form the next generation P2 of the
chromosomes;
− the process is repeated until the population converges;
− at that stage, the weights extracted from the population Pi are the final
weights to be used by the BPN.
• Fuzzy Back Propagation Network
Neural Networks and Fuzzy Logic (NN-FL) represent two distinct methodologies;
the integration of NN and FL is called a Neuro-Fuzzy system.
The Fuzzy-BPN architecture maps fuzzy inputs to crisp outputs. Here, the
neurons use LR-type fuzzy numbers.
• Definition
An LR-type fuzzy number M̃ is defined by a left reference function L, a right
reference function R, and scalars α > 0, β > 0, such that its membership
function is
  µM̃(x) = L( (m − x) / α )  for x ≤ m , α > 0
  µM̃(x) = R( (x − m) / β )  for x ≥ m , β > 0
where m is the mean of M̃, and α and β are called the left and right spreads.
For example, with linear reference functions,
  L( (m − x)/α ) = max ( 0 , 1 − (m − x)/α )
  R( (x − m)/β ) = max ( 0 , 1 − (x − m)/β )
The LR-type fuzzy number M̃ can be represented as (m, α, β)LR, shown below.
[Fig. Membership degree µM̃(x) : rises from 0 at m − α to 1 at x = m and
falls back to 0 at m + β.]
Note : If α and β are both zero, then the L-R type function indicates a crisp
value. The choice of the L and R functions is specific to the problem.
• Operations on LR-type Fuzzy Numbers
Let M̃ = (m, α, β)LR and Ñ = (n, γ, δ)LR be two LR-type fuzzy numbers.
■ Addition
  (m, α, β)LR ⊕ (n, γ, δ)LR = (m + n, α + γ, β + δ)LR
■ Subtraction
  (m, α, β)LR ⊖ (n, γ, δ)LR = (m − n, α + δ, β + γ)LR
■ Multiplication (approximate)
  (m, α, β)LR ⊗ (n, γ, δ)LR = (mn, mγ + nα, mδ + nβ)LR    for m ≥ 0, n ≥ 0
  (m, α, β)LR ⊗ (n, γ, δ)LR = (mn, nα − mδ, nβ − mγ)RL    for m < 0, n ≥ 0
  (m, α, β)LR ⊗ (n, γ, δ)LR = (mn, − nβ − mδ, − nα − mγ)RL for m < 0, n < 0
■ Scalar Multiplication
  λ ⊗ (m, α, β)LR = (λm, λα, λβ)LR ,     ∀ λ ≥ 0 , λ ∈ R
  λ ⊗ (m, α, β)LR = (λm, − λβ, − λα)RL , ∀ λ < 0 , λ ∈ R
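The addition, subtraction, and scalar-multiplication rules above can be sketched directly (multiplication is omitted since it is case-dependent). The `LRNumber` class name and the sample values are illustrative.

```python
from dataclasses import dataclass

@dataclass
class LRNumber:
    """LR-type fuzzy number (m, alpha, beta): mean with left/right spreads."""
    m: float
    alpha: float
    beta: float

def add(a: LRNumber, b: LRNumber) -> LRNumber:
    # (m, a, b) + (n, g, d) = (m + n, a + g, b + d)
    return LRNumber(a.m + b.m, a.alpha + b.alpha, a.beta + b.beta)

def sub(a: LRNumber, b: LRNumber) -> LRNumber:
    # (m, a, b) - (n, g, d) = (m - n, a + d, b + g)
    return LRNumber(a.m - b.m, a.alpha + b.beta, a.beta + b.alpha)

def scalar_mul(lam: float, a: LRNumber) -> LRNumber:
    # lam >= 0: (lam*m, lam*a, lam*b); lam < 0: spreads swap and flip sign
    if lam >= 0:
        return LRNumber(lam * a.m, lam * a.alpha, lam * a.beta)
    return LRNumber(lam * a.m, -lam * a.beta, -lam * a.alpha)

M = LRNumber(5, 1, 2)
N = LRNumber(3, 0.5, 1)
print(add(M, N))          # LRNumber(m=8, alpha=1.5, beta=3)
print(sub(M, N))          # LRNumber(m=2, alpha=2, beta=2.5)
print(scalar_mul(-2, M))  # LRNumber(m=-10, alpha=4, beta=2)
```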
SC – Hybrid Systems – Fuzzy BPN
■ Fuzzy Neuron
The fuzzy neuron is the basic element of the fuzzy BP network. The figure
below shows the architecture of the fuzzy neuron.
[Fig. Fuzzy neuron : the fuzzy inputs Ĩi , i = 0, 1, . . . , n, are weighted by
fuzzy weights W̃i and aggregated as NET = Σ (i = 0 to n) W̃i ⊗ Ĩi using the
LR-type fuzzy number operations; the neuron output is obtained by applying
the transfer function to NET.]
■ Architecture of Fuzzy BP
SC – Hybrid Systems – Fuzzy AM
A fuzzy logic system contains the sets used to categorize input data (i.e.,
fuzzification), the decision rules that are applied to each set, and then a way
of generating an output from the rule results (i.e., defuzzification), producing
a crisp (non-fuzzy) output.
Associative memory allows a fuzzy rule base to be stored. The inputs are
the degrees of membership, and the outputs are the fuzzy system's output.
The problem indicates that there are two input variables and one output
variable. The inference engine is constructed based on the fuzzy rules :
                 Weight (X)
                 S    M    L
  Stream (Y) S   M    L    L
             M   S    M    L
             L   S    S    L
• Fuzzy Representation :
■ Defuzzification
The centre of gravity (COG) method computes the crisp output as
  Z COG = Σ (j = 1 to n) µc(Zj) Zj / Σ (j = 1 to n) µc(Zj) , where
  Zj is the control output at the quantization level j, and
  µc(Zj) represents its membership value in the output fuzzy set.
Referring to the Fig. in the previous slide and the formula for COG, we get
the fuzzy set of the washing time as w = { 0.8/20, 0.4/35, 0.2/60 }. The
washing time calculated using the COG formula is T = 41.025 min.
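The discrete COG formula can be sketched as below. The values in the usage line are illustrative singletons with a hypothetical output set; they do not reproduce the notes' full T = 41.025 min computation.

```python
def cog_defuzzify(levels, memberships):
    """Discrete centre of gravity: sum(mu_j * z_j) / sum(mu_j)."""
    num = sum(mu * z for mu, z in zip(memberships, levels))
    den = sum(memberships)
    return num / den

# Hypothetical singleton output set {0.5/10, 1.0/30, 0.5/50}:
t = cog_defuzzify([10, 30, 50], [0.5, 1.0, 0.5])
print(t)  # 30.0 (symmetric set, so the COG sits at the middle level)
```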
■ Simplified Fuzzy ARTMAP
The ART systems have many variations : ART1, ART2, Fuzzy ART, ARTMAP.
− The supervised ART algorithms are named with the suffix "MAP", as in
ARTMAP. Here the algorithms cluster both the inputs and the targets, and
associate the two sets of clusters.
− ARTMAP combines two ART-1 or ART-2 units into a supervised learning
structure. Here, the first unit takes the input data and the second unit
takes the correct output data, which are then used to make the minimum
possible adjustment of the vigilance parameter in the first unit in order
to make the correct classification.
The Fuzzy ARTMAP model is fuzzy-logic-based computation incorporated in an
inter-ART module called the Map Field. The Map Field forms predictive
associations between categories of the ART modules and realizes a
match-tracking rule. If ARTa and ARTb were disconnected, then each
module would self-organize category groupings for its respective input set.
In supervised mode, the mappings are learned between input vectors
a and b. A familiar example of supervised neural networks is the
feed-forward network with back-propagation of errors.
− ARTMAP systems can learn both in a fast and in a slow match
configuration, while BP networks can only learn in a slow mismatch
configuration. This means that an ARTMAP system learns, or adapts
its weights, only when the input matches an established category,
while BP networks learn when the input does not match an established
category.