Soft Computing

Computing means solving a problem by a formal method: an input x (the antecedent) is mapped by a computing function f to an output y = f(x) (the consequent).

Premises of a conventional computing system:
- It provides a precise solution.
- The control action is unambiguous and accurate.
- It is based on a mathematical model of the problem.
Types of computing: hard computing, soft computing and hybrid computing.

Hard computing:
1. The result must be precise.
2. The control action must be unambiguous and accurate.
3. Suitable for problems that can be modelled mathematically, e.g. numerical problems, searching and sorting problems.

Soft computing:
1. Tolerant of imprecision, uncertainty and approximation.
2. Low solution cost.
3. Does not require a mathematical model of the problem.
Its main constituents are fuzzy logic, neural networks and evolutionary (genetic) algorithms.

Hybrid computing: a combination of hard and soft computing used to solve a problem.
Hard computing vs. soft computing:
- Precision | Imprecision
- Based on binary (crisp) logic | Based on fuzzy logic and neural networks
- Strictly sequential | Allows parallel computation
- Deals with certainty | Deals with uncertainty and ambiguity
Biological neuron:
- Soma (cell body): processes the incoming signals and generates the output; the nucleus lies inside it.
- Dendrites: long, irregularly shaped filaments attached to the soma; they act as the input channels.
- Axon: a single long fibre leaving the soma that carries the impulses; it acts as the output channel, and the output of the axon is a voltage pulse (spike).
- Synapse: the junction where the axon of one neuron contacts the dendrites of another.

[Schematic diagram of a biological neuron: dendrites, soma (cell body) with nucleus, axon, synapse.]
Mathematical model of an artificial neuron:

[Figure: inputs x1, ..., xn with weights w1, ..., wn feed a summing junction computing yin = sum of xi wi; an activation function f applied over yin gives the output y.]

An artificial neural network is specified by three things:
1. Connections (architecture): single-layer feed-forward network, multilayer feed-forward network, recurrent network; a single node may also have its own feedback.
2. Training or learning rule: supervised learning, unsupervised learning, reinforcement learning.
3. Activation functions.
Activation functions:

1. Identity (linear) function: f(x) = x for all x. The output remains the same as the input; used for input-layer units.

2. Binary step function:
   f(x) = 1 if x >= θ, 0 if x < θ,
   where θ is the threshold value. Generally used in single-layer networks to convert the net input into a binary (1 or 0) output.

3. Bipolar step function:
   f(x) = 1 if x >= θ, -1 if x < θ.
   Used in single-layer networks to convert the net input into a bipolar (+1 or -1) output.

4. Sigmoidal functions: used in back-propagation networks because of the simple relationship between the value of the function and the value of its derivative, which reduces the computational burden during training. Two types:
   - Binary sigmoid: f(x) = 1 / (1 + e^(-λx)), where λ is the steepness parameter. The range of f is (0, 1) and its derivative is f'(x) = λ f(x)[1 - f(x)].
   - Bipolar sigmoid: f(x) = (1 - e^(-λx)) / (1 + e^(-λx)) = 2 / (1 + e^(-λx)) - 1. The range of f is (-1, +1) and its derivative is f'(x) = (λ/2)[1 + f(x)][1 - f(x)].

5. Ramp function:
   f(x) = 1 if x > 1; x if 0 <= x <= 1; 0 if x < 0.
Example: For a neuron with inputs [x1 x2 x3] = [0.8 0.6 0.4], weights [w1 w2 w3] = [0.1 0.3 -0.2] and bias b = 0.35, the net input is
yin = b + x1 w1 + x2 w2 + x3 w3 = 0.35 + 0.8(0.1) + 0.6(0.3) + 0.4(-0.2) = 0.53.
(i) Binary sigmoid (λ = 1): y = f(yin) = 1 / (1 + e^(-0.53)) = 0.6295.
(ii) Bipolar sigmoid: y = f(yin) = 2 / (1 + e^(-0.53)) - 1 = 0.259.
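A small Python sketch of these activation functions (the function names are my own); it reproduces the worked example above:

import math

def identity(x):
    return x

def binary_step(x, theta=0.0):
    return 1 if x >= theta else 0

def bipolar_step(x, theta=0.0):
    return 1 if x >= theta else -1

def binary_sigmoid(x, lam=1.0):
    # range (0, 1); derivative is lam * f(x) * (1 - f(x))
    return 1.0 / (1.0 + math.exp(-lam * x))

def bipolar_sigmoid(x, lam=1.0):
    # range (-1, +1); derivative is (lam / 2) * (1 + f(x)) * (1 - f(x))
    return 2.0 / (1.0 + math.exp(-lam * x)) - 1.0

def ramp(x):
    return 1.0 if x > 1 else (x if x >= 0 else 0.0)

yin = 0.35 + 0.8 * 0.1 + 0.6 * 0.3 + 0.4 * (-0.2)   # = 0.53
print(round(binary_sigmoid(yin), 4))    # 0.6295
print(round(bipolar_sigmoid(yin), 4))   # 0.259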
Example 1: Implement the AND function using a McCulloch-Pitts neuron.

Truth table for AND:
x1 x2 | y
 1  1 | 1
 1  0 | 0
 0  1 | 0
 0  0 | 0

Take the weights w1 = w2 = 1. The net input yin = x1 w1 + x2 w2 for the four inputs is:
(1,1): yin = 2; (1,0): yin = 1; (0,1): yin = 1; (0,0): yin = 0.

For an AND function the output is 1 only for the input (1,1), i.e. only when the net input is 2. Based on these net input values it is possible to fire the neuron for input (1,1) only if the threshold is set to 2. This satisfies the McCulloch-Pitts condition θ >= n·w - p, where n = 2 inputs, excitatory weight w = 1 and inhibitory weight p = 0: θ >= 2(1) - 0 = 2.

So the output of neuron Y can be written as
y = f(yin) = 1 if yin >= 2; 0 if yin < 2.
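A minimal sketch of this neuron in Python (names are mine):

def mp_neuron(inputs, weights, theta):
    # McCulloch-Pitts neuron: fires (1) when the net input reaches the threshold
    yin = sum(x * w for x, w in zip(inputs, weights))
    return 1 if yin >= theta else 0

# AND: w1 = w2 = 1, theta = 2
for x1, x2 in [(1, 1), (1, 0), (0, 1), (0, 0)]:
    print((x1, x2), mp_neuron([x1, x2], [1, 1], theta=2))
# (1, 1) 1   (1, 0) 0   (0, 1) 0   (0, 0) 0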
Example 2: Implement the XOR function using McCulloch-Pitts neurons.

Truth table for XOR:
x1 x2 | y
 0  0 | 0
 0  1 | 1
 1  0 | 1
 1  1 | 0

A single-layer net is not sufficient to represent XOR: it cannot be represented by a single McCulloch-Pitts neuron, so a hidden layer is needed. Decompose the function as
y = z1 OR z2, where
z1 = x1 AND NOT x2 (net input zin1 = x1 w11 + x2 w21)
z2 = NOT x1 AND x2 (net input zin2 = x1 w12 + x2 w22)

Case 1 (unit z1, try w11 = 1, w21 = 1):
(0,0): zin1 = 0; (0,1): zin1 = 1; (1,0): zin1 = 1; (1,1): zin1 = 2.
z1 must be 1 only for (1,0), so no threshold works: it is not possible to fire z1 using these weights.

Case 2 (unit z1, try w11 = 1, w21 = -1):
(0,0): zin1 = 0(1) + 0(-1) = 0 → 0
(0,1): zin1 = 0(1) + 1(-1) = -1 → 0
(1,0): zin1 = 1(1) + 0(-1) = 1 → 1
(1,1): zin1 = 1(1) + 1(-1) = 0 → 0
With threshold θ = 1 (θ >= n·w - p = 2(1) - 1 = 1), the output z1 = f(zin1) = 1 if zin1 >= 1 and 0 otherwise, which fires only for (1,0), as required.

By symmetry, unit z2 uses w12 = -1, w22 = 1 with the same threshold, firing only for (0,1).

For the output unit Y (y = z1 OR z2), take v1 = v2 = 1, so yin = z1 v1 + z2 v2:
(0,0): z = (0,0) → yin = 0 → y = 0
(0,1): z = (0,1) → yin = 1 → y = 1
(1,0): z = (1,0) → yin = 1 → y = 1
(1,1): z = (0,0) → yin = 0 → y = 0
With threshold 1, y = f(yin) = 1 if yin >= 1 and 0 otherwise, which reproduces XOR.
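The same sketch extends to the two-layer XOR net (mp_neuron as defined in the previous example):

def mp_neuron(inputs, weights, theta):
    yin = sum(x * w for x, w in zip(inputs, weights))
    return 1 if yin >= theta else 0

def xor(x1, x2):
    z1 = mp_neuron([x1, x2], [1, -1], theta=1)    # x1 AND NOT x2
    z2 = mp_neuron([x1, x2], [-1, 1], theta=1)    # NOT x1 AND x2
    return mp_neuron([z1, z2], [1, 1], theta=1)   # z1 OR z2

for p in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(p, xor(*p))   # 0, 1, 1, 0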
Linear Separability

- Linear separability is the concept wherein the separation of the input space into regions is based on whether the network response is positive or negative.
- A decision line is drawn to separate the positive and negative responses.
- The decision line is also called the decision-making line or decision-support line; it is also known as the linear-separable line.
- To classify the patterns, the linear separability concept is used; the concept is based on the output response.

If the bipolar step activation function is used over the calculated net input yin, the value of the function is 1 for a positive net input and -1 for a negative net input. The net input falls on one side of the decision boundary when yin > 0 and on the other side when yin < 0, so the decision boundary is determined by the relation
b + Σ xi wi = 0.
For two inputs,
b + w1 x1 + w2 x2 = 0,
which can be rearranged as the straight line
x2 = -(w1/w2) x1 - (b/w2).
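As a small sketch, the response region of a point can be read off the sign of this expression (the weights below come from the Hebb-trained AND net derived later in these notes):

def response_side(x1, x2, w1, w2, b):
    # which side of the decision line b + w1*x1 + w2*x2 = 0 the point lies on
    net = b + w1 * x1 + w2 * x2
    return "positive" if net > 0 else ("negative" if net < 0 else "on the line")

# AND boundary x2 = -x1 + 1, i.e. w1 = w2 = 2, b = -2
print(response_side(1, 1, 2, 2, -2))    # positive
print(response_side(-1, 1, 2, 2, -2))   # negative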
Hebb Network

- According to the Hebb rule, the weight vector is found to increase proportionately to the product of the input and the learning signal; here the learning signal means the neuron's output.
- Weight update in the Hebb rule:
  wi(new) = wi(old) + xi y
- The Hebb rule is more suited for bipolar data than binary data. If binary data is used, the weight updation formula cannot distinguish between two conditions, namely:
  1. a training pair in which an input unit is "on" and the target is "off";
  2. a training pair in which both the input unit and the target are "off".

Training algorithm:
Step 0: Initialize the weights; initially they may be zero, i.e. wi = 0 for i = 1 to n, where n = total number of input neurons.
Step 1: Perform Steps 2-4 for each input training vector and target output pair s : t.
Step 2: Set the activations of the input units, i = 1 to n: xi = si.
Step 3: Set the activation of the output unit: y = t.
Step 4: Adjust the weights and bias:
  wi(new) = wi(old) + xi y
  b(new) = b(old) + y

The Hebb rule is used for pattern classification, pattern association, etc.
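A short Python sketch of Steps 0-4 (function name mine); on the bipolar AND data it reproduces the weights derived step by step in the example below:

def hebb_train(samples):
    # samples: list of ((x1, ..., xn), t) bipolar training pairs
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0                            # Step 0: zero initial weights
    for x, t in samples:                             # Steps 1-3: one pass, y = t
        w = [wi + xi * t for wi, xi in zip(w, x)]    # Step 4: wi(new) = wi(old) + xi*y
        b += t
    return w, b

and_data = [((1, 1), 1), ((1, -1), -1), ((-1, 1), -1), ((-1, -1), -1)]
print(hebb_train(and_data))   # ([2.0, 2.0], -2.0)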
Example: Design a Hebb net to implement the logical AND function (use bipolar inputs and targets).

x1 x2  b | y
 1  1  1 |  1
 1 -1  1 | -1
-1  1  1 | -1
-1 -1  1 | -1

The network is trained using the Hebb training algorithm. Initially the weights and bias are set to zero: w1 = 0, w2 = 0, b = 0.

For the first input [x1 x2 b] = [1 1 1], target = 1 (x1 = 1, x2 = 1, y = 1):
w1(new) = w1(old) + x1 y = 0 + 1(1) = 1
w2(new) = w2(old) + x2 y = 0 + 1(1) = 1
b(new) = b(old) + y = 0 + 1 = 1

For the second input [1 -1 1], target = -1:
w1(new) = 1 + 1(-1) = 0
w2(new) = 1 + (-1)(-1) = 2
b(new) = 1 + (-1) = 0

For the third input [-1 1 1], target = -1:
w1(new) = 0 + (-1)(-1) = 1
w2(new) = 2 + 1(-1) = 1
b(new) = 0 + (-1) = -1

For the fourth input [-1 -1 1], target = -1:
w1(new) = 1 + (-1)(-1) = 2
w2(new) = 1 + (-1)(-1) = 2
b(new) = -1 + (-1) = -2

Final weights after one epoch: w1 = 2, w2 = 2, b = -2.
2 ieee Pas insted oh6 Fax seperoing dre eft ay given dy ah = fe 7 ae if
oundeeg, divs 1h” thy valuos 2
20/940 hel rah girth de
tire vebhonue on one Arde ¥
ait putponse am Ale siden th
gia as
Ant C/I) +2g = 0
4a- O ye oy sh wad
Fou rrp Dit) wer ur esq!
wan D> wy = 1-1 —D
fu x" Up fad ee. a=2 bao
AqQ=OXX)—6 => %=O —0
Fou mrp [iy 1] wet yeas b= Al
y= she aoe aye yt! —@)
Fx r/p Etj-1 1] 22 meee ph=-d
ag ae ha ES eq = 41 #1 ——D
a a
oo) qe oe] aly
bgt BP | ums +
xed x, ; al =
hy y-| Cay ae
i - bd oy Gn iy
(a oy
xt T/p Ran x xp mew r/p
(Douision easy fox AD J ning Hubb gulb be cack buryg bus)
Perceptron Network

The original (single-layer) perceptron network consists of three units:
1. Sensory unit (input unit)
2. Associator unit (hidden unit)
3. Response unit (output unit)

[Figure: sensory unit → associator unit → response unit, with the output y produced by the response unit.]

- The sensory units are connected to the associator units with fixed weights having values 1, 0 or -1, which are assigned at random.
- A binary activation function is used in the sensory unit and the associator unit.
- The response unit has an activation of 1, 0 or -1; a binary step function with a fixed threshold θ is used as its activation. The output signals that are sent from the associator units to the response unit are only binary.
- The output of the perceptron network is y = f(yin), where the activation function is
  f(yin) = 1 if yin > θ; 0 if -θ <= yin <= θ; -1 if yin < -θ.
- The perceptron learning rule is used for the weight updation between the associator and response units. The error calculation is based on the comparison of the values of the targets with those of the computed outputs.
- The weights on the connections from the units that send a nonzero signal get adjusted suitably.
- The weights are adjusted on the basis of the learning rule if an error has occurred for a particular training pattern, i.e. if y ≠ t:
  wi(new) = wi(old) + α t xi
  b(new) = b(old) + α t
  If no error occurs, there is no weight updation, and hence the training process may be stopped.
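A Python sketch of this training loop (function names mine); it applies the same error-driven update used in the worked AND example that follows:

def step(yin, theta):
    # perceptron activation: 1, 0 or -1 depending on the threshold band
    if yin > theta:
        return 1
    if yin < -theta:
        return -1
    return 0

def perceptron_train(samples, alpha=1.0, theta=0.0, max_epochs=50):
    # samples: list of ((x1, ..., xn), t); stops when an epoch has no updates
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(max_epochs):
        changed = False
        for x, t in samples:
            yin = b + sum(wi * xi for wi, xi in zip(w, x))
            if step(yin, theta) != t:                 # update only on error
                w = [wi + alpha * t * xi for wi, xi in zip(w, x)]
                b += alpha * t
                changed = True
        if not changed:
            break
    return w, b

# AND with binary inputs and bipolar targets (the example below)
and_data = [((1, 1), 1), ((1, 0), -1), ((0, 1), -1), ((0, 0), -1)]
print(perceptron_train(and_data))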
[Flowchart for the training process of a single-output perceptron network.]

Example: Implement the AND function using a perceptron network (binary inputs, bipolar targets).

x1 x2 | t
 1  1 |  1
 1  0 | -1
 0  1 | -1
 0  0 | -1

Initially w1 = w2 = b = 0, θ = 0 and α = 1. When all the training pairs have been presented once, one epoch is complete.

For the first input [1 1], t = 1:
Net input yin = b + x1 w1 + x2 w2 = 0 + 1(0) + 1(0) = 0.
Calculate the output: y = f(yin) = f(0) = 0.
Check whether t = y: here t = 1 and y = 0, so t ≠ y; hence update the weights:
w1(new) = w1(old) + α t x1 = 0 + 1(1)(1) = 1
w2(new) = 0 + 1(1)(1) = 1
b(new) = 0 + 1(1) = 1

For the second input [1 0], t = -1:
yin = 1 + 1(1) + 0(1) = 2, so y = f(2) = 1.
Check: t = -1 ≠ y = 1, so update:
w1(new) = 1 + 1(-1)(1) = 0
w2(new) = 1 + 1(-1)(0) = 1
b(new) = 1 + 1(-1) = 0

For the third input [0 1], t = -1:
yin = 0 + 0(0) + 1(1) = 1, so y = f(1) = 1.
t ≠ y, so again update the weights:
w1(new) = 0 + 1(-1)(0) = 0
w2(new) = 1 + 1(-1)(1) = 0
b(new) = 0 + 1(-1) = -1

For the fourth input [0 0], t = -1:
yin = -1, so y = -1 = t and no weight change is needed.

Training continues epoch by epoch in the same way until an entire epoch passes with no weight change.
Example: Find the weights using the perceptron network for the following patterns when all of them are presented only one time (α = 1, θ = 0; bipolar inputs and targets, corresponding to the AND NOT function):

Inputs (x1 x2 1) | t  | yin | y  | Weights (w1 w2 b), initial (0 0 0)
( 1  1 1)        | -1 |  0  |  0 | -1 -1 -1
( 1 -1 1)        |  1 | -1  | -1 |  0 -2  0
(-1  1 1)        | -1 | -2  | -1 |  0 -2  0
(-1 -1 1)        | -1 |  2  |  1 |  1 -1 -1
Example: Find the weights required to perform the following classification using the perceptron network. The vectors (1, 1, 1, 1) and (-1, 1, -1, -1) belong to class 1 (target = 1); the vectors (1, 1, 1, -1) and (1, -1, -1, 1) belong to class -1 (target = -1). Assume a learning rate of 1 and initial weights of 0.
Inputs (x1 x2 x3 x4 1) | t  | yin | y  | Weights (w1 w2 w3 w4 b), initial (0 0 0 0 0)

Epoch-1:
( 1  1  1  1 1) |  1 |  0 |  0 |  1 1 1 1 1
(-1  1 -1 -1 1) |  1 | -1 | -1 |  0 2 0 0 2
( 1  1  1 -1 1) | -1 |  4 |  1 | -1 1 -1 1 1
( 1 -1 -1  1 1) | -1 |  1 |  1 | -2 2 0 0 0

Epoch-2:
( 1  1  1  1 1) |  1 |  0 |  0 | -1 3 1 1 1
(-1  1 -1 -1 1) |  1 |  3 |  1 | -1 3 1 1 1
( 1  1  1 -1 1) | -1 |  3 |  1 | -2 2 0 2 0
( 1 -1 -1  1 1) | -1 | -2 | -1 | -2 2 0 2 0

Epoch-3:
( 1  1  1  1 1) |  1 |  2 |  1 | -2 2 0 2 0
(-1  1 -1 -1 1) |  1 |  2 |  1 | -2 2 0 2 0
( 1  1  1 -1 1) | -1 | -2 | -1 | -2 2 0 2 0
( 1 -1 -1  1 1) | -1 | -2 | -1 | -2 2 0 2 0

All the patterns are classified correctly in Epoch-3, so the final weights are w = (-2, 2, 0, 2) with b = 0.
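The epoch-by-epoch table above can be regenerated with a short standalone script (a sketch using the same perceptron rule as in the previous example):

def train(samples, alpha=1.0, theta=0.0):
    # run epoch by epoch until an epoch produces no weight change
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    while True:
        changed = False
        for x, t in samples:
            yin = b + sum(wi * xi for wi, xi in zip(w, x))
            y = 1 if yin > theta else (-1 if yin < -theta else 0)
            if y != t:
                w = [wi + alpha * t * xi for wi, xi in zip(w, x)]
                b += alpha * t
                changed = True
        if not changed:
            return w, b

data = [((1, 1, 1, 1), 1), ((-1, 1, -1, -1), 1),
        ((1, 1, 1, -1), -1), ((1, -1, -1, 1), -1)]
print(train(data))   # ([-2.0, 2.0, 0.0, 2.0], 0.0) after 3 epochs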
Adaptive Linear Neuron (Adaline)

- In the perceptron network the learning rule is based on the Hebbian rule, but in Adaline the learning rule is based on the delta rule (also called the least mean square, LMS, or Widrow-Hoff rule).
- The delta rule is derived from gradient descent: the weights on the ith input (i = 1 to n) are adjusted so as to reduce the difference between the net input to the output unit and the target value:
  Δwi = α (t - yin) xi      (single output unit)
  Δwij = α (tj - yinj) xi   (several output units)

[Architecture: inputs x1, ..., xn with weights w1, ..., wn and bias b feed an adaptive linear combiner computing yin = b + Σ xi wi; the error e = (t - yin) drives the weight adjustment.]

Training algorithm:
Step 0: The weights and bias are set to some random values, but not zero; set the learning rate parameter α.
Step 1: Perform Steps 2-6 when the stopping condition is false.
Step 2: Perform Steps 3-5 for each bipolar training pair s : t.
Step 3: Set the activations for the input units, i = 1 to n: xi = si.
Step 4: Calculate the net input to the output unit:
  yin = b + Σ xi wi
Step 5: Update the weights and bias, for i = 1 to n:
  wi(new) = wi(old) + α (t - yin) xi
  b(new) = b(old) + α (t - yin)
Step 6: If the highest weight change that occurred during training is smaller than a specified tolerance, then stop the training process; else continue. (This is the stopping condition for the network.)
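A Python sketch of Steps 0-6 (function name mine). Its first update on the bipolar OR data reproduces the hand calculation that follows (yin = 0.3, new weights 0.17); since the delta rule keeps making small corrections, a maximum epoch count is used alongside the tolerance test:

def adaline_train(samples, alpha=0.1, w_init=0.1, tol=1e-3, max_epochs=100):
    # samples: list of ((x1, ..., xn), t) bipolar training pairs
    n = len(samples[0][0])
    w, b = [w_init] * n, w_init                    # Step 0: small nonzero values
    for _ in range(max_epochs):                    # Step 1
        biggest = 0.0
        for x, t in samples:                       # Steps 2-3
            yin = b + sum(wi * xi for wi, xi in zip(w, x))   # Step 4
            delta = alpha * (t - yin)
            w = [wi + delta * xi for wi, xi in zip(w, x)]    # Step 5
            b += delta
            biggest = max(biggest, abs(delta))
        if biggest < tol:                          # Step 6: stopping condition
            break
    return w, b

or_data = [((1, 1), 1), ((1, -1), 1), ((-1, 1), 1), ((-1, -1), -1)]
print(adaline_train(or_data))   # weights settle near the LMS solution w1 = w2 = b = 0.5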
Example: Implement the OR function with bipolar inputs and targets using an Adaline network.

x1 x2 | t
 1  1 |  1
 1 -1 |  1
-1  1 |  1
-1 -1 | -1

Solution: Take the initial weights w1 = w2 = b = 0.1 and learning rate α = 0.1.
The initial weights are all assumed to be small values, here 0.1, and the learning rate is also set to 0.1. The least mean square error can also be calculated; the weights are updated until the least mean square error is obtained.

For the first input sample, x1 = 1, x2 = 1, t = 1, we calculate the net input:
yin = b + x1 w1 + x2 w2
    = 0.1 + 1 × 0.1 + 1 × 0.1
yin = 0.3

Now compute (t - yin) = (1 - 0.3) = 0.7.

Updating the weights we obtain
wi(new) = wi(old) + α(t - yin) xi
where α(t - yin) xi is called the weight change Δwi. The new weights are obtained as
w1(new) = w1(old) + Δw1 = 0.1 + 0.1 × 0.7 × 1 = 0.17
w2(new) = w2(old) + Δw2 = 0.1 + 0.1 × 0.7 × 1 = 0.17
b(new) = b(old) + Δb = 0.1 + 0.1 × 0.7 = 0.17
where
Δw1 = α(t - yin) x1
Δw2 = α(t - yin) x2
Δb = α(t - yin)

Now we calculate the error:
E = (t - yin)² = (0.7)² = 0.49

The final weights after presenting the first input sample are
[w1 w2 b] = [0.17 0.17 0.17]
and the error E = 0.49.
The above calculations are performed for all the input samples and the error is calculated. One epoch is completed when all the input patterns are presented. Summing up all the errors obtained for each input sample during one epoch will give the total mean square error of that epoch. The network training is continued until this error is minimized to a very small value.

Adopting the method above, the network training is done for the OR function using the Adaline network and is tabulated below for α = 0.1.

Inputs (x1 x2 1) | t | Net input yin | (t - yin) | Weight changes (Δw1 Δw2 Δb) | Weights (w1 w2 b), initial (0.1 0.1 0.1) | Error (t - yin)²

Epoch-1:
( 1  1 1) | 1 | 0.3   | 0.7   |  0.07    0.07   0.07   | 0.17   0.17   0.17   | 0.49
( 1 -1 1) | 1 | 0.17  | 0.83  |  0.083  -0.083  0.083  | 0.253  0.087  0.253  | 0.69
(-1  1 1) | 1 | 0.087 | 0.913 | -0.0913  0.0913 0.0913 | 0.1617 0.1783 0.3443 | 0.83
(the table continues in the same way for the fourth pattern and for the following epochs)
[Solved problem: training a Madaline network (MRI algorithm) for the XOR function with bipolar inputs and targets. The two hidden Adaline units z1, z2 feed a fixed output unit with v1 = v2 = 0.5 and b3 = 0.5; learning rate α = 0.5; activation f(x) = 1 if x >= 0, -1 if x < 0. For the first training pair x1 = 1, x2 = 1, t = -1, the hidden net inputs are zin1 = 0.55 and zin2 = 0.45.]

Hence,
z1 = f(zin1) = f(0.55) = 1
z2 = f(zin2) = f(0.45) = 1

After computing the output of the hidden units, find the net input entering the output unit:
yin = b3 + z1 v1 + z2 v2
    = 0.5 + 1 × 0.5 + 1 × 0.5
yin = 1.5

Apply the activation function over the net input yin to calculate the output y:
y = f(yin) = f(1.5) = 1

Since t ≠ y, weight updation has to be performed. Also, since t = -1, the weights are updated on z1 and z2, which have positive net input. Since here both net inputs zin1 and zin2 are positive, updating the weights and bias on both hidden units, we obtain
wij(new) = wij(old) + α(t - zinj) xi
bj(new) = bj(old) + α(t - zinj)

This implies:
w11(new) = w11(old) + α(t - zin1) x1 = 0.05 + 0.5(-1 - 0.55) × 1 = -0.725
w12(new) = w12(old) + α(t - zin2) x1 = 0.1 + 0.5(-1 - 0.45) × 1 = -0.625
b1(new) = b1(old) + α(t - zin1) = 0.3 + 0.5(-1 - 0.55) = -0.475
w21(new) = w21(old) + α(t - zin1) x2 = 0.2 + 0.5(-1 - 0.55) × 1 = -0.575
w22(new) = w22(old) + α(t - zin2) x2 = 0.2 + 0.5(-1 - 0.45) × 1 = -0.525
b2(new) = b2(old) + α(t - zin2) = 0.15 + 0.5(-1 - 0.45) = -0.575
All the weights and biases between the input layer and the hidden layer are adjusted. This completes the training for the first input pair. The same process is repeated for the remaining training pairs until the weights converge. It is found that the weights converge at the end of 3 epochs.

[Table: training performance of the Madaline network for the XOR function over 3 epochs.]

The network architecture for the Madaline network with the final weights for the XOR function is shown in Figure 11.
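A sketch of this first MRI update in Python (the initial values are those read off the calculation above):

alpha, t = 0.5, -1
x = (1, 1)
w11, w21, b1 = 0.05, 0.2, 0.3     # weights into hidden unit z1
w12, w22, b2 = 0.1, 0.2, 0.15     # weights into hidden unit z2
v1 = v2 = b3 = 0.5                # fixed output-unit weights

f = lambda s: 1 if s >= 0 else -1
zin1 = b1 + x[0] * w11 + x[1] * w21            # 0.55
zin2 = b2 + x[0] * w12 + x[1] * w22            # 0.45
y = f(b3 + f(zin1) * v1 + f(zin2) * v2)        # f(1.5) = 1

if y != t and t == -1:
    # MRI: push every hidden unit with positive net input toward a negative one
    if zin1 > 0:
        w11 += alpha * (t - zin1) * x[0]       # -0.725
        w21 += alpha * (t - zin1) * x[1]       # -0.575
        b1 += alpha * (t - zin1)               # -0.475
    if zin2 > 0:
        w12 += alpha * (t - zin2) * x[0]       # -0.625
        w22 += alpha * (t - zin2) * x[1]       # -0.525
        b2 += alpha * (t - zin2)               # -0.575
print(w11, w21, b1, w12, w22, b2)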
Back-Propagation Network (BPN)

- A BPN is a multilayer feed-forward network (input layer → hidden layer → output layer) trained by the backward propagation of the error.
- The neurons present in the hidden and output layers have biases; a bias acts like a weight on a connection from a unit whose output is always 1.
- The inputs sent to the BPN and the output obtained from the network can be either binary (0, 1) or bipolar (-1, +1).
- The activation function can be any function that increases monotonically and whose derivative exists.
3.5.3 Flowchart for Training Process

The flowchart for the training process of the BPN is shown in Figure 3-19. The terminologies used in the flowchart and in the training algorithm are as follows:

x = input training vector (x1, ..., xn)
t = target output vector (t1, ..., tm)
α = learning rate parameter
xi = input unit i (since the input layer uses the identity activation function, the input and output signals here are the same)
v0j = bias on the jth hidden unit
w0k = bias on the kth output unit
zj = hidden unit j. The net input to zj is
  zinj = v0j + Σi xi vij
and the output is
  zj = f(zinj)
yk = output unit k. The net input to yk is
  yink = w0k + Σj zj wjk
and the output is
  yk = f(yink)
δk = error correction weight adjustment for wjk that is due to an error at output unit yk, which is back-propagated to the hidden units that feed into unit yk
δj = error correction weight adjustment for vij that is due to the back-propagation of error to the hidden unit zj

Also, it should be noted that the commonly used activation functions are the binary sigmoidal and bipolar sigmoidal activation functions (discussed in Section 2.3.3). These functions are used in the BPN because of the following characteristics:
1. Continuity
2. Differentiability
3. Nondecreasing monotony.

The range of the binary sigmoid is from 0 to 1, and for the bipolar sigmoid it is from -1 to +1.
3.5.4 Training Algorithm

The error back-propagation learning algorithm can be outlined as follows:

Step 0: Initialize weights and learning rate (take some small random values).
Step 1: Perform Steps 2-9 when stopping condition is false.
Step 2: Perform Steps 3-8 for each training pair.

Feed-forward phase (Phase I):
Step 3: Each input unit receives input signal xi and sends it to the hidden units (i = 1 to n).
Step 4: Each hidden unit zj (j = 1 to p) sums its weighted input signals to calculate the net input:
  zinj = v0j + Σi xi vij
Calculate the output of the hidden unit by applying its activation function over zinj (binary or bipolar sigmoidal activation function):
  zj = f(zinj)
and send the output signal from the hidden unit to the input of the output layer units.
Step 5: For each output unit yk (k = 1 to m), calculate the net input:
  yink = w0k + Σj zj wjk
and apply the activation function to compute the output signal:
  yk = f(yink)

Back-propagation of error (Phase II):
Step 6: Each output unit yk (k = 1 to m) receives a target pattern corresponding to the input training pattern and computes the error correction term
  δk = (tk - yk) f'(yink)
On the basis of the calculated error correction term, update the changes in weights and bias:
  Δwjk = α δk zj
  Δw0k = α δk
Also, send δk to the hidden layer backwards.
Step 7: Each hidden unit zj (j = 1 to p) sums its delta inputs from the output units:
  δinj = Σk δk wjk
The term δinj gets multiplied with the derivative of f(zinj) to calculate the error term:
  δj = δinj f'(zinj)
On the basis of the calculated δj, update the changes in weights and bias:
  Δvij = α δj xi
  Δv0j = α δj

Weight and bias updation (Phase III):
Step 8: Each output unit yk (k = 1 to m) updates the bias and weights:
  wjk(new) = wjk(old) + Δwjk
  w0k(new) = w0k(old) + Δw0k
Each hidden unit zj (j = 1 to p) updates its bias and weights:
  vij(new) = vij(old) + Δvij
  v0j(new) = v0j(old) + Δv0j
Step 9: Check for the stopping condition. The stopping condition may be a certain number of epochs reached, or when the actual output equals the target output.

The above algorithm uses the incremental approach for the updation of weights, i.e., the weights are changed immediately after a training pattern is presented. There is another way of training, called batch-mode training, where the weights are changed only after all the training patterns are presented. The effectiveness of the two approaches depends on the problem, but batch-mode training requires additional local storage for each connection to maintain the immediate weight changes. When a BPN is used as a classifier, it is equivalent to the optimal Bayesian discriminant function for asymptotically large sets of statistically independent training patterns.
thie ence i whether the back-nrons9. Using back-propagation network, find the new weights for the net shown in Figure 12.
is presented with the input pattern (0, 1] and the target output is 1. Use a learning rate
@ = 0.25 and binary sigmoidal activation function.
Solution: The new weights are calculated based on the training algorithm in Section 3.5.4.
The initial weights are [v1 vz) v1] = [0.6 -0.1 0.3], [viz v22 vo2] = [—0.3 0.4 0.5] and
[w) w2 wo] = [0.4 0.1 -0.2], and the learning rate is @ = 0.25.
Activation function used is binary sigmoidal activation function and is given by
il
fa) =e®)
“4
Figure 12 Network.
Given the output sample [x1, x)] =
* Calculate the net input.
For 2 layer
(0, 1] and target e = 1,
inl = Vor + x11 + x2vy
=03+0x06+4+1x-01
int = 0.2
For z) layer
Rin? = Vor + x1Uj2 + x2V27
=0.5+0x 0341 x 0.4
in2 = 0.9
Applying activation to calculate the output, we obtain
1
z= fun) = Term
1 1
n= $m) = Ta = Tees
= 0.7109
+ Calculate the net input entering the output layer. For y layer
Jin = Wo + BW + BU
0.2 + 0.5498 x 0.4 + 0.7109 x 0.1
Yin = 0.09101CHAPIEN 21 ov eOTiPEU LEARNING
Applying activations to calculate the output, we obtain
y = f(yin) = 1 / (1 + e^(-0.09101)) = 0.5227

- Compute the error portion δk:
δk = (tk - yk) f'(yink)
f'(yin) = f(yin)[1 - f(yin)] = 0.5227[1 - 0.5227]
f'(yin) = 0.2495
This implies
δ1 = (1 - 0.5227)(0.2495) = 0.1191

Find the changes in weights between the hidden and output layer:
Δw1 = α δ1 z1 = 0.25 × 0.1191 × 0.5498 = 0.0164
Δw2 = α δ1 z2 = 0.25 × 0.1191 × 0.7109 = 0.02117
Δw0 = α δ1 = 0.25 × 0.1191 = 0.02978

- Compute the error portion δj between the input and hidden layer (j = 1 to 2):
δj = δinj f'(zinj), where δinj = Σk δk wjk (here there is only one output neuron):
δin1 = δ1 w1 = 0.1191 × 0.4 = 0.04764
δin2 = δ1 w2 = 0.1191 × 0.1 = 0.01191

Error, δ1 = δin1 f'(zin1):
f'(zin1) = f(zin1)[1 - f(zin1)] = 0.5498[1 - 0.5498] = 0.2475
δ1 = 0.04764 × 0.2475 = 0.0118

Error, δ2 = δin2 f'(zin2):
f'(zin2) = 0.7109[1 - 0.7109] = 0.2055
δ2 = 0.01191 × 0.2055 = 0.00245

Now find the changes in weights between the input and hidden layer:
Δv11 = α δ1 x1 = 0.25 × 0.0118 × 0 = 0
Δv21 = α δ1 x2 = 0.25 × 0.0118 × 1 = 0.00295
Δv01 = α δ1 = 0.25 × 0.0118 = 0.00295
Δv12 = α δ2 x1 = 0.25 × 0.00245 × 0 = 0
Δv22 = α δ2 x2 = 0.25 × 0.00245 × 1 = 0.0006125
Δv02 = α δ2 = 0.25 × 0.00245 = 0.0006125

- Compute the final weights of the network:
v11(new) = v11(old) + Δv11 = 0.6 + 0 = 0.6
v12(new) = v12(old) + Δv12 = -0.3 + 0 = -0.3
v21(new) = v21(old) + Δv21 = -0.1 + 0.00295 = -0.09705
v22(new) = v22(old) + Δv22 = 0.4 + 0.0006125 = 0.4006125
w1(new) = w1(old) + Δw1 = 0.4 + 0.0164 = 0.4164
w2(new) = w2(old) + Δw2 = 0.1 + 0.02117 = 0.12117
v01(new) = v01(old) + Δv01 = 0.3 + 0.00295 = 0.30295
v02(new) = v02(old) + Δv02 = 0.5 + 0.0006125 = 0.5006125
w0(new) = w0(old) + Δw0 = -0.2 + 0.02978 = -0.17022
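The whole calculation can be checked with a few lines of standalone Python (a sketch; the variable names follow the problem):

import math

f = lambda s: 1.0 / (1.0 + math.exp(-s))        # binary sigmoid
x, t, alpha = [0, 1], 1, 0.25
v11, v21, v01 = 0.6, -0.1, 0.3
v12, v22, v02 = -0.3, 0.4, 0.5
w1, w2, w0 = 0.4, 0.1, -0.2

z1 = f(v01 + x[0] * v11 + x[1] * v21)           # f(0.2) ≈ 0.5498
z2 = f(v02 + x[0] * v12 + x[1] * v22)           # f(0.9) ≈ 0.7109
y = f(w0 + z1 * w1 + z2 * w2)                   # f(0.09101) ≈ 0.5227

dk = (t - y) * y * (1 - y)                      # ≈ 0.1191
d1 = dk * w1 * z1 * (1 - z1)                    # ≈ 0.0118
d2 = dk * w2 * z2 * (1 - z2)                    # ≈ 0.00245

w1, w2, w0 = w1 + alpha * dk * z1, w2 + alpha * dk * z2, w0 + alpha * dk
v21, v01 = v21 + alpha * d1 * x[1], v01 + alpha * d1   # v11 unchanged since x1 = 0
v22, v02 = v22 + alpha * d2 * x[1], v02 + alpha * d2   # v12 unchanged
print(round(w1, 4), round(w2, 4), round(w0, 4))        # ≈ 0.4164 0.1212 -0.1702
print(round(v21, 5), round(v22, 5))                    # ≈ -0.09705 0.40061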
10. Find the new weights, using back-propagation network, for the network shown in Figure 13. The network is presented with the input pattern [-1, 1] and the target output is +1. Use a learning rate of α = 0.25 and the bipolar sigmoidal activation function.

Solution: The initial weights are [v11 v21 v01] = [0.6 -0.1 0.3], [v12 v22 v02] = [-0.3 0.4 0.5] and [w1 w2 w0] = [0.4 0.1 -0.2], and the learning rate is α = 0.25.

The activation function used is the bipolar sigmoidal activation function, which is given by
f(x) = (1 - e^(-x)) / (1 + e^(-x)) = 2 / (1 + e^(-x)) - 1

Given the input sample [x1, x2] = [-1, 1] and target t = 1:
[Figure 13: the same 2-2-1 network as in Figure 12.]

- Calculate the net input. For the z1 layer,
zin1 = v01 + x1 v11 + x2 v21
     = 0.3 + (-1) × 0.6 + 1 × (-0.1)
zin1 = -0.4
For the z2 layer,
zin2 = v02 + x1 v12 + x2 v22
     = 0.5 + (-1) × (-0.3) + 1 × 0.4
zin2 = 1.2

Applying the activation to calculate the output, we obtain
z1 = f(zin1) = (1 - e^(0.4)) / (1 + e^(0.4)) = -0.1974
z2 = f(zin2) = (1 - e^(-1.2)) / (1 + e^(-1.2)) = 0.537

- Calculate the net input entering the output layer. For the y layer,
yin = w0 + z1 w1 + z2 w2
    = -0.2 + (-0.1974) × 0.4 + 0.537 × 0.1
yin = -0.22526

Applying activations to calculate the output, we obtain
y = f(yin) = (1 - e^(0.22526)) / (1 + e^(0.22526)) = -0.1122
- Compute the error portion δk:
δk = (tk - yk) f'(yink)
For the bipolar sigmoid,
f'(yin) = 0.5[1 + f(yin)][1 - f(yin)]
        = 0.5[1 - 0.1122][1 + 0.1122]
f'(yin) = 0.4937
This implies
δ1 = (1 + 0.1122)(0.4937) = 0.5491

Find the changes in weights between the hidden and output layer:
Δw1 = α δ1 z1 = 0.25 × 0.5491 × (-0.1974) = -0.0271
Δw2 = α δ1 z2 = 0.25 × 0.5491 × 0.537 = 0.0737
Δw0 = α δ1 = 0.25 × 0.5491 = 0.1373

- Compute the error portion δj between the input and hidden layer (j = 1 to 2):
δj = δinj f'(zinj), where δinj = Σk δk wjk (only one output neuron):
δin1 = δ1 w1 = 0.5491 × 0.4 = 0.21964
δin2 = δ1 w2 = 0.5491 × 0.1 = 0.05491

Error, δ1 = δin1 f'(zin1):
f'(zin1) = 0.5[1 + z1][1 - z1] = 0.5 × (1 - 0.1974)(1 + 0.1974) = 0.4805
δ1 = 0.21964 × 0.4805 = 0.1056

Error, δ2 = δin2 f'(zin2):
δ2 = 0.05491 × 0.5 × (1 - 0.537)(1 + 0.537)
δ2 = 0.0195
Now find the changes in weights between the input and hidden layer:
Δv11 = α δ1 x1 = 0.25 × 0.1056 × (-1) = -0.0264
Δv21 = α δ1 x2 = 0.25 × 0.1056 × 1 = 0.0264
Δv01 = α δ1 = 0.25 × 0.1056 = 0.0264
Δv12 = α δ2 x1 = 0.25 × 0.0195 × (-1) = -0.0049
Δv22 = α δ2 x2 = 0.25 × 0.0195 × 1 = 0.0049
Δv02 = α δ2 = 0.25 × 0.0195 = 0.0049
- Compute the final weights of the network:
v11(new) = v11(old) + Δv11 = 0.6 - 0.0264 = 0.5736
v12(new) = v12(old) + Δv12 = -0.3 - 0.0049 = -0.3049
v21(new) = v21(old) + Δv21 = -0.1 + 0.0264 = -0.0736
v22(new) = v22(old) + Δv22 = 0.4 + 0.0049 = 0.4049
w1(new) = w1(old) + Δw1 = 0.4 - 0.0271 = 0.3729
w2(new) = w2(old) + Δw2 = 0.1 + 0.0737 = 0.1737
v01(new) = v01(old) + Δv01 = 0.3 + 0.0264 = 0.3264
v02(new) = v02(old) + Δv02 = 0.5 + 0.0049 = 0.5049
w0(new) = w0(old) + Δw0 = -0.2 + 0.1373 = -0.0627

Thus, the final weights have been computed for the network shown in Figure 13.
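The bipolar case can be checked the same way (a standalone sketch; the derivative is written in terms of the function value):

import math

f = lambda s: 2.0 / (1.0 + math.exp(-s)) - 1.0   # bipolar sigmoid
df = lambda fs: 0.5 * (1 + fs) * (1 - fs)        # f'(s) expressed via fs = f(s)
x, t, alpha = [-1, 1], 1, 0.25
v11, v21, v01 = 0.6, -0.1, 0.3
v12, v22, v02 = -0.3, 0.4, 0.5
w1, w2, w0 = 0.4, 0.1, -0.2

z1 = f(v01 + x[0] * v11 + x[1] * v21)            # f(-0.4) ≈ -0.1974
z2 = f(v02 + x[0] * v12 + x[1] * v22)            # f(1.2) ≈ 0.537
y = f(w0 + z1 * w1 + z2 * w2)                    # f(-0.22526) ≈ -0.1122

dk = (t - y) * df(y)                             # ≈ 0.5491
d1 = dk * w1 * df(z1)                            # ≈ 0.1056
d2 = dk * w2 * df(z2)                            # ≈ 0.0195

w1, w2, w0 = w1 + alpha * dk * z1, w2 + alpha * dk * z2, w0 + alpha * dk
v11, v21, v01 = v11 + alpha * d1 * x[0], v21 + alpha * d1 * x[1], v01 + alpha * d1
v12, v22, v02 = v12 + alpha * d2 * x[0], v22 + alpha * d2 * x[1], v02 + alpha * d2
print(round(v11, 4), round(v21, 4), round(w0, 4))   # ≈ 0.5736 -0.0736 -0.0627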