Machine Learning For Signal Processing Lecture 30


Machine Learning for

Signal Processing

CIS 4190/5190: Applied Machine Learning (Fall 2022) (upenn.edu)


Support Vector Machine (SVM) – Introduction
 SVM is one of the most popular supervised learning algorithms; it is used
for classification as well as regression problems.

 Primarily, it is used for Classification problems in Machine Learning.

 The goal of the SVM algorithm is to find the best line or decision
boundary that segregates the n-dimensional feature space into classes, so
that new data points can easily be placed in the correct category.
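The idea above can be sketched in code. This is a minimal illustration (not part of the original slides) using scikit-learn's `SVC`; the toy dataset is the one used in the solved example later in this lecture.

```python
# Minimal sketch: fitting a linear SVM classifier with scikit-learn.
import numpy as np
from sklearn.svm import SVC

# Toy 2-D dataset: four positive and four negative points
X = np.array([[3, 1], [3, -1], [6, 1], [6, -1],   # class +1
              [1, 0], [0, 1], [0, -1], [-1, 0]])  # class -1
y = np.array([1, 1, 1, 1, -1, -1, -1, -1])

# A very large C approximates the hard-margin SVM discussed in the slides
clf = SVC(kernel="linear", C=1e6)
clf.fit(X, y)

# New data points fall on one side of the learned decision boundary
print(clf.predict([[4, 0], [0, 0]]))
```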
Support Vector Machine (SVM) – Hyperplane
 There can be multiple lines/decision boundaries to segregate the classes
in n-dimensional space.
 We need to find the best decision boundary for classifying the
data points.
 This best boundary is known as the hyperplane of SVM.
 The dimension of the hyperplane depends on the number of features in the
dataset.
 If there are 2 features, then the hyperplane will be a straight line.
 If there are 3 features, then the hyperplane will be a 2-dimensional
plane.
 The hyperplane providing the maximum distance to the data points
(maximum margin) is selected.
 The data points or vectors that are closest to the hyperplane and which
affect the position of the hyperplane are termed support vectors.
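As a sketch of this (an assumption, not from the slides): in scikit-learn, the fitted `SVC` exposes exactly these margin-defining points through its `support_vectors_` attribute.

```python
# Sketch: inspecting the support vectors of a fitted linear SVM.
import numpy as np
from sklearn.svm import SVC

X = np.array([[3, 1], [3, -1], [6, 1], [6, -1],
              [1, 0], [0, 1], [0, -1], [-1, 0]], dtype=float)
y = np.array([1, 1, 1, 1, -1, -1, -1, -1])

# Large C approximates the hard-margin case
clf = SVC(kernel="linear", C=1e6).fit(X, y)

# Only the points closest to the hyperplane are kept as support vectors;
# the other points do not affect the decision boundary at all.
print(clf.support_vectors_)
```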
Support Vector Machine (SVM) – Types
 Linear SVM:
o Used for linearly separable data, i.e., when a dataset can be classified
into two classes using a single straight line.
o The classifier is called a linear SVM classifier.
 Non-Linear SVM:
o Used for non-linearly separable data, i.e., when a dataset cannot be
classified by using a single straight line.
o The classifier is called a non-linear SVM classifier.
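The distinction can be demonstrated with a small sketch (not from the slides): the XOR pattern is the classic non-linearly-separable dataset, and a non-linear SVM with an RBF kernel handles it while a linear SVM cannot.

```python
# Sketch: linear vs. non-linear SVM on a non-linearly separable dataset.
import numpy as np
from sklearn.svm import SVC

# XOR pattern: no single straight line can separate the two classes
X = np.array([[0, 0], [1, 1], [0, 1], [1, 0]], dtype=float)
y = np.array([-1, -1, 1, 1])

linear = SVC(kernel="linear").fit(X, y)
rbf = SVC(kernel="rbf", gamma=2.0).fit(X, y)

print(linear.score(X, y))  # a linear boundary cannot get all 4 right
print(rbf.score(X, y))     # the RBF kernel separates the XOR pattern
```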
Linear SVM – Solved Example
 Suppose we are given the following set of data points.

o Positively labelled data points of class ω1:
  X1 = {(3, 1)ᵀ, (3, −1)ᵀ, (6, 1)ᵀ, (6, −1)ᵀ}

o Negatively labelled data points of class ω2:
  X2 = {(1, 0)ᵀ, (0, 1)ᵀ, (0, −1)ᵀ, (−1, 0)ᵀ}
Linear SVM – Solved Example
 By inspection of a plot of the data, the three support vectors are:
  s1 = (1, 0)ᵀ from class ω2, and s2 = (3, 1)ᵀ, s3 = (3, −1)ᵀ from class ω1.
 Each support vector is augmented with a 1 as a bias input.

 So, if s1 = (1, 0)ᵀ, then s̃1 = (1, 0, 1)ᵀ.

 Similarly, if s2 = (3, 1)ᵀ, then s̃2 = (3, 1, 1)ᵀ.

 And, if s3 = (3, −1)ᵀ, then s̃3 = (3, −1, 1)ᵀ.
Linear SVM – Solved Example
 Calculate α1, α2, and α3 from the following three equations (one per
support vector, with target −1 for class ω2 and +1 for class ω1):

  α1 (s̃1·s̃1) + α2 (s̃2·s̃1) + α3 (s̃3·s̃1) = −1
  α1 (s̃1·s̃2) + α2 (s̃2·s̃2) + α3 (s̃3·s̃2) = +1
  α1 (s̃1·s̃3) + α2 (s̃2·s̃3) + α3 (s̃3·s̃3) = +1

 Substituting the augmented support vectors gives:

  2α1 + 4α2 + 4α3 = −1
  4α1 + 11α2 + 9α3 = +1
  4α1 + 9α2 + 11α3 = +1
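Assuming the augmented support vectors s̃1 = (1, 0, 1)ᵀ, s̃2 = (3, 1, 1)ᵀ, s̃3 = (3, −1, 1)ᵀ with targets (−1, +1, +1), the three equations form a 3×3 linear system that can be solved numerically; a minimal NumPy sketch (not part of the slides):

```python
# Sketch: solving the three margin equations for alpha1..alpha3 with NumPy.
import numpy as np

# Augmented support vectors (each original vector with a bias 1 appended)
s = np.array([[1, 0, 1],
              [3, 1, 1],
              [3, -1, 1]], dtype=float)
y = np.array([-1.0, 1.0, 1.0])  # targets: -1 for class w2, +1 for class w1

# Gram matrix of dot products between augmented support vectors;
# the three equations are exactly G @ alpha = y
G = s @ s.T
alpha = np.linalg.solve(G, y)
print(alpha)  # alpha1, alpha2, alpha3
```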
Linear SVM – Solved Example
 Solving the three equations gives α1 = −3.5, α2 = 0.75, α3 = 0.75.

 The augmented weight vector is then
  w̃ = Σ αi s̃i = −3.5 (1, 0, 1)ᵀ + 0.75 (3, 1, 1)ᵀ + 0.75 (3, −1, 1)ᵀ
    = (1, 0, −2)ᵀ.

 As each vector was augmented with a bias 1, the last entry of w̃ gives the
hyperplane offset b.

 Hyperplane equation: y = w·x + b, where w = (1, 0)ᵀ and b = −2, i.e.,
the separating hyperplane is the vertical line x1 = 2.
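The hand-derived solution w = (1, 0)ᵀ, b = −2 can be cross-checked numerically; a sketch (not from the slides) using scikit-learn's `SVC`, where a very large C approximates the hard-margin SVM of this example:

```python
# Sketch: verifying the worked example with scikit-learn's linear SVC.
import numpy as np
from sklearn.svm import SVC

X = np.array([[3, 1], [3, -1], [6, 1], [6, -1],
              [1, 0], [0, 1], [0, -1], [-1, 0]], dtype=float)
y = np.array([1, 1, 1, 1, -1, -1, -1, -1])

clf = SVC(kernel="linear", C=1e6).fit(X, y)

# coef_ holds w and intercept_ holds b of the learned hyperplane
print(clf.coef_)       # close to [[1, 0]]  -> w = (1, 0)^T
print(clf.intercept_)  # close to [-2]      -> b = -2
```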