Practice Assignment 09: Support Vector Machines (SVM)

The document discusses various concepts related to Support Vector Machines (SVM) and perceptrons, including the differences in their decision boundaries, the role of support vectors, and the use of Lagrangian multipliers for optimization. It also addresses how to handle noisy data and the introduction of slack variables in SVM formulations, as well as the 'kernel trick' for making non-linearly separable data linearly separable. Additionally, it includes a specific dataset for which decision boundaries and margins need to be computed.

1. With an example, explain the difference between the decision boundary of a perceptron and
the decision boundary of an SVM.
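The contrast in question 1 can be seen on a small illustrative dataset (toy numbers, not from the assignment): the perceptron stops at the first boundary that separates the training points, whereas an SVM returns the unique maximum-margin boundary.

```python
# Toy 2-D data, linearly separable. A perceptron stops as soon as it finds
# *some* separating line; an SVM would instead return the unique line that
# maximizes the margin to the closest points of each class.
X = [(2.0, 2.0), (3.0, 3.0), (0.0, 0.0), (1.0, 0.0)]
y = [1, 1, -1, -1]

w, b = [0.0, 0.0], 0.0
for _ in range(100):                      # mistake-driven perceptron updates
    mistakes = 0
    for (x1, x2), yi in zip(X, y):
        if yi * (w[0] * x1 + w[1] * x2 + b) <= 0:
            w[0] += yi * x1
            w[1] += yi * x2
            b += yi
            mistakes += 1
    if mistakes == 0:                     # converged: all points correct
        break

# Every training point is now classified correctly...
assert all(yi * (w[0] * x1 + w[1] * x2 + b) > 0
           for (x1, x2), yi in zip(X, y))
print(w, b)
```

On this data the perceptron converges to w = (0, 2), b = -2, i.e. the line x2 = 1 with margin 1.0; the maximum-margin boundary for the same points is perpendicular to (1, 2) with margin sqrt(5)/2 ≈ 1.12, so the two algorithms genuinely disagree.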

2. What are support vectors in SVM? Explain with an example.

3. Solve the following using the method of Lagrange multipliers:

   minimize f(x, y) = 2x² + y² over x, y
   subject to g(x, y) = x + y − 2 = 0
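A worked sketch of the stationarity conditions for this problem:

```latex
\mathcal{L}(x, y, \lambda) = 2x^2 + y^2 - \lambda (x + y - 2)
\\
\frac{\partial \mathcal{L}}{\partial x} = 4x - \lambda = 0, \qquad
\frac{\partial \mathcal{L}}{\partial y} = 2y - \lambda = 0
\;\Rightarrow\; y = 2x
\\
x + y - 2 = 0 \;\Rightarrow\; 3x = 2
\;\Rightarrow\; x = \tfrac{2}{3},\; y = \tfrac{4}{3},\; \lambda = \tfrac{8}{3}
\\
f\!\left(\tfrac{2}{3}, \tfrac{4}{3}\right)
= 2\left(\tfrac{2}{3}\right)^2 + \left(\tfrac{4}{3}\right)^2
= \tfrac{8}{9} + \tfrac{16}{9} = \tfrac{8}{3}
```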

4. For the dataset given below, the Lagrange multipliers have been obtained by solving the
dual problem through quadratic programming. Compute the decision boundary and the margin.

x1      x2      y     Lagrange multiplier
0.96 0.93 -1 0
0.75 0.25 -1 2.0812
0.91 0.52 -1 0
0.83 0.57 -1 0
0.98 0.06 -1 0
0.82 0.25 -1 0
0.18 0.06 1 2.0812
0.07 0.79 1 0
0.21 0.14 1 0
0.08 0.92 1 0
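The quantities asked for in question 4 follow from the standard linear-SVM recovery formulas: w = Σᵢ αᵢ yᵢ xᵢ (only rows with αᵢ > 0 contribute), b averaged over the support vectors, and margin 2/‖w‖. A minimal sketch using the table's values:

```python
# Recover the SVM decision boundary from the table's Lagrange multipliers.
import math

# (x1, x2, y, alpha) rows from the table; only alpha > 0 matter.
data = [
    (0.96, 0.93, -1, 0.0),
    (0.75, 0.25, -1, 2.0812),
    (0.91, 0.52, -1, 0.0),
    (0.83, 0.57, -1, 0.0),
    (0.98, 0.06, -1, 0.0),
    (0.82, 0.25, -1, 0.0),
    (0.18, 0.06,  1, 2.0812),
    (0.07, 0.79,  1, 0.0),
    (0.21, 0.14,  1, 0.0),
    (0.08, 0.92,  1, 0.0),
]

# Weight vector: w = sum_i alpha_i * y_i * x_i (support vectors only).
w = [sum(a * y * x1 for x1, x2, y, a in data),
     sum(a * y * x2 for x1, x2, y, a in data)]

# Offset: for an exact QP solution, y_i - w.x_i agrees on every support
# vector; averaging over them is the usual numerically safe choice.
svs = [(x1, x2, y) for x1, x2, y, a in data if a > 0]
b = sum(y - (w[0] * x1 + w[1] * x2) for x1, x2, y in svs) / len(svs)

margin = 2.0 / math.hypot(w[0], w[1])
print(f"w = ({w[0]:.4f}, {w[1]:.4f}), b = {b:.4f}, margin = {margin:.4f}")
```

With the table's multipliers this gives w ≈ (−1.1863, −0.3954) and margin 2/‖w‖ ≈ 1.60.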

5. How do we deal with noisy data in the SVM formulation?

6. What is the purpose of introducing slack variables in the formulation of SVM?
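For reference, questions 5 and 6 concern the standard soft-margin primal, in which slack variables ξᵢ absorb violations of the margin and the penalty C trades margin width against training error:

```latex
\min_{\mathbf{w},\, b,\, \boldsymbol{\xi}} \;\;
\tfrac{1}{2}\|\mathbf{w}\|^2 + C \sum_{i=1}^{n} \xi_i
\qquad \text{s.t.} \qquad
y_i\!\left(\mathbf{w}^{\top}\mathbf{x}_i + b\right) \ge 1 - \xi_i,
\quad \xi_i \ge 0, \quad i = 1, \dots, n.
```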

7. Explain the “kernel trick”. Give an example where data becomes linearly separable after
applying the “kernel trick”.
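A concrete instance of the effect behind question 7 (illustrative data, not from the assignment): 1-D points labeled by distance from the origin admit no separating threshold on x, but the explicit degree-2 feature map φ(x) = (x, x²), which is the mechanism underlying the polynomial kernel, makes them linearly separable.

```python
# 1-D points: the two "far" points are one class, the two "near" points
# the other, so no single threshold on x separates them.
X = [-2.0, -0.5, 0.5, 2.0]
y = [1, -1, -1, 1]          # +1 for "far" points, -1 for "near" points

def separable_1d(xs, ys):
    """True iff some threshold t splits the labels into two pure sides."""
    for t in sorted(xs):
        left = [yi for xi, yi in zip(xs, ys) if xi <= t]
        right = [yi for xi, yi in zip(xs, ys) if xi > t]
        if len(set(left)) <= 1 and len(set(right)) <= 1:
            return True
    return False

assert not separable_1d(X, y)           # not separable in the original space

# After the map phi(x) = (x, x**2), the horizontal line  x2 = 1  (i.e. the
# rule x**2 > 1) separates the two classes perfectly.
phi = [(x, x * x) for x in X]
preds = [1 if x2 > 1.0 else -1 for _, x2 in phi]
print(preds == y)   # -> True: the mapped data is linearly separable
```

The kernel trick computes the inner products of such mapped points directly, via k(u, v) = φ(u)·φ(v), without ever materializing φ explicitly.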
