21AI66
Model Question Paper-1/2 with effect from 2021 (CBCS Scheme)
USN
Sixth Semester B.E. Degree Examination
Machine Learning
TIME: 03 Hours Max. Marks: 100
Note: Answer any FIVE full questions, choosing at least ONE question from each MODULE.
(*Bloom's Taxonomy Level, CO, and Marks are indicated in brackets against each question.)
Module - 1
Q.01 a Explain the different types of Machine Learning systems in brief. (L2, CO1, 4 Marks)
b Illustrate some of the basic design issues and approaches to machine learning, considering the design of a program that learns to play checkers. (L2, CO1, 6 Marks)
c Apply the Candidate Elimination algorithm to a set of training examples to demonstrate how it identifies the boundary hypotheses of the version space. (L3, CO1, 10 Marks)
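For reference, a minimal Python sketch of the Candidate Elimination algorithm applied to EnjoySport-style training examples (the data, attribute values, and the simplified single-hypothesis specific boundary are illustrative assumptions, not part of the question):

# Candidate Elimination (simplified) for conjunctive hypotheses over nominal
# attributes: '?' matches any value, '0' is the empty (match-nothing) symbol.
def consistent(h, x):
    """True if hypothesis h covers example x."""
    return all(hv == '?' or hv == xv for hv, xv in zip(h, x))

def candidate_elimination(examples):
    n = len(examples[0][0])
    S = ['0'] * n                    # specific boundary (single hypothesis)
    G = [['?'] * n]                  # general boundary (set of hypotheses)
    for x, label in examples:
        if label == 'yes':           # positive example
            G = [g for g in G if consistent(g, x)]
            for i in range(n):       # minimally generalize S to cover x
                if S[i] == '0':
                    S[i] = x[i]
                elif S[i] != x[i]:
                    S[i] = '?'
        else:                        # negative example
            new_G = []
            for g in G:
                if not consistent(g, x):
                    new_G.append(g)
                    continue
                for i in range(n):   # minimal specializations of g that exclude x
                    if g[i] == '?' and S[i] != '?' and S[i] != x[i]:
                        spec = g.copy()
                        spec[i] = S[i]
                        new_G.append(spec)
            G = new_G
    return S, G

# Illustrative EnjoySport-style training examples: (attribute values, label).
data = [
    (['Sunny', 'Warm', 'Normal', 'Strong', 'Warm', 'Same'],   'yes'),
    (['Sunny', 'Warm', 'High',   'Strong', 'Warm', 'Same'],   'yes'),
    (['Rainy', 'Cold', 'High',   'Strong', 'Warm', 'Change'], 'no'),
    (['Sunny', 'Warm', 'High',   'Strong', 'Cool', 'Change'], 'yes'),
]
S, G = candidate_elimination(data)
print('S boundary:', S)   # ['Sunny', 'Warm', '?', 'Strong', '?', '?']
print('G boundary:', G)   # [['Sunny', '?', ...], ['?', 'Warm', ...]]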
OR
Q.02 a Define the following terms: (i) Concept Learning (ii) Version Space (iii) Hypothesis Space (iv) General Boundary (v) Specific Boundary (L2, CO1, 6 Marks)
b Discuss the limitations of the Find-S algorithm compared with the Candidate Elimination algorithm. (L2, CO1, 8 Marks)
c Explain inductive bias in brief. (L2, CO1, 6 Marks)
Module - 2
Q.03 a Explain the training of a binary classifier. (L2, CO2, 10 Marks)
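For illustration, a minimal scikit-learn sketch of training a binary ("is it a 5?") classifier on MNIST; the dataset fetch, the 60,000/10,000 split, and the choice of SGDClassifier are assumptions in the style of the course text:

from sklearn.datasets import fetch_openml
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import cross_val_score

# Load MNIST: 70,000 28x28 digit images flattened to 784 features each.
X, y = fetch_openml('mnist_784', version=1, return_X_y=True, as_frame=False)
X_train, y_train = X[:60000], y[:60000]

# Reduce the 10-class problem to a binary one: "5" versus "not 5".
y_train_5 = (y_train == '5')

# Train a linear classifier with stochastic gradient descent.
sgd_clf = SGDClassifier(random_state=42)
sgd_clf.fit(X_train, y_train_5)

# Evaluate with 3-fold cross-validated accuracy.
print(cross_val_score(sgd_clf, X_train, y_train_5, cv=3, scoring='accuracy'))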
b Explain error analysis in brief. (L2, CO2, 10 Marks)
OR
Q.04 a Explain the following with an example: (i) Multioutput classification (ii) Multilabel classification (iii) Multiclass classification (L2, CO2, 10 Marks)
b Explain the steps involved in classification using the MNIST dataset. (L2, CO2, 10 Marks)
Module - 3
Q.05 a Explain the following: (i) Batch Gradient Descent (ii) Stochastic Gradient Descent (iii) Mini-batch Gradient Descent (L2, CO3, 6 Marks)
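For reference, a NumPy sketch contrasting Batch and Stochastic Gradient Descent on synthetic linear data (the data, learning rates, and epoch counts are illustrative assumptions; Mini-batch GD differs only in using small random subsets per update):

import numpy as np

# Synthetic linear data (illustrative): y = 4 + 3x + Gaussian noise.
rng = np.random.default_rng(42)
X = 2 * rng.random((100, 1))
y = 4 + 3 * X + rng.standard_normal((100, 1))
X_b = np.c_[np.ones((100, 1)), X]          # add bias feature x0 = 1
m = len(X_b)

# Batch Gradient Descent: each step uses the gradient over the full training set.
theta = rng.standard_normal((2, 1))
for _ in range(1000):
    gradients = 2 / m * X_b.T @ (X_b @ theta - y)
    theta -= 0.1 * gradients
print('Batch GD:', theta.ravel())

# Stochastic Gradient Descent: each step uses one randomly picked instance.
theta = rng.standard_normal((2, 1))
for epoch in range(50):
    for _ in range(m):
        i = rng.integers(m)
        xi, yi = X_b[i:i + 1], y[i:i + 1]
        gradients = 2 * xi.T @ (xi @ theta - yi)
        theta -= 0.01 * gradients
print('Stochastic GD:', theta.ravel())

# Mini-batch GD would compute each gradient on a small random batch (e.g. 16-32 instances).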
b Explain Linear Support Vector Classification in brief. (L2, CO3, 8 Marks)
c Explain Polynomial Regression in brief. (L2, CO3, 6 Marks)
OR
Q.06 a Explain regularized linear models in brief. (L2, CO3, 4 Marks)
b Explain the difference between linear and non-linear SVM classification. (L2, CO3, 6 Marks)
c Explain Logistic Regression in brief. (L2, CO3, 10 Marks)
Module - 4
Q.07 a Demonstrate the working of a voting classifier with code that creates and trains it. (L2, CO4, 8 Marks)
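For illustration, a minimal scikit-learn sketch that creates and trains a hard-voting classifier (the moons dataset and the three base estimators are assumptions):

from sklearn.datasets import make_moons
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Illustrative binary classification data.
X, y = make_moons(n_samples=500, noise=0.30, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Hard voting: the class predicted by the majority of the base classifiers wins.
voting_clf = VotingClassifier(
    estimators=[
        ('lr', LogisticRegression(random_state=42)),
        ('rf', RandomForestClassifier(random_state=42)),
        ('svc', SVC(random_state=42)),
    ],
    voting='hard',
)
voting_clf.fit(X_train, y_train)
print(accuracy_score(y_test, voting_clf.predict(X_test)))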
b Explain Bagging and Pasting in brief. (L2, CO4, 4 Marks)
c Demonstrate how new predictors can correct their predecessor by paying more attention to the training instances that the predecessor underfitted. (L3, CO4, 8 Marks)
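For illustration, a minimal scikit-learn AdaBoost sketch in which each new predictor pays more attention to the training instances its predecessor underfitted (the dataset and hyperparameters are assumptions; the estimator parameter is named base_estimator in older scikit-learn versions):

from sklearn.datasets import make_moons
from sklearn.ensemble import AdaBoostClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_moons(n_samples=500, noise=0.30, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Predictors are trained sequentially; after each round the weights of the
# instances the previous predictor misclassified are increased, so the next
# predictor concentrates on the hard cases.
ada_clf = AdaBoostClassifier(
    estimator=DecisionTreeClassifier(max_depth=1),  # weak learner (decision stump)
    n_estimators=200,
    learning_rate=0.5,
    random_state=42,
)
ada_clf.fit(X_train, y_train)
print(accuracy_score(y_test, ada_clf.predict(X_test)))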
OR
Q.08 a Explain the following in brief: (i) Training and visualizing a decision tree (ii) Making predictions (L2, CO4, 6 Marks)
b Explain Stacked Generalization (stacking) in brief. (L2, CO4, 4 Marks)
c Construct a regression model using the following data, which consists of 10 data instances and three attributes: 'Assessment', 'Assignment', and 'Project'. (L3, CO4, 10 Marks)
Module - 5
Q.09 a Explain the Gibbs algorithm in brief. (L2, CO5, 4 Marks)
b Given the conditional probabilities, use the Naïve Bayes classifier to estimate the probability of having heart disease for a person with high blood pressure. (L3, CO5, 10 Marks)
c Explain the EM algorithm in brief. (L2, CO5, 6 Marks)
OR
Q.10 a Estimate the conditional probability of each attribute {color, legs, height, smelly} for the species classes {M, H} using the data given in the accompanying table. Using these probabilities, estimate the probability values for the new instance {color=green, legs=2, height=tall, smelly=No}. (L3, CO5, 10 Marks)
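For reference, the Naïve Bayes estimate such a question expects has the general form below (written in LaTeX; the symbols are generic and do not use values from the table):

\[
  v_{NB} \;=\; \operatorname*{arg\,max}_{v_j \in \{M,\,H\}}
    P(v_j) \prod_{i=1}^{n} P(a_i \mid v_j)
\]
\[
  P(v_j \mid x) \;\propto\; P(v_j)\,
    P(\text{color}=\text{green} \mid v_j)\,
    P(\text{legs}=2 \mid v_j)\,
    P(\text{height}=\text{tall} \mid v_j)\,
    P(\text{smelly}=\text{No} \mid v_j)
\]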
b Explain the Minimum Description Length principle in brief. (L2, CO5, 6 Marks)
c Explain Maximum Likelihood in brief. (L2, CO5, 4 Marks)
*Bloom's Taxonomy Level: indicated as L1, L2, L3, L4, etc. It is also desirable to indicate the COs and POs to be attained by every part of each question.