COURSE CODE | COURSE NAME | L-T-P-C | YEAR OF INTRODUCTION
EC467 | PATTERN RECOGNITION | 3-0-0-3 | 2016
Prerequisite: NIL
Course objectives:
To introduce the fundamental algorithms for pattern recognition
To investigate the various classification and clustering techniques
Syllabus: Review of Probability Theory and Probability distributions, Introduction to Pattern
Recognition and its applications, Bayesian decision theory, Bayesian estimation: Gaussian
distribution, ML estimation, EM algorithm, Supervised and unsupervised learning, Feature
selection, Linear Discriminant Functions, Non-parametric methods, Hidden Markov models for
sequential data classification, Linear models for regression and classification, Clustering
Expected outcome:
The students will be able to
i. Design and construct a pattern recognition system
ii. Know the major approaches in statistical and syntactic pattern recognition
iii. Become aware of the theoretical issues involved in pattern recognition system design, such as the curse of dimensionality
iv. Implement pattern recognition techniques
Text Books
1. C. M. Bishop, Pattern Recognition and Machine Learning, Springer, 2006.
2. R. O. Duda, P. E. Hart and D. G. Stork, Pattern Classification, 2/e, John Wiley, 2001.
References
1. Morton Nadler and Eric P. Smith, Pattern Recognition Engineering, John Wiley & Sons, New York, 1993.
2. Robert J. Schalkoff, Pattern Recognition: Statistical, Structural and Neural Approaches, John Wiley & Sons, New York, 2007.
3. S. Theodoridis and K. Koutroumbas, Pattern Recognition, 4/e, Academic Press, 2009.
4. Tom Mitchell, Machine Learning, McGraw-Hill, 1997.
5. J. T. Tou and R. C. Gonzalez, Pattern Recognition Principles, Addison-Wesley, London, 1974.
Course Plan
(Contact hours are shown in parentheses after each item; the percentage against each module is its share of the End Semester Exam marks.)
Module I (15% End Semester Exam Marks)
- Introduction: Basics of pattern recognition system, various applications, Machine Perception, classification of pattern recognition systems (3 hours)
- Design of Pattern recognition system, Pattern recognition Life Cycle (2 hours)
- Statistical Pattern Recognition: Review of probability theory, Gaussian distribution, Bayes decision theory and Classifiers, Optimal solutions for minimum error and minimum risk criteria, Normal density and discriminant functions, Decision surfaces (4 hours)

Module II (15% End Semester Exam Marks)
- Parameter estimation methods: Maximum-Likelihood estimation, Expectation-Maximization method, Bayesian parameter estimation (2 hours)
- Concept of feature extraction and dimensionality, Curse of dimensionality, Dimension reduction methods: Fisher discriminant analysis, Principal component analysis (6 hours)
- Hidden Markov Models (HMM): basic concepts, Gaussian mixture models
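The Maximum-Likelihood estimation and Bayes decision theory topics of Modules I and II can be sketched in a few lines of Python. This is a minimal illustration, not part of the syllabus: the toy 1-D data, the random seed, and names such as `classify` are assumptions for the example.

```python
import numpy as np

# Two 1-D classes with equal priors; true parameters are unknown to the classifier.
rng = np.random.default_rng(0)
x0 = rng.normal(loc=-2.0, scale=1.0, size=500)   # training samples, class 0
x1 = rng.normal(loc=+2.0, scale=1.0, size=500)   # training samples, class 1

# Maximum-Likelihood estimates for a Gaussian: sample mean and sample variance.
mu0, var0 = x0.mean(), x0.var()
mu1, var1 = x1.mean(), x1.var()

def log_gaussian(x, mu, var):
    # Log of the univariate normal density N(x; mu, var)
    return -0.5 * np.log(2 * np.pi * var) - (x - mu) ** 2 / (2 * var)

def classify(x):
    # Bayes decision rule with equal priors: choose the class whose
    # estimated (log-)likelihood at x is larger.
    return int(log_gaussian(x, mu1, var1) > log_gaussian(x, mu0, var0))

pred_neg = classify(-1.5)   # lies on the class-0 side of the boundary
pred_pos = classify(+1.5)   # lies on the class-1 side
```

With equal priors and (nearly) equal variances, the decision surface sits roughly midway between the two estimated means, i.e. near x = 0 here.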
FIRST INTERNAL EXAM
Module III (15% End Semester Exam Marks)
- Non-parametric methods: Non-parametric techniques for density estimation: Parzen-window method, K-Nearest Neighbour method (3 hours)
- Non-metric methods for pattern classification: Non-numeric data or nominal data; Decision trees: concept of construction, splitting of nodes, choice of attributes, overfitting, pruning (3 hours)
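The K-Nearest Neighbour rule listed above admits a very small sketch. The 1-D toy training set and the helper name `knn_predict` below are illustrative assumptions:

```python
import numpy as np

# Tiny 1-D training set with two classes.
train_x = np.array([0.0, 0.5, 1.0, 5.0, 5.5, 6.0])
train_y = np.array([0, 0, 0, 1, 1, 1])

def knn_predict(x, k=3):
    # k-Nearest-Neighbour rule: majority vote among the k closest points.
    idx = np.argsort(np.abs(train_x - x))[:k]   # indices of the k nearest neighbours
    votes = train_y[idx]
    return int(votes.sum() > k // 2)            # majority class (labels are 0/1)

label_low = knn_predict(0.7)    # near the class-0 cluster
label_high = knn_predict(5.2)   # near the class-1 cluster
```

Choosing k odd avoids ties in the two-class vote; the same neighbour counts, divided by a window volume, give the k-NN density estimate discussed alongside Parzen windows.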
Module IV (15% End Semester Exam Marks)
- Linear Discriminant based algorithms: Perceptron, Support Vector Machines (5 hours)
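The perceptron learning rule named in Module IV can be sketched on a toy linearly separable set; the data and the epoch limit below are illustrative assumptions, not a prescribed implementation:

```python
import numpy as np

# Toy linearly separable data: label +1 if x1 + x2 > 0, else -1.
X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -2.0],
              [-2.0, -1.0], [3.0, -1.0], [-3.0, 1.0]])
y = np.array([1, 1, -1, -1, 1, -1])

# Augment with a constant feature so the weight vector absorbs the bias.
Xa = np.hstack([X, np.ones((len(X), 1))])

w = np.zeros(3)
for _ in range(100):                  # epochs; converges if data are separable
    errors = 0
    for xi, yi in zip(Xa, y):
        if yi * (w @ xi) <= 0:        # misclassified (or on the boundary)
            w += yi * xi              # perceptron update
            errors += 1
    if errors == 0:                   # a full pass with no mistakes: done
        break

predictions = np.sign(Xa @ w)
```

By the perceptron convergence theorem the number of updates is bounded for separable data; SVMs, also listed in this module, instead pick the separating hyperplane with maximum margin.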
SECOND INTERNAL EXAM
Module V (20% End Semester Exam Marks)
- Multilayer perceptrons, Back Propagation algorithm, Artificial Neural Networks (4 hours)
- Classifier Ensembles: Bagging, Boosting / AdaBoost (3 hours)

Module VI (20% End Semester Exam Marks)
- Unsupervised learning: Clustering: Criterion functions for clustering, Algorithms for clustering: K-means and Hierarchical methods, Cluster validation (5 hours)
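The K-means algorithm listed under Module VI is the assign/re-estimate iteration of Lloyd's algorithm; the two synthetic blobs and the fixed initialisation below are illustrative assumptions:

```python
import numpy as np

# Two well-separated 2-D blobs; K-means with k=2 should recover them.
rng = np.random.default_rng(1)
a = rng.normal([0.0, 0.0], 0.3, size=(50, 2))
b = rng.normal([5.0, 5.0], 0.3, size=(50, 2))
X = np.vstack([a, b])

# Initialise the two centroids from points known to lie in different blobs.
centroids = X[[0, 50]].copy()

for _ in range(20):
    # Assignment step: each point goes to its nearest centroid.
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    labels = d.argmin(axis=1)
    # Update step: each centroid moves to the mean of its assigned points.
    new = np.array([X[labels == k].mean(axis=0) for k in range(2)])
    if np.allclose(new, centroids):   # converged
        break
    centroids = new
```

Each iteration cannot increase the sum-of-squared-distances criterion function, which is the clustering criterion the module refers to.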
END SEMESTER EXAM
Question Paper Pattern
The question paper shall consist of three parts. Part A covers Modules I and II, Part B covers Modules III and IV, and Part C covers Modules V and VI. Each part has three questions uniformly covering the two modules, and each question may have a maximum of four subdivisions. In each part, any two questions are to be answered. Marks are distributed as per the syllabus, with 70% for theory and 30% for logical/numerical problems, derivations and proofs.