GNR-652 (ML for RS-I)
Instructor: Prof. Biplab Banerjee
Agenda for today
• Motivation for learning machine learning
• Course structure
• Exam pattern
• Audit requirements
• Resources
Motivation behind the course
Mostly we do…
Real-world examples (concerning images)
Some more examples
In other domains
• Bio-informatics
• Video analysis
• Haptics
• Text analysis
• Natural language processing
• Speech analysis
• And the list goes on…
Machine learning & AI
Course structure
• Major types of learning paradigms:
• Supervised (labeled training data available)
• Unsupervised (no labeled training data)
• Semi-supervised (only a small amount of labeled training data)
• Reinforcement (goal-oriented, reward-driven learning) (if time permits)
• Selected topics in deep learning (CNN, RNN, mostly supervised)
• We will learn different techniques related to all these major learning
paradigms
Some terminologies
• Data / features – Vectorial (ideally numerical) representations of
entities.
• Similarity – A criterion for assessing whether two entities share the
same identity (label).
• Training – Building the ML model from known (labeled) data.
• Testing (inference) – Making predictions on future, unseen data.
• Generalization capability – How well the trained model performs on
unseen test data.
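The terminology above can be made concrete with a tiny sketch. This is an illustrative example only: the data are made up, and a 1-nearest-neighbour rule is used as the simplest possible "model".

```python
import math

# Features are numeric vectors, similarity is (inverse) Euclidean distance,
# "training" just stores the labeled data, and "testing" predicts the label
# of an unseen point. All values below are made up for illustration.
train_X = [(1.0, 1.0), (1.2, 0.8), (5.0, 5.0), (5.5, 4.8)]  # feature vectors
train_y = ["A", "A", "B", "B"]                              # known labels

def distance(u, v):
    """Euclidean distance: smaller distance = more similar."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def predict(x):
    """1-nearest-neighbour prediction: label of the closest training point."""
    i = min(range(len(train_X)), key=lambda i: distance(x, train_X[i]))
    return train_y[i]

# Inference on unseen ("future") data; good predictions here reflect
# the model's generalization capability.
print(predict((1.1, 0.9)))  # -> A
print(predict((5.2, 5.1)))  # -> B
```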
Supervised learning
• Classification vs regression
Approaches towards classification/regression
• Probabilistic vs deterministic
• Discriminative vs generative
• Active vs passive
• Distance-based (K-nearest neighbors)
• Decision trees (CART, ID3)
• Support vector machines (margin-based)
• Neural networks (loosely inspired by human perception; the basis for deep learning)
• Bayesian (probabilistic) and graphical models
• Linear or non-linear curve fitting (regression)
• Model ensembles
• Ranking
• Theoretical foundation for supervised learning (idea of risk, bias/variance trade-off,
regularization)
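As a first taste of the curve-fitting and regularization topics listed above, here is a minimal sketch of ordinary least squares and its ridge-regularized variant on made-up noisy data (the data, noise level, and regularization strength are all illustrative assumptions):

```python
import numpy as np

# Made-up data: noisy samples from the line y = 2x + 1.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 20)
y = 2.0 * x + 1.0 + 0.05 * rng.standard_normal(20)

X = np.column_stack([x, np.ones_like(x)])  # design matrix [x, 1]

# Ordinary least squares: w = (X^T X)^{-1} X^T y
w_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge regression: w = (X^T X + lam*I)^{-1} X^T y (shrinks the weights)
lam = 0.1
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(2), X.T @ y)

print(w_ols)    # close to [2.0, 1.0] (slope, intercept)
print(w_ridge)  # slightly shrunk toward zero
```

The ridge term is the simplest example of the regularization idea that the theory part of the course develops: paying a small bias to reduce variance.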
Unsupervised learning
• Clustering & Density estimation
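A minimal sketch of the clustering idea, using Lloyd's k-means algorithm on two made-up, well-separated blobs (the data, k, and the deterministic initialization are assumptions for illustration):

```python
import numpy as np

def kmeans(X, k, iters=20):
    """Plain k-means (Lloyd's algorithm): alternate assignment and update."""
    # Simple deterministic init: k evenly spaced samples from the data set.
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)].astype(float)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # Assign each point to its nearest center.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each center to the mean of its assigned points.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

# Two made-up blobs around (0, 0) and (5, 5).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.2, (20, 2)),
               rng.normal(5, 0.2, (20, 2))])
centers, labels = kmeans(X, k=2)
print(np.round(centers))  # roughly [[0, 0], [5, 5]] (order may vary)
```

No labels are used anywhere above, which is exactly what makes this unsupervised.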
Semi-supervised learning
• Graph based methods
• Probabilistic methods
• Transductive SVMs
Reinforcement learning
• Policy learning idea, Q-learning
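A minimal sketch of tabular Q-learning on a made-up 1-D "corridor" MDP: states 0-4, actions left/right, and reward +1 only for reaching state 4. The environment and all hyperparameters (alpha, gamma, epsilon) are illustrative assumptions.

```python
import random

N, GOAL = 5, 4
alpha, gamma, eps = 0.5, 0.9, 0.3
Q = [[0.0, 0.0] for _ in range(N)]  # Q[state][action], 0 = left, 1 = right

random.seed(0)
for _ in range(500):                # episodes, each starting at state 0
    s = 0
    while s != GOAL:
        # Epsilon-greedy action selection.
        if random.random() < eps:
            a = random.randrange(2)
        else:
            a = 0 if Q[s][0] >= Q[s][1] else 1
        s2 = max(0, s - 1) if a == 0 else min(GOAL, s + 1)
        r = 1.0 if s2 == GOAL else 0.0
        # Q-learning update: move Q(s,a) toward r + gamma * max_a' Q(s',a').
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
        s = s2

policy = [0 if Q[s][0] >= Q[s][1] else 1 for s in range(N)]
print(policy)  # the greedy policy in states 0-3 should point right (1)
```

Note that learning is driven only by the reward signal, not by labeled examples, which is what distinguishes this paradigm from supervised learning.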
Feature engineering
• Feature pre-processing and normalization
• Dimensionality reduction
• Filter- and wrapper-based DR
• Principal component analysis
• Independent component analysis
• Auto-encoders
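The dimensionality-reduction idea can be sketched with principal component analysis via the SVD. The data are made up: 2-D points that vary mostly along the (1, 1) direction, so the first principal component should recover that direction.

```python
import numpy as np

# Made-up data lying near the line y = x, plus a little isotropic noise.
rng = np.random.default_rng(0)
t = rng.standard_normal(100)
X = np.column_stack([t, t]) + 0.05 * rng.standard_normal((100, 2))

Xc = X - X.mean(axis=0)                     # center the data
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Vt[0]                                 # first principal direction (unit vector)
Z = Xc @ pc1                                # 1-D projection of every sample

print(np.round(np.abs(pc1), 2))             # ~ [0.71, 0.71], i.e. the (1, 1) direction
```

Projecting onto the leading components keeps most of the variance while reducing the feature dimension, which is the goal shared by all the DR techniques listed above.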
A quick glance at some advanced learning techniques
• Transfer learning & Domain adaptation
• Zero-shot, few-shot learning
• Weakly-supervised learning
• Self-supervised learning
• Lifelong learning
• Learning a good distance measure – metric learning
Deep learning
• Convolutional networks (specific for images)
• Recurrent networks (more suitable for sequential data)
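The operation at the heart of convolutional networks can be sketched by sliding a small kernel over a tiny made-up "image" (valid padding, stride 1; the image and the Sobel-style kernel are illustrative assumptions, and as in most deep-learning frameworks the loop below is technically cross-correlation):

```python
import numpy as np

def conv2d(img, kernel):
    """Valid, stride-1 2-D cross-correlation of a single-channel image."""
    kh, kw = kernel.shape
    H, W = img.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

# Made-up image: left half dark (0), right half bright (1).
img = np.zeros((5, 6))
img[:, 3:] = 1.0

# Sobel-style kernel that responds to vertical edges.
sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)

edges = conv2d(img, sobel_x)
print(edges)  # large responses only at the vertical edge between the halves
```

A CNN simply learns many such kernels from data instead of hand-designing them.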
Basic knowledge of math (vector spaces, matrices, basic calculus) is desired. If not, follow a
standard book to quickly review the basic concepts.
Programming assignments are encouraged to be done in Python (but other programming languages,
including MATLAB, are fine).
Evaluation rules to be followed
• 25% Mid-semester exam
• 40% End semester exam
• 20% Course Project
• Implement any paper
• Use ML for some applications (preferably using satellite images)
• 15% Assignments (Coding and Paper review)
• Coding assignments – 2-3 (10%)
• Paper Review (5%)
Audit - Mid Sem / End Sem / Course Project + Assignments
Resources
• Shai Shalev-Shwartz and Shai Ben-David. Understanding Machine
Learning. Cambridge University Press, 2017. Available online.
• Christopher Bishop. Pattern Recognition and Machine Learning.
Springer Verlag, 2006.
• Trevor Hastie, Robert Tibshirani, and Jerome Friedman. The Elements
of Statistical Learning. Springer Verlag.
• Tom Mitchell. Machine Learning. McGraw-Hill, 1997.
• Ian Goodfellow, Yoshua Bengio, and Aaron Courville. Deep Learning.
MIT Press, 2016.
Videos
• https://www.microsoft.com/en-us/research/people/cmbishop/?from=http%3A%2F%2Fresearch.microsoft.com%2Fen-us%2Fum%2Fpeople%2Fcmbishop%2Fprml%2F
• Videolectures
• YouTube courses by Ali Ghodsi, Nando de Freitas, and many more
Let's learn to learn!