ELEC_ENG 433: Statistical Pattern Recognition
Prerequisites
ELEC_ENG 302 or equivalent.

Description
CATALOG DESCRIPTION: Fundamental and advanced topics in statistical pattern recognition, including Bayesian decision theory, maximum-likelihood and Bayesian estimation, nonparametric density estimation, component analysis and discriminants, kernel machines, feature selection, dimension reduction and embedding, boosting, minimum description length, mixture models and clustering, spectral clustering, Bayesian networks and hidden Markov models, with applications to image and video pattern recognition.
REQUIRED TEXT: None
REFERENCE TEXT: R. Duda, P. Hart, and D. Stork, Pattern Classification, 2nd Edition, Wiley-Interscience, 2001
COURSE DIRECTOR: Prof. Ying Wu
COURSE GOALS: To gain a deep understanding of the theories, algorithms, and mathematical approaches of state-of-the-art statistical pattern recognition, and of their applications to image and video pattern analysis and recognition. This is a research-oriented course.
PREREQUISITES BY COURSES: ELEC_ENG 302 or equivalent.
PREREQUISITES BY TOPIC:
- Linear algebra
- Probability theory
- Signals and systems
- C/C++ or MATLAB
DETAILED COURSE TOPICS:
- Introduction to pattern recognition systems and problems.
- Bayesian decision theory, Minimum-error-rate classification, Chernoff bound and Bhattacharyya bound, Missing features.
- Maximum-likelihood estimation, Bayesian estimation, Sufficient statistics and exponential family, Overfitting, and Expectation-Maximization (EM) (see the sketch after this list).
- Principal component analysis (PCA), Linear discriminant analysis (LDA), and Independent Component analysis (ICA).
- Nonparametric density estimation, Parzen Windows, Mean-shift, Nearest-neighbor classification, Metric learning.
- Support vector machines (SVM), Kernel machines, Maximum margin classification, Generalizability, and VC dimension.
- Feature selection, Dimension reduction and embedding, ISOMAP, Local linear embedding (LLE), Multidimensional scaling (MDS), Manifold learning.
- Boosting, Bagging, Bootstrapping, Cross validation, and Component classifiers.
- Bayesian networks, dynamic Bayesian networks, Hidden Markov models, and Markov random fields.
- Mixture models and clustering, Spectral clustering, Hierarchical clustering.
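For the maximum-likelihood topic above, a minimal NumPy sketch (the data and names here are illustrative, not course material): for a univariate Gaussian, the ML estimates are the sample mean and the biased sample variance.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=500)   # synthetic sample

# ML estimates for a univariate Gaussian: the sample mean and the
# (biased) sample variance maximize the log-likelihood.
mu_ml = x.mean()
var_ml = ((x - mu_ml) ** 2).mean()             # divides by N, not N-1

print(f"mu_ml = {mu_ml:.3f}, var_ml = {var_ml:.3f}")
```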
GRADES:
- Homework and labs: 30%
- Final projects and presentations: 70%
COURSE OBJECTIVES:
Upon completing this course, students should be able to:
- Understand the core theories and algorithms of statistical pattern recognition
- Understand the state of the art of statistical pattern recognition
- Perform parametric classifier design
- Perform nonparametric classifier design
- Perform feature selection and dimension reduction
- Perform unsupervised data clustering
- Understand applications such as face recognition, face detection, object detection, gesture recognition, and speech recognition
COURSE SCHEDULE (TENTATIVE):
Week 1: Intro and Bayesian classification
L1: Intro (classification/regression/density estimation; features, selection, and PR applications: OCR, speech, object, action, gesture; generative/discriminative; Bayesian/non-Bayesian)
L2: Bayesian decision, Minimum-error-rate classification, Chernoff bound, Bhattacharyya bound, ML/Bayesian estimation
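As an illustration of the Week 1 material, a minimal sketch of minimum-error-rate classification for two 1-D Gaussian classes (the priors, means, and the `classify` helper below are made-up examples, not course code): pick the class with the largest posterior, i.e. the largest likelihood-times-prior.

```python
import numpy as np
from scipy.stats import norm

# Two classes with known 1-D Gaussian class-conditional densities
# (illustrative parameters).
priors = np.array([0.6, 0.4])                  # P(w1), P(w2)
means, stds = np.array([0.0, 2.0]), np.array([1.0, 1.0])

def classify(x):
    # Minimum-error-rate rule: argmax_i P(w_i | x), and
    # P(w_i | x) is proportional to p(x | w_i) P(w_i).
    likelihoods = norm.pdf(x, loc=means, scale=stds)
    return np.argmax(likelihoods * priors)

print(classify(0.5))   # -> 0 (class w1)
print(classify(1.8))   # -> 1 (class w2)
```

The Chernoff and Bhattacharyya bounds covered in L2 bound the error probability this rule incurs.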
Week 2: PCA/LDA/ICA
L3: PCA/LDA
L4: ICA
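A minimal NumPy sketch of the PCA step from Week 2 (synthetic data; `pca` is an illustrative helper): project centered data onto the top-k eigenvectors of the sample covariance.

```python
import numpy as np

def pca(X, k):
    """Project rows of X onto the top-k principal components."""
    Xc = X - X.mean(axis=0)                 # center the data
    C = np.cov(Xc, rowvar=False)            # sample covariance
    eigvals, eigvecs = np.linalg.eigh(C)    # eigenvalues in ascending order
    W = eigvecs[:, ::-1][:, :k]             # top-k directions
    return Xc @ W

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
Z = pca(X, 2)                               # 100 x 2 low-dimensional embedding
print(Z.shape)
```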
Week 3: Nearest Neighbor Classifier
L5: K-NN classification
L6: Advanced NN
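A minimal sketch of k-NN classification from Week 3, assuming a Euclidean metric and nonnegative integer labels (the toy data is illustrative):

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    d = np.linalg.norm(X_train - x, axis=1)   # Euclidean distances
    nearest = np.argsort(d)[:k]               # indices of the k closest
    votes = np.bincount(y_train[nearest])
    return np.argmax(votes)

X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y_train = np.array([0, 0, 1, 1])
print(knn_predict(X_train, y_train, np.array([0.2, 0.1])))  # -> 0
```

Metric learning (L6) replaces the fixed Euclidean distance here with one learned from data.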
Week 4: Nonparametric density estimation and clustering
L7: Parzen window, Mean-shift
L8: Mixture models, EM, spectral clustering
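A minimal sketch of the Parzen-window estimate from L7, assuming a Gaussian kernel of bandwidth h (data and grid are synthetic):

```python
import numpy as np

def parzen_kde(x_query, data, h):
    """Parzen-window density estimate with a Gaussian kernel of width h."""
    u = (x_query[:, None] - data[None, :]) / h
    kernels = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)
    return kernels.mean(axis=1) / h           # p(x) = (1/Nh) sum K((x-x_i)/h)

rng = np.random.default_rng(2)
data = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 0.5, 200)])
grid = np.linspace(-5, 5, 11)
print(parzen_kde(grid, data, h=0.5))          # density estimate on a grid
```

Mean-shift, also in L7, can be viewed as gradient ascent on exactly this kind of kernel density estimate.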
Week 5: Kernel machines
L9: SVM
L10: Kernel machines
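Week 5's kernel trick, sketched with kernel ridge regression rather than the full SVM quadratic program (a simpler kernel machine; all names and parameters below are illustrative): the model is fit entirely through the kernel matrix, never in the feature space itself.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix K[i, j] = exp(-gamma * ||a_i - b_j||^2)."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def kernel_ridge_fit(X, y, lam=1e-2, gamma=1.0):
    # Closed-form dual solution: alpha = (K + lam*I)^{-1} y.
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def kernel_ridge_predict(X_train, alpha, X_new, gamma=1.0):
    return rbf_kernel(X_new, X_train, gamma) @ alpha

X = np.linspace(0, 2 * np.pi, 50)[:, None]
y = np.sin(X[:, 0])
alpha = kernel_ridge_fit(X, y)
print(kernel_ridge_predict(X, alpha, X[:5]))  # ~ sin values at the first inputs
```

An SVM replaces this closed-form fit with a max-margin optimization, but uses the kernel matrix in the same way.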
Week 6: Dimension reduction and embedding
L11: ISOMAP
L12: LLE and MDS
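A minimal sketch of classical MDS from L12 (synthetic distances; `classical_mds` is an illustrative helper): double-center the squared distance matrix and embed via its top eigenvectors. ISOMAP (L11) applies the same step to graph-geodesic rather than Euclidean distances.

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical MDS: embed points given pairwise Euclidean distances D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centered Gram matrix
    eigvals, eigvecs = np.linalg.eigh(B)
    idx = np.argsort(eigvals)[::-1][:k]        # largest eigenvalues first
    L = np.clip(eigvals[idx], 0, None)         # guard tiny negatives
    return eigvecs[:, idx] * np.sqrt(L)

rng = np.random.default_rng(3)
X = rng.normal(size=(30, 5))
D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
print(classical_mds(D, 2).shape)               # (30, 2) embedding
```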
Week 7: Feature selection and Boosting
L13: Boosting
L14: Advanced boosting
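A minimal sketch of AdaBoost with threshold stumps for Week 7 (toy data; the brute-force stump search is illustrative only): each round fits the best weighted stump, then reweights the data to emphasize its mistakes.

```python
import numpy as np

def adaboost_stumps(X, y, T=20):
    """AdaBoost with single-feature threshold stumps; y in {-1, +1}."""
    n, d = X.shape
    w = np.ones(n) / n
    ensemble = []                               # (feature, thresh, sign, alpha)
    for _ in range(T):
        best = None
        for j in range(d):                      # brute-force stump search
            for t in np.unique(X[:, j]):
                for s in (1, -1):
                    pred = s * np.sign(X[:, j] - t + 1e-12)
                    err = w[pred != y].sum()    # weighted error
                    if best is None or err < best[0]:
                        best = (err, j, t, s, pred)
        err, j, t, s, pred = best
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
        w *= np.exp(-alpha * y * pred)          # upweight the mistakes
        w /= w.sum()
        ensemble.append((j, t, s, alpha))
    return ensemble

def adaboost_predict(ensemble, X):
    score = sum(a * s * np.sign(X[:, j] - t + 1e-12) for j, t, s, a in ensemble)
    return np.sign(score)

X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([-1, -1, 1, 1])
print(adaboost_predict(adaboost_stumps(X, y, T=5), X))  # -> [-1. -1.  1.  1.]
```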
Week 8: Generative models: BN and HMM
L15: Bayesian networks
L16: HMM
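A minimal sketch of the HMM forward algorithm from L16 (the two-state model below is made up): recursively propagate the state beliefs through the transition matrix and fold in each emission.

```python
import numpy as np

def hmm_forward(pi, A, B, obs):
    """Forward algorithm: P(observation sequence) under an HMM.
    pi: initial state probs (S,), A: transition matrix (S, S),
    B: emission probs (S, num_symbols), obs: observed symbol indices."""
    alpha = pi * B[:, obs[0]]                  # initialize with first emission
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]          # propagate, then emit
    return alpha.sum()

pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
print(hmm_forward(pi, A, B, [0, 1, 0]))       # likelihood of the sequence
```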
Week 9: Generative models: MRF and CRF
L17: MRF
L18: CRF
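A minimal sketch of MRF-based binary denoising for Week 9, using iterated conditional modes on an Ising-like prior (ICM is one simple inference choice, not necessarily the one used in lecture; all parameters are illustrative):

```python
import numpy as np

def icm_denoise(y, beta=2.0, eta=1.0, iters=5):
    """Iterated conditional modes on an Ising-like MRF for binary
    image denoising; pixel values are in {-1, +1}."""
    x = y.copy()
    H, W = x.shape
    for _ in range(iters):
        for i in range(H):
            for j in range(W):
                nb = sum(x[a, b]
                         for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                         if 0 <= a < H and 0 <= b < W)
                # Minimize the local energy: agree with neighbors (beta)
                # and with the observed pixel (eta).
                x[i, j] = 1 if beta * nb + eta * y[i, j] > 0 else -1
    return x

rng = np.random.default_rng(4)
truth = np.ones((8, 8), dtype=int); truth[:, :4] = -1
noisy = np.where(rng.random(truth.shape) < 0.1, -truth, truth)
print((icm_denoise(noisy) == truth).mean())   # fraction of pixels recovered
```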
Week 10: (optional) PAC learnability and VC dimension
Week 11: (optional) Learning Gibbs distributions