ELEC_ENG 375, 475: Machine Learning: Foundations, Applications, and Algorithms
Prerequisites
A thorough understanding of Linear Algebra and Vector Calculus (e.g., students should be able to easily compute gradients/Hessians of a multivariate function), as well as a basic understanding of the Python or MATLAB/OCTAVE programming environments.
Description
From robotics, speech recognition, and analytics to finance and social network analysis, machine learning has become one of the most useful sets of scientific tools of our age. With this course we want to bring interested students and researchers from a wide array of disciplines up to speed on the power and wide applicability of machine learning. The ultimate aim of the course is to equip you with all the modelling and optimization tools you'll need to formulate and solve problems of interest in a machine learning framework. We hope to help build these skills through lectures and reading materials that introduce machine learning in the context of its many applications, as well as by describing, in a detailed but user-friendly manner, the modern techniques from nonlinear optimization used to solve them. In addition to a well-curated collection of reference materials, registered students will receive a draft of a forthcoming manuscript on machine learning, authored by the instructors, to use as class notes.
- Course is cross-listed with Data_Sci 423
COURSE INSTRUCTOR: Prof. Aggelos Katsaggelos
COURSE OUTLINE:
1. Introduction
- What kinds of things can you build with machine learning tools?
- How does machine learning work? The 5-minute elevator pitch edition
- Predictive models - our basic building blocks
- Feature design and learning - what makes things distinct?
- Numerical optimization - the workhorse of machine learning
2. Fundamentals of numerical optimization
- Calculus-defined optimality
- Using calculus to build useful algorithms
- Gradient descent
- Newton's method (see the brief sketch below)
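For a sense of the level of the material (and since Python is one of the listed prerequisite environments), here is a minimal NumPy sketch of the two algorithms named above applied to an illustrative quadratic cost; the cost function, step length, and iteration counts are arbitrary choices for this example and are not taken from the course notes or text.

```python
import numpy as np

# Illustrative smooth convex cost g(w) = w^T A w + b^T w, with A symmetric positive definite.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([-1.0, 2.0])

def gradient(w):
    return 2 * A @ w + b

def hessian(w):
    return 2 * A

# Gradient descent: repeatedly step against the gradient with a fixed step length alpha.
w = np.zeros(2)
alpha = 0.1
for _ in range(100):
    w = w - alpha * gradient(w)
print("gradient descent:", w)

# Newton's method: scale the step by the inverse Hessian; exact in one step on a quadratic.
w = np.zeros(2)
for _ in range(5):
    w = w - np.linalg.solve(hessian(w), gradient(w))
print("Newton's method: ", w)

# Both approach the closed-form minimizer -0.5 * A^{-1} b.
print("closed form:     ", -0.5 * np.linalg.solve(A, b))
```

On a quadratic cost Newton's method lands on the minimizer in a single step, while gradient descent takes many small steps, which previews the trade-offs between first- and second-order methods covered in this part of the course.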
3. Regression
- Linear regression - applications in climate science, feature selection, compression, neuroscience, and marketing
- Knowledge-driven feature design for regression
- Nonlinear regression
- The L-2 regularizer (illustrated in the ridge regression sketch below)
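A brief, hedged illustration of ordinary least squares and its L-2 regularized (ridge) version on synthetic data; the data and the penalty value lam are invented for this example and are not part of the course assignments.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data: y = X w_true + noise.
N, D = 50, 3
X = rng.normal(size=(N, D))
w_true = np.array([1.5, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=N)

# Ordinary least squares: minimize ||Xw - y||^2.
w_ls = np.linalg.lstsq(X, y, rcond=None)[0]

# Ridge regression: minimize ||Xw - y||^2 + lam * ||w||^2,
# whose minimizer solves (X^T X + lam I) w = X^T y.
lam = 1.0
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(D), X.T @ y)

print("least squares:", w_ls)
print("ridge (lam=1):", w_ridge)
```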
4. Classification
- The perceptron
- Logistic regression/Support Vector Machines (see the logistic regression sketch below)
- Multiclass classification
- Knowledge-driven feature design for classification - examples from computer vision (object/face detection and recognition), text mining, and speech recognition
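A minimal sketch of binary logistic regression trained by gradient descent on synthetic two-class data; the data, step length, and iteration count are illustrative assumptions rather than the course's datasets or starter code.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two-class toy data with labels in {-1, +1}.
N, D = 200, 2
X = rng.normal(size=(N, D))
y = np.sign(X @ np.array([2.0, -1.0]) + 0.3 * rng.normal(size=N))

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

# Gradient descent on the logistic (log-loss) cost
#   g(w) = sum_i log(1 + exp(-y_i x_i^T w)).
w = np.zeros(D)
alpha = 0.01
for _ in range(500):
    margins = y * (X @ w)
    grad = -(X.T @ (y * sigmoid(-margins)))
    w = w - alpha * grad

accuracy = np.mean(np.sign(X @ w) == y)
print("learned weights:", w, " training accuracy:", accuracy)
```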
5. Probabilistic Formulation
A. Regression
- Bayesian linear regression (see the sketch after this section)
- Nonlinear regression
- Sparse linear regression
B. Classification
- Bayesian logistic regression
- Nonlinear logistic regression
- Boosting
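As a rough sketch of the Bayesian linear regression topic above, assuming a zero-mean Gaussian prior on the weights and a known Gaussian noise variance (assumptions made only for this example), the posterior over the weights is Gaussian and has a closed form:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic data for a Bayesian treatment of linear regression.
N, D = 30, 2
X = rng.normal(size=(N, D))
w_true = np.array([0.7, -1.2])
sigma2 = 0.25                     # assumed (known) noise variance
y = X @ w_true + np.sqrt(sigma2) * rng.normal(size=N)

# With prior w ~ N(0, tau2 * I) and Gaussian noise, the posterior is Gaussian with
#   covariance S = (X^T X / sigma2 + I / tau2)^{-1}
#   mean       m = S X^T y / sigma2.
tau2 = 1.0
S = np.linalg.inv(X.T @ X / sigma2 + np.eye(D) / tau2)
m = S @ X.T @ y / sigma2

print("posterior mean:", m)
print("posterior covariance:\n", S)
```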
6. Feature learning
- Function approximation and bases of features
- Feed-forward neural network bases, deep learning, and kernels
- Cross-validation (see the K-fold sketch below)
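A small sketch of K-fold cross-validation used to select a ridge penalty, written in plain NumPy on synthetic data; the number of folds and the candidate penalty values are arbitrary illustrative choices, not the course's protocol.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy data; K-fold cross-validation is used to choose the ridge penalty lam.
N, D = 100, 5
X = rng.normal(size=(N, D))
y = X @ rng.normal(size=D) + 0.2 * rng.normal(size=N)

def ridge_fit(X, y, lam):
    # Solve (X^T X + lam I) w = X^T y.
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def cv_error(X, y, lam, K=5):
    # Split the indices into K folds; train on K-1 folds, test on the held-out fold.
    n = X.shape[0]
    folds = np.array_split(np.arange(n), K)
    errors = []
    for k in range(K):
        test = folds[k]
        train = np.setdiff1d(np.arange(n), test)
        w = ridge_fit(X[train], y[train], lam)
        errors.append(np.mean((X[test] @ w - y[test]) ** 2))
    return np.mean(errors)

for lam in [0.01, 0.1, 1.0, 10.0]:
    print(f"lam = {lam:>5}: cross-validation MSE = {cv_error(X, y, lam):.4f}")
```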
7. Special topics
- Step length determination for gradient methods
- Advanced gradient descent schemes: stochastic gradient descent and momentum
- Dimension reduction: K-means clustering and Principal Component Analysis (PCA is illustrated in the sketch below)
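Finally, a short sketch of Principal Component Analysis computed via the singular value decomposition on synthetic low-dimensional data; the data dimensions and noise level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy high-dimensional data with (approximately) 2-dimensional structure.
N, D, d = 200, 10, 2
Z = rng.normal(size=(N, d))                 # latent 2-D coordinates
B = rng.normal(size=(d, D))                 # embedding into D dimensions
X = Z @ B + 0.05 * rng.normal(size=(N, D))

# PCA: center the data, then take the top right singular vectors of X
# as the principal directions; projecting onto them reduces the dimension.
X_centered = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(X_centered, full_matrices=False)
components = Vt[:d]                         # top-d principal directions
X_reduced = X_centered @ components.T       # N x d representation

explained = (s[:d] ** 2).sum() / (s ** 2).sum()
print("fraction of variance captured by 2 components:", round(explained, 3))
```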
TEACHING METHOD: Weekly problem sets will be assigned and graded. Homework will be assigned on Fridays and will be due the following Friday. Late homework will not be accepted.
Final grades for the course will be based entirely on homework assignment grades.
Up to 1 percentage point of extra credit can be earned by the first student to report a particular error found in the class text. Additional extra credit points will be considered for constructive suggestions for improving the text.
EVALUATION METHOD: There are no exams in this course.
CLASS MATERIALS (REQUIRED): J. Watt, R. Borhani, and A. K. Katsaggelos, Machine Learning Refined: Foundations, Algorithms, and Applications, Cambridge University Press, 2016.