The schedule is tentative and subject to change (e.g., due to snow days).
Lecture date(s) | Topic |
---|---|
1/21 | Course overview and introduction. Linear classification and the Perceptron algorithm. |
1/23, 1/28 | Review of concepts from linear algebra and multivariate calculus. |
1/30, 2/4 | Introduction to optimization, examples of optimization problems. |
2/6 | Optimality conditions for general problems. |
2/11, 2/13 | Convex functions and sets, optimality conditions for convex problems. |
2/18 | No class (Monday schedule) |
2/20 | Oracle models, iterative methods, and gradient descent. |
2/25, 2/27 | Gradient descent algorithms for convex optimization problems. |
3/4 | Supervised learning. Linear models. Algorithms for linear regression. |
3/6 | In-class midterm exam (covers all material up to and including the 2/27 lecture) |
3/11, 3/13 | No classes (Spring break) |
3/18 | Algorithms for linear classification. |
3/20, 3/25 | Neural network models. Stochastic gradient descent. |
3/27 | Adaptive gradient descent algorithms. |
4/1, 4/3 | Introduction to linear programming. Modeling using LPs. LP duality. |
4/8, 4/10 | Algorithmic frameworks based on LPs and duality. |
4/15, 4/17 | Further applications of duality: flows and cuts, zero-sum games, boosting. |
4/22, 4/24 | Prediction using expert advice. Majority algorithms. Multiplicative weights update algorithm. |
4/29, 5/1 | Applications to algorithm design: Winnow algorithm for supervised classification, packing/covering LPs, zero-sum games, boosting. |
Acknowledgments: I am indebted to my colleagues at other institutions for some of the material in these lectures, including Amir Ali Ahmadi's course at Princeton, Yaron Singer's course at Harvard, and Nick Harvey's course at UBC. The specific references and credits appear on the References slide at the end of each lecture.