Aaron Sidford (sidford@stanford.edu)

Welcome

This page has the content from the course Introduction to Optimization Theory (MS&E213 / CS 269O) which I taught in Spring 2017. Here you can find the syllabus, lecture notes, and more. The Piazza page for class discussion is no longer active, but if you have questions about the content, please feel free to email me directly. I hope you enjoy the content as much as I enjoyed teaching the class.

Course Overview

This class will introduce the theoretical foundations of continuous optimization. Starting from first principles we show how to design and analyze simple iterative methods for efficiently solving broad classes of optimization problems. The focus of the course will be on achieving provable convergence rates for solving numerous complex large-scale problems. For more information about the course see the syllabus.
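To give a concrete flavor of the "simple iterative methods" the course analyzes, here is a minimal illustrative sketch (not taken from the course notes): fixed-step gradient descent on a smooth convex least-squares objective, with the step size set by the smoothness constant. The problem instance and parameter choices below are assumptions for illustration only.

```python
# Illustrative sketch: gradient descent on the smooth convex function
# f(x) = ||Ax - b||^2, with step size 1/L where L is the smoothness
# (gradient-Lipschitz) constant. A and b here are arbitrary test data.

import numpy as np

def gradient_descent(grad, x0, step, iters):
    """Run fixed-step gradient descent: x_{k+1} = x_k - step * grad(x_k)."""
    x = x0
    for _ in range(iters):
        x = x - step * grad(x)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

grad = lambda x: 2 * A.T @ (A @ x - b)        # gradient of ||Ax - b||^2
L = 2 * np.linalg.norm(A.T @ A, 2)            # smoothness constant of f
x = gradient_descent(grad, np.zeros(5), 1.0 / L, 500)

# Compare against the closed-form least-squares minimizer.
x_star = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.linalg.norm(x - x_star))             # distance to the optimum shrinks with iters
```

The 1/L step size is the standard choice analyzed in the smoothness chapter; the course makes this convergence rate precise.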

Quick Course Details:

- Lectures: MW 1:30PM - 2:50PM in 380-380D
- Office Hours: W 3:30PM - 4:30PM and Th 5:30 - 6:30 PM in Huang 358
- Additional office hours available by appointment; email me.
- Email: sidford@stanford.edu

Lecture Notes

Here are the links for the course lecture notes. Feedback is welcome; if anything is unclear or you would like something added, please feel free to contact me. My apologies in advance for missing citations and references; these will be added over time in future course offerings.

- **Chapter 1: Introduction**: The notes for this chapter are here.
- **Chapter 2: Smoothness**: The notes for this chapter are here.
- **Chapter 3: Convexity**: The notes for this chapter are here.
- **Chapter 4: Acceleration**: The notes for this chapter are here.
- **Chapter 5: Smooth Extensions**: The notes for this chapter are here.
- **Chapter 6: Non-smooth Convex Functions**: The notes for this chapter are here.
- **Chapter 7: Cutting Plane Methods**: The notes for this chapter are here.
- **Chapter 8: Subgradient / Mirror Descent**: The notes for this chapter so far are here.
- **Chapter 9: Interior Point Methods**: The notes for this chapter so far are here.
- **Appendix A: Norms**: The notes for this chapter are here.

Schedule

Here is the tentative schedule of material for the course. This will be updated to reflect what we actually cover.

Week 1

**Lecture #1 (M 4/3)**: Introduction – oracles, efficiency, and why optimization is impossible

**Lecture #2 (W 4/5)**: Introduction – why optimization is doable, but expensive (Lipschitz functions)

Week 2

**Lecture #3 (M 4/10)**: Smoothness – computing critical points dimension free

**Lecture #4 (W 4/12)**: (Strong) Convexity – computing global optima

Week 3

**Lecture #5 (M 4/17)**: Acceleration – tight rates for smooth convex functions

**Lecture #6 (W 4/19)**: Acceleration – tight rates for smooth convex functions

Week 4

**Lecture #7 (M 4/24)**: Smooth Convex Optimization – extensions (norms, composite, coordinate, etc.)

**Lecture #8 (W 4/26)**: Smooth Convex Optimization – extensions (norms, composite, coordinate, etc.)

Week 5

**Lecture #9 (M 5/1)**: Nonsmooth Convex Optimization – separating hyperplanes

**Lecture #10 (W 5/3)**: Nonsmooth Convex Optimization – separating hyperplanes / cutting plane methods

Week 6

**Lecture #11 (M 5/8)**: Nonsmooth Convex Optimization – reduction to feasibility / cutting plane methods

**Lecture #12 (W 5/10)**: Nonsmooth Convex Optimization – cutting plane methods / start of subgradient methods

Week 7

**Lecture #13 (M 5/15)**: Subgradient Descent – online linear optimization and follow the regularized leader (FTRL)

**Lecture #14 (W 5/17)**: Subgradient Descent – FTRL for convex optimization and expert learning

Week 8

**Lecture #15 (M 5/22)**: Subgradient Descent – mirror descent and SGD

**Lecture #16 (W 5/24)**: Subgradient Descent – SGD and variance reduction

Week 9

**(M 5/29)**: Memorial Day (holiday, no classes)

**Lecture #17 (W 5/31)**: Newton's Method – non-convex convergence extension and self-concordance introduction

Week 10

**Lecture #18 (M 6/5)**: Interior Point Methods – intro to path finding *(End-Quarter Period)*

**Lecture #19 (W 6/7)**: Linear Programming *(End-Quarter Period)*

References

The material in the lecture notes is based primarily on my own experience with optimization and the following two texts:

- Introductory Lectures on Convex Programming Volume I: Basic Course by Yurii Nesterov. (The printed version is perhaps the closest prior work to the course, but it is not required reading.)
- Convex Optimization: Algorithms and Complexity by Sébastien Bubeck.

Additional resources that may be helpful include the following:

- Convex Optimization by Stephen Boyd and Lieven Vandenberghe
- CSE 599: Interplay between Convex Optimization and Geometry a course by Yin Tat Lee