TTIC 31070: Convex Optimization (Spring 2026)

Cross-listed as TTIC 31070 / CAAM 31015 / CMSC 35470 / BUSF 36903 / STAT 31015.

Instructor: Zhiyuan Li
Email: ttic-31070-convex-optimization-2026@ttic.edu
Schedule: Tuesday & Thursday, 2:00 – 3:20 PM
Location: TTIC 530
Office Hours: Zhiyuan Li, Tuesday & Thursday 3:20 – 3:50 PM, TTIC 508; TA office hours TBD.
TAs: Shuo Xie, Marko Medvedev, Beining Wu, Richard Xu
Canvas: canvas.uchicago.edu/courses/71971

Course Description

The course covers techniques in unconstrained and constrained convex optimization, together with a practical introduction to convex duality. It focuses on (1) formulating convex optimization problems and understanding their properties; (2) presenting and understanding optimization approaches; and (3) understanding the dual problem.
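
For orientation, the "dual problem" above refers to the Lagrangian dual. As a quick preview, here is the standard-form convex program and its dual in the notation of Boyd and Vandenberghe (see Resources); the course notes may use slightly different conventions.

```latex
% Standard-form convex program: convex objective f_0 and convex
% inequality constraints f_i, plus affine equality constraints.
\begin{align*}
  \min_{x \in \mathbb{R}^n} \quad & f_0(x) \\
  \text{s.t.} \quad & f_i(x) \le 0, \quad i = 1, \dots, m, \\
                    & Ax = b.
\end{align*}
% The Lagrangian introduces multipliers \lambda \ge 0 and \nu;
% minimizing it over x gives the dual function g, and maximizing g
% over the multipliers gives the (always convex) dual problem.
\begin{align*}
  L(x, \lambda, \nu) &= f_0(x) + \sum_{i=1}^{m} \lambda_i f_i(x) + \nu^\top (Ax - b), \\
  g(\lambda, \nu) &= \inf_{x}\, L(x, \lambda, \nu), \qquad
  \max_{\lambda \ge 0,\ \nu}\; g(\lambda, \nu).
\end{align*}
```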

Topics include:

  • Formalization of optimization problems
  • First-order optimization methods: gradient descent, mirror descent, non-Euclidean steepest descent, acceleration, and Newton’s method (a minimal gradient descent sketch appears after this list)
  • Standard formulations of constrained optimization: linear, quadratic, conic, and semidefinite programming
  • KKT optimality conditions
  • Lagrangian duality, constraint qualification, weak and strong duality
  • Fenchel conjugacy and its relationship to Lagrangian duality
  • Equality-constrained Newton method
  • Log barrier (central path) methods and primal-dual optimization methods
  • Cutting-plane methods: center of mass and ellipsoid algorithm
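
To make the first-order bullet above concrete, here is a minimal gradient descent sketch in Python with NumPy. This is an illustration only, not course code: the quadratic objective, the step size 1/L, and the iteration count are all hypothetical choices.

```python
import numpy as np

def gradient_descent(grad, x0, step_size, num_iters):
    """Minimal gradient descent: x_{k+1} = x_k - eta * grad(x_k).

    For an L-smooth convex objective, step_size = 1/L yields the
    classical O(1/k) convergence rate in function value.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(num_iters):
        x = x - step_size * grad(x)
    return x

# Illustrative quadratic f(x) = 0.5 * x^T A x - b^T x (hypothetical data);
# its gradient is A x - b, and the smoothness constant L is lambda_max(A).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 0.0])
L = np.linalg.eigvalsh(A).max()

x_star = gradient_descent(lambda x: A @ x - b, x0=np.zeros(2),
                          step_size=1.0 / L, num_iters=500)
print(x_star, np.linalg.solve(A, b))  # iterate vs. exact minimizer
```

Later topics in the list (mirror descent, acceleration, Newton’s method) replace the plain gradient step here with more refined updates.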

Prerequisites

Linear Algebra, Vector Calculus, and Algorithms at the undergraduate level, OR Matrix Computation (STAT/CAAM 30900). CMSC 25300 can substitute for Linear Algebra and Vector Calculus, but an algorithms course is still required.

Registration

  • PhD students in programs where the course is listed (TTIC, STAT, CAM, CS, or Booth) may register on their own after confirming that they meet the prerequisites.
  • Other graduate students, including master’s students and students in other PhD programs, may register if they have taken Matrix Computation, or after emailing the instructor to explain how they meet the prerequisites and receiving permission.
  • Undergraduate students may register only after receiving permission from the instructor based on their prerequisites and background.

Schedule and Materials

The schedule is subject to change.

 #   Date          Topic                                                Materials
 1   Tue, Mar 24   Introduction and Convexity                           Notes · Slides · Lean
 2   Thu, Mar 26   Separation and Duality                               Notes · Slides · Lean
 3   Tue, Mar 31   Linear Programming                                   Notes · Slides · Lean
 4   Thu, Apr 2    Convex Conjugates and Marginal Duality               Notes · Slides · Lean
 5   Tue, Apr 7    Lagrange Multipliers and KKT                         Notes · Slides · Lean
 6   Thu, Apr 9    Cutting-Plane Methods                                Notes · Slides
 7   Tue, Apr 14   Steepest Descent and Descent Lemmas                  Notes · Slides
 8   Thu, Apr 16   Mirror Descent and Bregman Divergences               Notes · Slides
 9   Tue, Apr 21   Online Convex Optimization, Subgradient Mirror Descent, and Stochastic Reduction
10   Thu, Apr 23   Adaptive Optimization and Well-Structured Preconditioners
11   Tue, Apr 28   Frank–Wolfe and Non-Euclidean Descent
12   Thu, Apr 30   Oracle Complexity Lower Bounds
13   Tue, May 5    Accelerated Gradient Descent
14   Thu, May 7    Hessians and Newton’s Method
15   Tue, May 12   Conic Optimization
16   Thu, May 14   Self-Concordant Functions and Barrier Geometry
17   Tue, May 19   Central Path and Primal-Dual Interior-Point Methods
18   Thu, May 21   Review and Synthesis

Homework

Students are expected to work through the homework but are not required to submit it. The final evaluation is determined by the final exam and by bonus points earned throughout the quarter. The final exam is likely to include problems sampled from the homework.

Assignment   Posted   Due            Materials
HW 1         Apr 2    Not required   PDF · Solutions (updated Apr 10)
HW 2         Apr 14   Not required   PDF (updated Apr 14)

LLM Usage Policy

You are welcome to use large language models (e.g., ChatGPT, Claude, Gemini) freely throughout this course, unless a specific assignment or exam explicitly states otherwise.

In fact, we encourage you to treat LLMs as an active learning tool. One of the most effective ways to solidify your understanding of the material is to engage in a dialogue with an LLM: ask it to explain a concept you find confusing, request a worked example, or have it walk you through a proof step by step. When a single answer raises a new question, follow that thread — dig deeper before moving on. This kind of depth-first exploration often builds stronger intuition than a single pass through the notes.

That said, keep in mind that LLMs can produce plausible-sounding but incorrect mathematics. Always verify any non-trivial claim against the course notes, textbooks, or your own reasoning. The goal is to use LLMs to accelerate your learning, not to substitute for it.

Communication

  • Questions about course material — ask on Canvas, in person during TA or instructor office hours, or during lectures. Do not ask by direct email.
  • Homework clarifications or logistics — ask on Canvas (preferred) or during TA office hours.
  • Help with homework — seek help during TA office hours.
  • General comments or feedback — via Canvas (anonymously if preferred), via the staff email list, or directly to the relevant staff member.
  • Personally sensitive issues — contact any staff member you are comfortable approaching.

Resources

  • S. Bubeck, Convex Optimization: Algorithms and Complexity, Foundations and Trends in ML, 2015. [pdf]
  • A. Beck, First-Order Methods in Optimization, SIAM, 2017. [SIAM]
  • S. Boyd and L. Vandenberghe, Convex Optimization, Cambridge University Press, 2004. [free online]
  • D. Bertsekas, Nonlinear Programming, 2nd ed., Athena Scientific, 1999.
  • J. Nocedal and S. Wright, Numerical Optimization, 2nd ed., Springer, 2006.
  • D. Bertsekas, A. Nedic, and A. Ozdaglar, Convex Analysis and Optimization, Athena Scientific, 2003.
  • A. Ben-Tal and A. Nemirovski, Lecture Notes on Modern Convex Optimization, 2013.
  • A. Nemirovski, Information-Based Complexity of Convex Programming, 1994/5.