ISBN: 1-886529-28-0, 978-1-886529-28-1
Publication: February 2015, 576 pages, hardcover
Price: $89.00
Contents and Preface, Chapter 1, Course Material from MIT OCW
This book aims at an up-to-date and accessible development of algorithms for solving convex optimization problems. The book covers almost all the major classes of convex optimization algorithms. Principal among these are gradient, subgradient, polyhedral approximation, proximal, and interior point methods. Most of these methods rely on convexity (but not necessarily differentiability) in the cost and constraint functions, and are often connected in various ways to duality. The book contains numerous examples describing in detail applications to specially structured problems.
The book complements our Convex Optimization Theory (Athena Scientific, 2009), but can be read independently. The latter focuses on convexity theory and optimization duality, while the present book focuses on algorithmic issues. The two books share mathematical prerequisites, notation, and style, and together cover the entire finite-dimensional convex optimization field. Both books rely on rigorous mathematical analysis, but also aim at an intuitive exposition that makes use of visualization where possible. This is facilitated by the extensive use of analytical and algorithmic concepts of duality, which by nature lend themselves to geometrical interpretation.
The book may be used as a text for a convex optimization course with a focus on algorithms; the author has taught several variants of such a course at MIT and elsewhere over the last fifteen years. It may also be used as a supplementary source for nonlinear programming classes, and as an algorithmic foundation for classes focused on convex optimization models. It is an excellent supplement to several of our books: Convex Optimization Theory (Athena Scientific, 2009), Nonlinear Programming (Athena Scientific, 2016), Network Optimization (Athena Scientific, 1998), and Introduction to Linear Optimization (Athena Scientific, 1997).
Among its features, the book:
- develops comprehensively the theory of descent and approximation methods, including gradient and subgradient projection methods, cutting plane and simplicial decomposition methods, and proximal methods
- describes and analyzes augmented Lagrangian methods and the alternating direction method of multipliers
- develops the modern theory of coordinate descent methods, including distributed asynchronous convergence analysis
- comprehensively covers incremental gradient, subgradient, proximal, and constraint projection methods
- includes optimal algorithms based on extrapolation techniques, and the associated rate of convergence analysis (a minimal sketch of one such method appears after this list)
- describes a broad variety of applications in large-scale optimization and machine learning
- contains many examples, illustrations, and exercises
- is structured to be used conveniently either as a standalone text for a class on convex analysis and optimization, or as a theoretical supplement to either an applications/convex optimization models class or a nonlinear programming class
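To give a flavor of the algorithmic content, here is a minimal Python sketch (not taken from the book) of the proximal gradient method with Nesterov-type extrapolation, one of the extrapolation-based optimal methods mentioned above, applied to an l1-regularized least squares problem. The problem instance, step size rule, and all function names are illustrative assumptions rather than material from the text.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||x||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def accelerated_proximal_gradient(A, b, lam, num_iters=500):
    """Minimize (1/2)||Ax - b||^2 + lam * ||x||_1 with Nesterov extrapolation."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of A^T(Ax - b)
    x = np.zeros(A.shape[1])
    y = x.copy()                              # extrapolated point
    t = 1.0                                   # extrapolation parameter
    for _ in range(num_iters):
        grad = A.T @ (A @ y - b)              # gradient of the smooth part at y
        x_new = soft_threshold(y - grad / L, lam / L)    # proximal step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)    # extrapolation step
        x, t = x_new, t_new
    return x

# Tiny synthetic instance, assumed for illustration only.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true
x_hat = accelerated_proximal_gradient(A, b, lam=0.1)
print("nonzeros recovered:", np.sum(np.abs(x_hat) > 1e-3))
```

The extrapolation step improves the O(1/k) cost error rate of the plain proximal gradient method to O(1/k^2), the kind of rate of convergence result analyzed in the book's treatment of optimal methods.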
Dimitri P. Bertsekas is Fulton Professor of Computational Decision Making at Arizona State University, McAfee Professor of Engineering at the Massachusetts Institute of Technology, and a member of the prestigious United States National Academy of Engineering. He is the recipient of the 2001 AACC John R. Ragazzini Education Award and the 2009 INFORMS Expository Writing Award. He has also received the 2014 AACC Richard E. Bellman Control Heritage Award for "contributions to the foundations of deterministic and stochastic optimization-based methods in systems and control," the 2014 Khachiyan Prize for Life-Time Accomplishments in Optimization, the 2015 SIAM/MOS George B. Dantzig Prize, and the 2022 IEEE Control Systems Award. Together with his coauthor John Tsitsiklis, he was awarded the 2018 INFORMS John von Neumann Theory Prize for the contributions of the research monographs "Parallel and Distributed Computation" and "Neuro-Dynamic Programming".
The material listed below can be freely downloaded, reproduced, and distributed. The following sets of slides reflect an increasing emphasis on algorithms over time.