by Dimitri P. Bertsekas and Steven E. Shreve
ISBN: 1-886529-03-5
Publication: 1996, 330 pages, softcover
Price: $34.50
EBOOK at Google Play
Preview at Google Books
This research monograph, first published in 1978 by Academic Press, remains the authoritative and comprehensive treatment of the mathematical foundations of stochastic optimal control of discrete-time systems, including the intricate measure-theoretic issues involved. It is an excellent supplement to the first author's Dynamic Programming and Optimal Control (Athena Scientific, 2000).
Review of the 1978 printing:
"Bertsekas and Shreve have written a fine book. The exposition is extremely clear and a helpful
introductory chapter provides orientation and a guide to the rather intimidating mass of literature
on the subject. Apart from anything else, the book serves as an excellent introduction to the arcane
world of analytic sets and other lesser known byways of measure theory."
Mark H. A. Davis,
Imperial College, in IEEE Trans. on Automatic Control
resolves definitively the mathematical issues of discrete-time stochastic optimal control problems, including Borel models and semi-continuous models
establishes the most general possible theory of finite and infinite horizon stochastic dynamic programming models, through the use of analytic sets and universally measurable policies
develops general frameworks for dynamic programming based on abstract contraction and monotone mappings (a rough illustration follows this list)
provides extensive background on analytic sets, Borel spaces and their probability measures
contains much in-depth research not found in any other textbook
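As a rough illustration of the abstract mapping viewpoint mentioned in the list above (a standard discounted-cost sketch with assumed notation, not quoted from the book), the dynamic programming operator T acts on cost functions J; with a discount factor 0 < \alpha < 1 and bounded cost per stage g, T is a sup-norm contraction, and the optimal cost function J^* is its unique fixed point:

% Illustrative discounted-cost DP mapping; notation (g, U(x), p) is assumed here.
(TJ)(x) \;=\; \inf_{u \in U(x)} \Big\{ g(x,u) + \alpha \int J(x')\, p(dx' \mid x,u) \Big\},
\qquad
\|TJ - TJ'\|_\infty \;\le\; \alpha\, \|J - J'\|_\infty,
\qquad
J^* = TJ^*.

The book's framework abstracts away the particular form of T, working instead from contraction and monotonicity properties of the mapping itself.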
Dimitri P. Bertsekas is McAfee Professor of Engineering at the Massachusetts Institute of Technology and a member of the National Academy of Engineering. Steven Shreve is Professor of Mathematics at Carnegie Mellon University.